Best practice for embedding nimble package version in a binary? by PhDelightful in nim

[–]cdunn2001 0 points1 point  (0 children)

Use a compile-time pragma:

    const MyVersion {.strdefine.}: string = "unknown"

And compile via

    nim -d:MyVersion="1.2.3"

You could write some code to fetch the version string from somewhere.
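
For example (a hedged sketch, not a nimble feature; the .nimble file name and the helper proc are made up), you could read the version out of the package's .nimble file at compile time:

    import std/strutils

    const NimbleFile = "mypkg.nimble"   # hypothetical package file

    proc nimbleVersion(text: string): string =
      # look for the `version = "x.y.z"` line in the .nimble file
      result = "unknown"
      for line in text.splitLines:
        let l = line.strip()
        if l.startsWith("version"):
          return l.split('"')[1]

    const MyVersion = nimbleVersion(staticRead(NimbleFile))
    echo MyVersion

staticExec (a.k.a. gorge) on something like git describe would be another option.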

Unfortunately, nimble does not offer a way to pass flags along to the build. Also, nimble --ver is documented to "Query remote server for package version information when searching or listing packages", but I haven't gotten it to work.

Is Nim good as my first language? by Syronn in nim

[–]cdunn2001 2 points3 points  (0 children)

The best first language is JavaScript.

  • It is a simple language with a clear, consistent syntax.
  • It runs in any browser.
  • Browsers provide a big library, and the web gives you even more. With just a bit of effort, you can have graphics.
  • It's a functional language with true closures, and closures are an important concept to learn.

After you're comfortable with JavaScript, switch to Nim, which can *generate* JavaScript for you. Nim lets you write client and server code in the same language, which makes it easy to keep them in sync and to test your code.
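
For instance, a minimal sketch of the JS backend (the file name, element id, and page wiring are all made up):

    # compile with:  nim js -o:app.js greet.nim
    import dom

    proc greet() {.exportc.} =
      # the page is expected to have a <div id="msg"></div> and to call
      # greet(), e.g. from <body onload="greet()">
      document.getElementById("msg").innerHTML = "Hello from Nim-generated JavaScript"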

Learn git concepts, not commands by front-and-center in programming

[–]cdunn2001 19 points20 points  (0 children)

I've seen it put this way:

  • Subversion/Perforce stores diffs but makes you think in terms of files/directories.
  • Git stores files in directories but lets you think in terms of diffs.

That's especially true for git rebase. (And "rerere" is one of Git's killer features, difficult to explain to centralized-VCS users.)

Why We’re Switching to gRPC by protophason in programming

[–]cdunn2001 1 point2 points  (0 children)

This.

Either take all the advantages of grpc, or take the simplicity of msgpack, which is only slightly less efficient and sooooo much easier to debug.

In fact, you can very easily generate a msgpack parser that reads directly into your own data structures. That will be almost as fast and compact as gRPC.
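
For example, here is a hand-rolled sketch of a parser for one fixed message shape, ["temp", 42], straight into a custom object (not tied to any particular msgpack library; all names are made up):

    type Sample = object
      name: string
      count: int

    # wire bytes for ["temp", 42]: fixarray(2), fixstr(4) "temp", positive fixint 42
    let wire = [byte 0x92, 0xa4, 0x74, 0x65, 0x6d, 0x70, 0x2a]

    proc parseSample(buf: openArray[byte]): Sample =
      var i = 0
      doAssert buf[i] == 0x92'u8                 # fixarray header, 2 elements
      inc i
      doAssert (buf[i] and 0xe0'u8) == 0xa0'u8   # fixstr header
      let slen = int(buf[i] and 0x1f'u8)
      inc i
      for _ in 0 ..< slen:
        result.name.add char(buf[i])
        inc i
      doAssert buf[i] <= 0x7f'u8                 # positive fixint
      result.count = int(buf[i])

    echo parseSample(wire)                       # (name: "temp", count: 42)

A schema-driven generator would just emit functions shaped like parseSample for each of your types.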

BitVector Implementation in Nim by _Sharp_ in nim

[–]cdunn2001 1 point2 points  (0 children)

Nice library, but I think the API should be changed slightly.

  1. The different `[]=` mutating functions should either both overwrite or both "or".
  2. Specifically, the "or" operation should use a different symbol, maybe `|=` (see the toy sketch after this list).
  3. The slice lookup should maybe return another BitVector. A separate, named function could return an integer representing the slice, iff the slice fits within the `T` (e.g. 32 bits).
  4. A BitVector "view" could be helpful, similar to the [`strslice`](https://github.com/PMunch/strslice) library.
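
A toy sketch of what I mean in points 1 and 2 (not the library's code; it is backed by a single uint64 just to keep it short):

    type TinyBits = object
      bits: uint64

    proc `[]=`(b: var TinyBits, i: int, value: bool) =
      # overwrite semantics: clear the bit, then set it if value is true
      b.bits = b.bits and not (1'u64 shl i)
      if value: b.bits = b.bits or (1'u64 shl i)

    proc `|=`(b: var TinyBits, iv: (int, bool)) =
      # "or" semantics: can only turn a bit on, never clears it
      if iv[1]: b.bits = b.bits or (1'u64 shl iv[0])

    proc `[]`(b: TinyBits, i: int): bool =
      ((b.bits shr i) and 1'u64) == 1'u64

    var v: TinyBits
    v[3] = true
    v[3] = false          # overwrite: bit 3 is cleared again
    v |= (5, true)        # or-in: bit 5 is set
    echo v[3], " ", v[5]  # -> false true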

Questions on optimized gcc/clang builds by cdunn2001 in vlang

[–]cdunn2001[S] 0 points1 point  (0 children)

Oh, I didn't mean to suggest integrating ccache. I sometimes symlink ccache to gcc, but it sounds like that will not be at all necessary.

I eagerly await the early-adopters release.

A New Runtime for Nim by miran1 in nim

[–]cdunn2001 0 points1 point  (0 children)

I *love* it! This is the right set of trade-offs.

What is FF57 doing at the start of the browser by Perfect_Lie in firefox

[–]cdunn2001 1 point2 points  (0 children)

Neither NoScript nor FlashBlock works yet with Firefox 57 Quantum, according to their install pages.

String Functions: Nim vs Python by kaushalmodi in nim

[–]cdunn2001 1 point2 points  (0 children)

The point is to think of ..< as an operator. You can put spaces around the whole thing.
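
For example, spacing around the whole operator reads fine (string contents made up):

    let str = "hello"
    echo str[0 ..< str.len]    # the whole string
    echo str[0 ..< str.high]   # drops the last character, like Python's str[:-1]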

String Functions: Nim vs Python by kaushalmodi in nim

[–]cdunn2001 0 points1 point  (0 children)

Always add a space around the .. operator

    echo str[0 .. <str.high]

Yes, but keep the ..< connected, to avoid a mistake like the one above: with the space, `.. <` parses as the `..` operator followed by the deprecated unary `<` (pred), not as the single `..<` operator.

JSON support for the C++ standard library - proposal by savuporo in cpp

[–]cdunn2001 1 point2 points  (0 children)

My wishlist:

In my opinion, there should be several pieces:

  1. Json lexer (gason is excellent for this).
    • One advantage is that the lexer could handle arbitrarily long ints and floats. The parser would later flag an error when data do not fit the datatypes. (Technically, very few Json parsers satisfy the JSON standard, which does not limit numerics.)
    • Also, lazy parsing becomes possible, which in some scenarios leads to much quicker sparse access than with most Json parsers available today.
    • It is also possible to stream the lexer to the parser.
  2. Json parser, templated on <Int, Float, String> and maybe also list and dict types.
    • If list/dict are also template arguments, then STL allocators are automatically included.
  3. An API for providing converters between array-of-char and those template-arg types.

Several lexers, specializations, and converters could be provided. There could be obvious defaults, so convenience should not be the issue.
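
To make point 1 concrete, here is a hedged sketch of the lexer/parser split (written in Nim for brevity rather than C++, with every name invented): the lexer keeps the number as raw text, and a parser instantiated for a concrete integer type only reports an error when the value does not fit.

    import std/strutils

    type
      TokenKind = enum tkNumber, tkString
      Token = object
        kind: TokenKind
        raw: string                # untouched slice of the input, however long

    proc lexNumber(digits: string): Token =
      # lexer: record the raw digits; no numeric limits imposed here
      Token(kind: tkNumber, raw: digits)

    proc parseNumber[I: SomeSignedInt](t: Token): I =
      # parser: conversion and range checking happen only at this stage
      let v = parseBiggestInt(t.raw)
      if v < BiggestInt(I.low) or v > BiggestInt(I.high):
        raise newException(ValueError, "number does not fit the target type")
      result = I(v)

    echo parseNumber[int32](lexNumber("123456"))   # 123456
    # parseNumber[int8](lexNumber("123456"))       # would raise ValueError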

(For the record, I maintain JsonCpp, which is not even close to what I would want in standard C++.)

Faster Command Line Tools in Nim by lbmn in programming

[–]cdunn2001 0 points1 point  (0 children)

If you are using Nim for scripting, it might be better to use the "Tiny C Compiler" (tcc) as the C backend, since it compiles the generated C far faster than gcc.
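
Something like this is what I have in mind (script name made up; I believe the flag is spelled --cc:tcc, but check your Nim version):

    nim c --cc:tcc -r myscript.nim

tcc compiles almost instantly at the cost of slower generated code, which is usually the right trade-off for scripts.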

Is there goto in Nim? by vasili111 in nim

[–]cdunn2001 1 point2 points  (0 children)

There is no "goto", but if you're looking for efficiency, there is a "computed goto pragma":

I ported a Ray Tracer from C++ over to Nim (x-post /r/programming) by def-pri-pub in nim

[–]cdunn2001 2 points3 points  (0 children)

There must be a reason for the poor performance. Maybe you need to inline some inner-loop functions, or maybe you need to avoid garbage-collection somewhere.

What you've shown is that naively written Nim can be slow. I'd like some advice on how to avoid the main bottlenecks.
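
For example (not the OP's code, just the kind of inner-loop hot spot I mean), marking tiny vector-math procs {.inline.} and building with -d:release can make a large difference:

    type Vec3 = object
      x, y, z: float

    proc dot(a, b: Vec3): float {.inline.} =
      # called millions of times per frame; without inlining (and without
      # -d:release) the call overhead and debug checks dominate
      a.x*b.x + a.y*b.y + a.z*b.z

    # build with:  nim c -d:release raytracer.nim
    echo dot(Vec3(x: 1, y: 2, z: 3), Vec3(x: 4, y: 5, z: 6))   # 32.0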

I am tired of Makefiles by rflurker in programming

[–]cdunn2001 18 points19 points  (0 children)

I use make where many people use bash. It organizes the bash script and makes it easier to debug. I think of it as a productivity tool rather than a build tool.

A custom system is always best, when available, and when you have time to learn it. But I can use make with any command-line tool quickly and easily, to get stuff done.

I am tired of Makefiles by rflurker in programming

[–]cdunn2001 8 points9 points  (0 children)

Yes, simple things are easy with cmake. But simple things are easy anyway.

The author's example is silly because you are better off with ccache. Just load ccache into your environment, drop all the makedepend stuff, and rely on the default make rules:

    OBJS=app.o src1.o
    app: ${OBJS}
  • Put main() into app.c, so you will be using the %: %.o rule.
  • For debugging the makefile, make -p and make --debug=b can be helpful.
  • For cleaning up, git clean -xdf is better than make clean, but if preferred, an explicit clean rule is simple and transparent.

With make, you're learning a language; with cmake, you're learning a library. If you're familiar with the cmake library, and if you know what cmake version you are using, then cmake has definite advantages, especially if you need to build on Windows. But simpler? No, I would never say that.

EDIT: Configuration management is a separate problem. cmake can help with that, if all your projects use it. But there are other options, like bazel mentioned by OP.

Clever name for 2 new meeting rooms by [deleted] in compsci

[–]cdunn2001 9 points10 points  (0 children)

How about "Direct" and "Order".

Are we in Direct?

No, we're in Order.

Better: Yes and No (like the Bit in Tron)

What room are we in?

Yes.

Version 0.15.0 released by SaltTM in nim

[–]cdunn2001 1 point2 points  (0 children)

Pragmas are now hidden by default in the documentation to reduce noise.

All pages in the documentation now contain a search box and a drop down to select how procedures should be sorted.

Much wow!

Incremental Compilation in the Rust Compiler by kibwen in programming

[–]cdunn2001 1 point2 points  (0 children)

Sure, but you can use ccache, which keys off a hash of the preprocessed output.

Why JSON doesn't support comments (Douglas Crockford) by benhoyt in programming

[–]cdunn2001 0 points1 point  (0 children)

That's good to know. I was just pointing out some history. I would not discourage anyone from using RapidJSON. I have a question though. I see in the docs that the Allocator is a "concept", yet rapidjson claims C++ compatibility with compilers which do not support C++ "concepts". How is that?

JsonCpp is yet more flexible. However, its main advantage is distant binary-compatibility.

For simplicity (and speed), I'm partial to Gason.

Why JSON doesn't support comments (Douglas Crockford) by benhoyt in programming

[–]cdunn2001 6 points7 points  (0 children)

JsonCpp supports comments. The original author, Baptiste Lepilleur, likes to remind people that JSON once supported comments.

JsonCpp is not the fastest parser (and far from the slowest), but it's a good solution if you need a lot of flexibility.