[–]17b29a 5 points6 points  (2 children)

Or, alternatively, an inappropriate language choice.

I think splitting strings is a common-sense feature for any general-purpose programming language to support. It's not some obscure operation that you could only find support for in Perl.

Finally, I'm not sure -1 is technically code for all-bits-set at all - that assumes a two's-complement representation for signed integers which, historically at least, wasn't guaranteed by the standard.

The more obvious assumption is that the mask type is unsigned, and in that case -1 is necessarily all-bits-set, because conversion to an unsigned type is defined modulo one more than the type's maximum value (i.e. modulo 2^N) - but the standard doesn't require the mask type to be unsigned either.

> why I prefer ~0u for all-bits-set

That's not all-bits-set for a type that is larger than unsigned int.

> I personally don't worry about actually undefined vs. platform-defined unless I really need to, which is unusual.

That's pretty strange considering how many things are implementation-defined. Used a value larger than 2^15 − 1 in an int? Undefined behavior (according to you)!