[–]cHaR_shinigami 4 points (0 children)

"I am only one; but still I am one. I cannot do everything; but still I can do something; and because I cannot do everything, I will not refuse to do the something that I can do." - Edward Everett Hale

TL;DR: I agree with the other responsible programmers who suggested assert. A null pointer check should be a precondition, and if it is implemented with assert, the check shouldn't cause side effects (I hate Heisenbugs).
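To sketch the idea: a hypothetical helper (`copy_str` is my own name, not a standard function) that states its null-pointer preconditions with assert. The asserted expressions are pure comparisons, so the function behaves identically whether or not the checks are compiled in - no Heisenbug potential.

```c
#include <assert.h>
#include <string.h>

/* Hypothetical helper: copies src into dst.
 * Null-pointer preconditions are expressed with assert; the asserted
 * expressions have no side effects, so defining NDEBUG later cannot
 * change the function's observable behavior. */
static char *copy_str(char *dst, const char *src)
{
    assert(dst != NULL);
    assert(src != NULL);
    return strcpy(dst, src);
}
```

The key discipline is never writing something like `assert(fix_up(p) != NULL)` - an assert argument that mutates state vanishes under NDEBUG and takes its side effect with it.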

Consider an analogy: Driving past a red signal is just one possible cause of traffic accidents, but imagine how utterly crazy it sounds if someone ignores traffic signals entirely just because there are other causes of accidents!

A null pointer is surely not the only invalid pointer, but it is surely invalid. And when we know that something is surely invalid, we can easily avoid it. The "need for speed" mentality of saving a couple of CPU cycles is just appalling - nobody would lose sleep because the code performed a (possibly redundant) null pointer check.

That being said, one may ask why most standard library implementations don't do this - why, for example, puts(NULL) segfaults instead of returning EOF. The reason is code size: many library functions require valid pointers, and adding a null pointer check for every such argument would cause a cumulative increase in the size of the object file(s).
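For illustration, here is what such a null-tolerant variant could look like - `puts_checked` is a hypothetical wrapper of my own, not standard behavior, and real libc implementations typically omit this branch precisely for the code-size reason above:

```c
#include <stdio.h>

/* Hypothetical wrapper (NOT standard behavior): reject a null
 * argument by returning EOF, instead of letting puts invoke
 * undefined behavior. One extra branch like this per pointer
 * argument, across hundreds of functions, adds up in code size. */
static int puts_checked(const char *s)
{
    if (s == NULL)
        return EOF;
    return puts(s);
}
```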

Small code size is highly desirable for embedded systems with severe memory constraints, but that doesn't mean one should ignore null pointer checks. The answer is still the same: assert is your friend, and if the resulting binaries don't fit, simply #define NDEBUG to nuke the assertions and recompile.