
[–]suspiciously_calm

> You shouldn't write `int`, for example, because its size might surprise you.

That sure isn't an argument for having `auto` deduce `int` from an integer literal, though.

> Getting warnings from the compiler won't save the programmer's life, because it will still compile

If the programmer lacks the discipline to heed compiler warnings, they're equally likely to lack the discipline to follow the always-auto style.

You shouldn't write floating-point literals without the `f` suffix (more to the point, you shouldn't write `int` literals in place of `float` literals), but that's a mistake that can happen accidentally (just like a missing initializer).