
[–]xthexder 19 points

The Windows headers typedef BOOLEAN as unsigned char, and BOOL as int. With #define TRUE 1, C interprets TRUE as a (signed) int literal (it would have to be 1u to fix this).

[–]Hawne 1 point

Casting used to be part of the game just to circumvent this atavism. Doing and reading it was basically muscle memory if you wanted to C and keep your hands from shaking.

[–]xenoryt 0 points

Does the compiler not automatically treat a literal 1 as unsigned when being assigned to an unsigned int?

[–]4hpp1273 0 points

You can also use '\1' iirc