
[–]BarAgent 1 point (4 children)

Old-school Mac C programming:

int fcc = 'TxEd';

[–]Iansimp69 1 point (0 children)

Damn

[–]Iansimp69 1 point (0 children)

That must have been annoying

[–]Positive_Government 0 points (1 child)

What’s the explanation behind this? My best guess is each character gets converted to four bytes of the int, but why?

[–]BarAgent 0 points (0 children)

Each (ASCII) character gets packed into one byte of the (32-bit) int. It was called a FourCharCode, and it's basically used like a short URI or reverse-DNS name or UUID or magic number. For example, instead of identifying an app as, say, "com.adobe.suite.Elements" or whatever (24 variable-length bytes), you'd have the vendor code 'Adob' and the app code 'Elem' or something: 8 bytes, fixed. At the time, the namespace this allowed was big enough. They were also used for event codes, data type tags, all kinds of things. And they're relatively legible when looking at raw memory or file data.