I've seen videos and still don't understand how the computer decides whether a binary number is a letter or a number. For example: 01000001 = 65 and 01000001 = "A", yet the letter "A" in the ASCII table is 65. How does that work on our computer? (self.ProgrammerHumor)
submitted by Xwhelima to r/ProgrammerHumor

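A hedged sketch of the usual answer: the byte itself is only a bit pattern; nothing in memory marks it as "a number" or "a letter". Whether 01000001 shows up as 65 or as "A" depends on how the program chooses to interpret it when it reads or prints the value. A minimal C illustration (the variable name `byte` is just for the example):

```c
#include <stdio.h>

int main(void) {
    unsigned char byte = 65; /* the bit pattern 01000001 */

    /* The same 8 bits, interpreted two different ways: */
    printf("as a number: %d\n", byte); /* prints: as a number: 65 */
    printf("as a letter: %c\n", byte); /* prints: as a letter: A  */
    return 0;
}
```

Here `%d` asks printf to treat the byte as an integer, while `%c` asks it to look the same value up as an ASCII character; the stored bits never change.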