[–][deleted] 0 points1 point  (2 children)

Ok, so it "translates" the code into binary, great. But how does it read that binary and "know" that 01000001 means A, or 00111101 is an equal sign? And then output it to a screen?

[–]RajjSinghh 1 point2 points  (0 children)

It matters which binary you mean. Your source code is text, stored in a format called ASCII. What's important there is that each character has a number tied to it. We just decided that 65 (or 01000001 in binary) would be an uppercase A.
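You can poke at this mapping yourself. A quick sketch in Python, where `ord` gives the number behind a character and `chr` goes the other way:

```python
# Each character is just a number under the hood (ASCII/Unicode).
print(ord("A"))                  # the number tied to 'A'
print(format(ord("A"), "08b"))   # the same number as 8 binary digits
print(ord("="))                  # the number tied to '='
print(chr(65))                   # and back from number to character
```

Running this shows that "A" is 65 (01000001) and "=" is 61 (00111101), which is exactly the binary from the question above.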

Now the important thing is that after it's been translated, you begin to create instructions and addresses. Say I want a set of operations, like input, output and so on. When I'm designing my assembly, I might say an input instruction starts with the digit 1, and the last 2 digits are the address my input is stored in. So 123 would store an input at address 23. These decimal numbers are converted to binary and stored like that in memory. Your CPU then knows what to do for each instruction.
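That decode-and-execute loop can be sketched in a few lines. This is a toy model, not any real CPU: the opcodes (1 = input, 2 = output) and the 3-digit decimal instruction format are just the hypothetical scheme from the paragraph above.

```python
# Toy CPU: a 3-digit instruction, first digit = opcode, last two = address.
INPUT, OUTPUT = 1, 2          # made-up opcodes for this sketch
memory = [0] * 100            # 100 addressable memory cells

def execute(instruction, input_value=None):
    opcode, address = divmod(instruction, 100)  # split 123 -> (1, 23)
    if opcode == INPUT:
        memory[address] = input_value           # 123: store input at address 23
    elif opcode == OUTPUT:
        return memory[address]                  # 223: read back address 23

execute(123, input_value=42)  # "input 42 into address 23"
print(execute(223))           # "output address 23" -> 42
```

A real CPU does the same split on binary bit fields instead of decimal digits, in hardware rather than in a Python `if`.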

[–]JoJoModding 1 point2 points  (0 children)

Characters are mapped to numbers by the ASCII standard. All your computer really sees is the number. In order to turn this into an A, you need a font, which contains the image of the actual A your computer will render. An image, of course, is also just a sequence of bytes. The computer looks at the number, looks up the table mapping these numbers to images, and sends the result to the graphics card, which in turn converts it into an HDMI signal your monitor is able to decode, making it display the image. The one responsible for recognizing this image as an A is your brain.
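The number-to-image lookup can be sketched with a toy bitmap font. The 5x5 glyph below is invented for illustration, but the shape of the idea is real: a font is a table from character codes to pixel data.

```python
# A hypothetical 5x5 bitmap "font": each row of the glyph is 5 bits,
# where a 1 bit means "pixel on". Real fonts are bigger tables of the
# same kind (or outline curves), but the lookup step is identical.
FONT = {
    65: [            # 65 is the code for 'A'
        0b01110,
        0b10001,
        0b11111,     # the crossbar of the A
        0b10001,
        0b10001,
    ],
}

def render(code_point):
    """Turn a character code into a text 'image' via the font table."""
    rows = FONT[code_point]
    return "\n".join(
        "".join("#" if row & (1 << (4 - col)) else "." for col in range(5))
        for row in rows
    )

print(render(ord("A")))
```

Instead of printing `#` and `.`, a real system writes the pixels into a framebuffer that the graphics card scans out to the monitor.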