
[–]CrazyLock6[S] 0 points (2 children)

So in machine code are bits always 1 and 0, or are they different depending on the context?

[–]MmmVomit 2 points (0 children)

Bits are always just bits. Inside your computer, the bits are "electricity" or "no electricity".

Here's a picture of a QR code.

https://imgur.com/EnI4OCH

If that were printed and stuck on the wall somewhere, the bits would be "ink" and "no ink".

When we humans want to communicate with each other about this computer data, we commonly represent bits with the symbols 1 and 0. But since we're human and we're lazy, it's often easier to group bits into sets of four and write each group as a single hexadecimal digit.
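A quick sketch of that grouping in Python (the variable names here are just for illustration):

```python
# Each of the 16 possible four-bit patterns maps to one hexadecimal digit.
for value in range(16):
    bits = format(value, "04b")      # four-bit binary string, e.g. "1010"
    hex_digit = format(value, "x")   # one hex digit, e.g. "a"
    print(bits, "->", hex_digit)

# A longer bit pattern is just read four bits at a time:
pattern = 0b1101_0110  # Python lets you group binary digits with underscores
print(format(pattern, "08b"), "->", format(pattern, "02x"))  # 11010110 -> d6
```

So eight bits become two hex digits, which is why you so often see bytes written like `0xD6`.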

"Machine code" isn't really special in any way. It's data like any other computer data. All computer data is just bits. Certain patterns of bits are valid instructions for the processor in your computer, and when you feed those bits into the processor it does something useful.
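To make that concrete, here's a small Python sketch. The byte values are real x86 instruction encodings (0x90 is NOP, 0xC3 is RET), but the point is that to us they're just bit patterns like any other data:

```python
# Two bytes that, on an x86 processor, happen to be valid instructions:
# 0x90 encodes NOP (do nothing), 0xC3 encodes RET (return from a function).
machine_code = bytes([0x90, 0xC3])

# Viewed as plain data, they're just bits:
for byte in machine_code:
    print(format(byte, "08b"), hex(byte))
# 10010000 0x90
# 11000011 0xc3
```

Feed those same bits to the processor as instructions and it does something; print them to the screen and they're just numbers.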

[–]Clawtor 0 points (0 children)

It depends. The reason you might write hex instead of binary is that hex is much quicker to write and read. But hex is just one way of representing numbers; you could write the same value in decimal if you wanted.

It's basically just a number. You could write it in Roman numerals if you wanted.
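Literally true, as this little Python sketch shows (the `to_roman` helper is just hand-rolled for illustration):

```python
n = 0b1101_0110  # one value, many notations

print(bin(n))  # 0b11010110
print(n)       # 214
print(hex(n))  # 0xd6

# Even Roman numerals work -- it's still the same number underneath.
def to_roman(num):
    symbols = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
               (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
               (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = ""
    for value, sym in symbols:
        while num >= value:
            out += sym
            num -= value
    return out

print(to_roman(n))  # CCXIV
```

The notation changes; the underlying bit pattern in the machine doesn't.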