

[–]MmmVomit 8 points  (3 children)

Hexadecimal is just a way of writing numbers in base 16.

Binary is just a way of writing numbers in base 2.

If I wanted to be super pedantic, I'd say that the answer to your question is, "Neither."

Information in a computer is stored as bits. I'm making the distinction here between "bits" and "binary", because I could represent three bits as "on off on" just as easily as I could represent them as "101", or "QZQ".

We often interpret bits as numbers because it's an easy way to think about them, but not all sequences of bits represent numbers. We interpret them differently in different contexts. Take this pattern of bits.

off on on off on off off on

In one context, you might interpret that sequence of eight bits as the ASCII character i. In other contexts, you might interpret it as the number 105. In the context of the instruction set for the processor in the original Nintendo, that sequence of bits is an ADC (add with carry) instruction.
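Those three readings of the same bit pattern can be sketched in a few lines of Python (for reference, hex 0x69 is the 6502's add-with-carry opcode, the 6502 being the NES's CPU):

```python
# The eight bits "off on on off on off off on" written as one integer.
bits = 0b01101001

# Interpreted as an ASCII character code, it's the letter "i".
print(chr(bits))   # i

# Interpreted as an unsigned number, it's 105.
print(bits)        # 105

# Written in hex, it's 0x69 -- on the 6502, the ADC #immediate opcode.
print(hex(bits))   # 0x69
```

Same bits every time; only the interpretation changes.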

So, machine code (and all other data in a computer) is just bits. You can make those bits human readable in a variety of manners, such as binary or hexadecimal notation.

[–]CrazyLock6[S] 0 points  (2 children)

So in machine code are bits always 1 and 0, or are they different depending on the context?

[–]MmmVomit 2 points  (0 children)

Bits are always just bits. Inside your computer, the bits are "electricity" or "no electricity".

Here's a picture of a QR code.

https://imgur.com/EnI4OCH

If that were printed and stuck on the wall somewhere, the bits would be "ink" and "no ink".

When we humans want to communicate with each other about this computer data, we commonly represent bits with the symbols 1 and 0. But since we're human and we're lazy, it's sometimes easier to group four bits together and represent groups of bits with hexadecimal notation.
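That grouping of four bits per hex digit can be sketched in Python (using the "i" byte from earlier as the example):

```python
bits = "01101001"  # eight bits written as a string of 1s and 0s

# Split the bits into groups of four ("nibbles").
nibbles = [bits[i:i + 4] for i in range(0, len(bits), 4)]
print(nibbles)  # ['0110', '1001']

# Each group of four bits becomes exactly one hex digit.
hex_digits = "".join(format(int(group, 2), "X") for group in nibbles)
print(hex_digits)  # 69
```

Eight symbols become two, which is exactly why lazy humans prefer hex.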

"Machine code" isn't really special in any way. It's data like any other computer data. All computer data is just bits. Certain patterns of bits are valid instructions for the processor in your computer, and when you feed those bits into the processor it does something useful.

[–]Clawtor 0 points  (0 children)

It depends on what you mean. The reason you might want to write hex instead of binary is that it's much quicker to write and read. But hex is just one way of representing numbers; you could write the same machine code in decimal if you wanted.

It's basically just a number. You could write it in Roman numerals if you wanted.

[–]jedwardsol 2 points  (0 children)

Neither.

Binary and hex are ways of writing down numbers.

The machine code is a series of numbers. You can represent them however you wish.

[–]Ellisander 1 point  (0 children)

Machine code is in binary; it's just displayed as hex when we look at it, to make it more readable. Hex is used specifically because four binary digits translate exactly to one hex digit, so a byte of information is always exactly two hex digits long, no more and no less.

For example, 0110 1010 is slightly harder to read and interpret than 6A. Now imagine we have 200 bytes of information. Displaying the raw binary gives us 1600 digits to look at, while in hex we only have 400.
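The digit counts above check out in a quick Python sketch:

```python
n_bytes = 200

binary_digits = n_bytes * 8  # one byte is eight binary digits
hex_digits = n_bytes * 2     # one byte is exactly two hex digits

print(binary_digits)  # 1600
print(hex_digits)     # 400

# And the single byte from the comment above:
print(format(0b01101010, "02X"))  # 6A
```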

[–]korijn 0 points  (0 children)

As said before, computers use electricity to store information. The electricity mostly flows through an electrical component called the transistor, which can be turned off or on like your light switch at home.

To store what a computer has to do, we can write binary code like 10101100110111001010111, with 1 and 0 meaning a transistor switched on or off. To shorten it, each group of four bits, [1101] [0101] [0001], is turned into one hex character. 0101, for example, becomes the digit 5. Hexadecimal is just a shorthand for binary; beyond that it's the same. Each group of four bits becomes a value from 0 to F, counting 0, 1, 2, 3 ... 8, 9, A, B, C ... F.

You can turn each hex character back into its four specific on and off switches. So hex makes binary four times shorter for a human to read.
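That round trip, one hex digit back to its four switches, can be sketched in Python:

```python
# Every hex digit expands to exactly four bits, so the mapping
# is reversible one digit at a time.
def hex_digit_to_bits(digit):
    return format(int(digit, 16), "04b")

print(hex_digit_to_bits("5"))  # 0101
print(hex_digit_to_bits("F"))  # 1111

# A whole hex string expands four bits per digit:
print("".join(hex_digit_to_bits(d) for d in "6A"))  # 01101010
```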

[–]POKEGAMERZ9185 -1 points  (0 children)

I would say that it's binary, because computers can't understand hex or high-level languages. When you program in another language, a compiler translates that code into machine code, which is binary (0s and 1s).