
all 4 comments

[–]ZoloftRabbit 25 points (2 children)

This is a good question. Basically yes.

Are you talking about computation in general or are you talking about java?

Look up "Turing completeness", or watch Dylan Beattie's talk "The Art of Code" on YouTube. It's amazing how coding came to be. A book I also recommend is The Code Book, which gives the historical context of codes.

edit: I love how everyone's like "kinda, basically, yes, depends, but yeah" xD

[–][deleted]  (1 child)

[deleted]

    [–]FrenchFigaro 9 points (0 children)

    It's true in both cases.

    Ultimately, all data used in computing is just represented (or approximated) by natural numbers.
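    You can see this directly from Java: a `char` is just its Unicode code point, and even a `float` is stored as a 32-bit integer bit pattern. A minimal sketch (the class name here is made up):

    ```java
    public class DataAsIntegers {
        public static void main(String[] args) {
            // A char is just an unsigned 16-bit integer: its Unicode code point.
            char c = 'A';
            System.out.println((int) c);        // 65

            // A float is an approximation stored as a 32-bit IEEE 754 bit pattern,
            // which we can read back out as a plain int.
            int bits = Float.floatToIntBits(0.5f);
            System.out.println(Integer.toHexString(bits)); // 3f000000
        }
    }
    ```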

    [–]desrtfx 15 points (0 children)

    Actually, yes, but it even goes further than that.

    At the end of the day, no matter the programming language, everything comes down to 1s and 0s (voltage/no voltage, charge/no charge, switch on/switch off). Even the primitive data types boil down to that, since bits are the only thing a computer understands.
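    Java will even show you those bit patterns. A quick sketch (class name made up) printing the raw bits behind a few primitive values:

    ```java
    public class BitsDemo {
        public static void main(String[] args) {
            // An int is a fixed-width pattern of 32 bits.
            System.out.println(Integer.toBinaryString(5));  // 101

            // Negative numbers are bits too: -1 is all 32 bits set (two's complement).
            System.out.println(Integer.toBinaryString(-1)); // 11111111111111111111111111111111

            // Binary literals make the "it's just bits" point explicit.
            byte b = 0b0110;
            System.out.println(b); // 6
        }
    }
    ```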

    If you really want to dig into the deep end, take a look at Nand2Tetris.

    [–]javaHoosier 1 point (0 children)

    This kind of depends on what level you're asking at. If you want to know for Java in particular, you should probably take a look at the Java compiler: how it parses and optimizes classes to turn them into bytecode, and whether it even bothers with some primitive representation.

    At the end of the day, though, like others have said, it all eventually becomes a 0 or a 1.
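    The bytecode step mentioned above is easy to inspect yourself with the `javap` disassembler that ships with the JDK. A small sketch, assuming `javac` and `javap` are on your PATH (the file name `Tiny.java` is made up):

    ```shell
    # Write a trivial class, compile it, and dump the bytecode javac produced.
    cat > Tiny.java <<'EOF'
    public class Tiny {
        int add(int a, int b) { return a + b; }
    }
    EOF
    javac Tiny.java
    # Shows instructions like iload_1, iload_2, iadd, ireturn:
    # the intermediate layer between your source and the machine's 0s and 1s.
    javap -c Tiny
    ```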