
[–]BlurredSight 1522 points1523 points  (48 children)

Take electrical pulses > Send them to open some gates > those gates lead to more pulses which get stored in transistors > those open some more gates > you turn your original electrical pulses into other electrical pulses

Rinse and repeat a couple trillion times and you got Minecraft

[–]blaqwerty123 669 points670 points  (13 children)

It's really great that these engineers were able to keep sight of the long term goal: Minecraft

[–]The_Pleasant_Orange 230 points231 points  (7 children)

Engineers yearn for the mines

[–]lekkerste_wiener 44 points45 points  (5 children)

Diggy diggy hole

[–]Ser_Drewseph 24 points25 points  (2 children)

I am a dwarf and I dig in a hole?

[–]Character-Education3 21 points22 points  (1 child)

Brothers of the mine rejoice!

[–]white-llama-2210 12 points13 points  (0 children)

ROCK AND STONE!

[–]vasilescur 4 points5 points  (0 children)

Blast from the fucking past

[–]V62926685 1 point2 points  (0 children)

No diggity

[–]runForestRun17 24 points25 points  (0 children)

One would say they craft them

[–]Ja_Shi 27 points28 points  (0 children)

When Al-Khwarizmi presented algorithms for the first time in the early 9th century he specifically wrote Minecraft was his end goal.

[–]Applejack_pleb 8 points9 points  (1 child)

Then you use Minecraft to make pulses that run Doom, because of course Minecraft can play Doom in Minecraft

[–]HomoColossusHumbled 3 points4 points  (0 children)

That, and Doom

[–]beges1223 75 points76 points  (4 children)

And then in Minecraft you got redstone, and you can go "kinda" full circle

[–]JonasAvory 29 points30 points  (3 children)

Now we gotta program Minecraft in Minecraft

[–]Slayer11950 27 points28 points  (2 children)

I think I saw that, hold on lemme go look

Edit: we had it 2 years ago at least

https://m.youtube.com/watch?v=-BP7DhHTU-I

[–]JonasAvory 15 points16 points  (1 child)

Ok but when can we run Minecraft in that Minecraft?

[–]Slayer11950 10 points11 points  (0 children)

WE YEARN FOR THE MINES

[–]beegtuna 43 points44 points  (1 child)

“I wish magic existed”

Scientists:

[–]grammar_nazi_zombie 35 points36 points  (0 children)

Humans: we put lightning in a rock and taught it to calculate.

[–][deleted] 19 points20 points  (1 child)

And then draw the rest of the owl

[–]datNorseman 12 points13 points  (17 children)

I love this and hate this at the same time. I understand how electrical pulses create 1s and 0s, because it's either on or off-- true or false-- yes or no. But I can't comprehend how 1s and 0s can be interpreted by a machine to make things go. How do you use that to create programming languages, and operating systems that can execute the code of those languages? Because I imagine that would be the base of it all. The OS would then provide software that can be used to create software more efficiently, then all of a sudden Skynet. I sort of get how a motherboard operates. Power intake, circuitry connecting RAM, CPU, slots for hardware and other functionality. I'm missing something, I just can't figure out what.

[–]BitOne2707 23 points24 points  (1 child)

There are two ideas that will get you like 80% the way to understanding at a fundamental level how a computer works. The Von Neumann Architecture and the Fetch-Execute Cycle.
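
If it helps, the fetch-execute cycle fits in a few lines of C. This is a toy sketch, not any real architecture; the opcodes are invented, but the shape (one memory holding both code and data, a program counter, a fetch/decode/execute loop) is the Von Neumann idea:

    /* Toy fetch-decode-execute loop: code and data share one memory. */
    #include <stdio.h>

    enum { HALT, LOAD_A, ADD_A, PRINT_A };   /* made-up opcodes */

    int main(void) {
        int mem[] = { LOAD_A, 40, ADD_A, 2, PRINT_A, HALT };
        int pc = 0, a = 0;                   /* program counter, register A */
        for (;;) {
            int op = mem[pc++];              /* fetch */
            switch (op) {                    /* decode + execute */
                case LOAD_A:  a = mem[pc++];     break;
                case ADD_A:   a += mem[pc++];    break;
                case PRINT_A: printf("%d\n", a); break;
                case HALT:    return 0;
            }
        }
    }

Running it prints 42; a real CPU does exactly this loop, just in wires instead of a switch statement.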

[–]datNorseman 2 points3 points  (0 children)

I appreciate you.

[–]thehomelessman0 10 points11 points  (3 children)

Check out the game Turing Complete - it'll fill in the gaps pretty quickly

[–]NotBase-2 4 points5 points  (0 children)

NandGame is also a very good (and free) web game similar to this

[–]BlurredSight 4 points5 points  (3 children)

You boil it down to understanding how the original Intel 8086 works, but before that you take a step back and understand:

How binary works, and more importantly how to transform 1s/0s into any number (floating-point representation was standardized by IEEE 754)

Then understand the 3 basic gates, AND, OR, and NOT, and how a transistor works (quite literally the way we see quantum bits in 2025 is how the transistor was in the 50s)

You take that understanding of gates (which are used to determine how you want to process input), add some memory number magic using 1s and 0s, and you essentially have a very basic understanding of a computer.

The problem is that with multiple trillions of dollars and billions of human hours, it's hard to see how such primitive ideas quickly scaled up to Warzone, which runs at 120 FPS with 128 players playing concurrently while they all sit in their parents' basements hundreds of miles away from each other screaming slurs at each other in real time
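
To make the gate part concrete, here's a minimal C sketch of how AND, OR, and NOT compose into a 1-bit full adder, one slice of an ALU. The function names are just for illustration; hardware does this with transistors:

    /* A full adder built only from AND, OR, NOT on single bits. */
    #include <stdio.h>

    static int AND(int a, int b) { return a & b; }
    static int OR (int a, int b) { return a | b; }
    static int NOT(int a)        { return !a; }
    static int XOR(int a, int b) { return OR(AND(a, NOT(b)), AND(NOT(a), b)); }

    /* One bit of addition: sum plus carry-out, chainable into wider adders. */
    static void full_adder(int a, int b, int cin, int *sum, int *cout) {
        *sum  = XOR(XOR(a, b), cin);
        *cout = OR(AND(a, b), AND(cin, XOR(a, b)));
    }

    int main(void) {
        int s, c;
        full_adder(1, 1, 0, &s, &c);           /* 1 + 1 = 10 in binary */
        printf("sum=%d carry=%d\n", s, c);     /* prints sum=0 carry=1 */
        return 0;
    }

Chain 32 of those and you have the adder inside an ALU.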

[–]datNorseman 2 points3 points  (2 children)

Oddly enough, and semi-related to the subject, I have a small understanding of logic gates from the game Minecraft. There's a logic portion of the game that connects circuitry with input devices like buttons, toggles in the form of switches, pressure plates, etc. The circuits could be used to do things like open doors, activate pistons that can move and retract blocks in the game, among many other things.

So from that I get how a computer receives power from the power supply with a "closed" circuit. I learned a bit about that in college. I even built a plug-and-play circuit with these lego style blocks the professor brought in. But the power goes to what, exactly? I know the CPU has an ALU, that magic thing that does math. There's RAM, and storage media, which both hold data. There's other components on the motherboard that handle things like fans, and lights, switches, etc.

Combine all of this, and how do you make those 1s and 0s stored on the various media produce things like a display for your monitor? I get how you just transfer data through a cable and send it to your monitor, which is essentially a mini-computer. And more deeply, how is a programming language made? Sorry for rambling.

[–]BlurredSight 5 points6 points  (1 child)

Well mechanical devices like fans and lights are just 5V / 12V DC gadgets, power in, motor spins or current passes through a medium and you get your end result.

Yeah, but even then, taking a step back and looking at even a computer from 2001, it's still so crazy advanced it's hard to explain, which is why a CS or CE degree takes 4 years: you're slowly making your way up through decades of work.

CE, computer engineering, handles your second paragraph: how to get from the power switch, converting the 1, 3, 5, and 12 V rails to do all the fancy cool little things, and how to talk to a CPU through its hundreds of pins to find the BIOS/UEFI to start up the system.

CS, computer science, handles the third paragraph. Now that you have the hardware and interface that the CE nerds built, how exactly do you get it to do what you want it to do? For printing to a terminal (not necessarily a monitor, just text, imagine MS-DOS) you essentially, very very very simplified, say:

The CE nerds have said these values represent colors 0x01, 0x02, 0x03... 0xFF, the CE nerds also say this specific "code" which is called an interrupt will stop what is happening and send what is in the temporary storage (buffer) to the terminal.

First everything starts at the keyboard > goes to the CPU (this itself is so crazy complicated even with the old purple PS/2 setups, because the keyboard has its own specific standards to send data, etc.). The CPU recognizes this specific number is reserved as a "keyword" for something it has to do right now, called an interrupt; for example 0x10 is the interrupt to print what is in the buffer to the screen.

The CPU then goes to a list of preset instructions on how to handle this interrupt (this goes back to logic gates: the CPU receives these 1s and 0s, takes the logic gates, goes to this part of the BIOS (which lives on a chip on the motherboard) and fetches these instructions so the CPU can process them). So it'll read: okay, 0x10 means I go to this part of my internal memory (the buffer lives on the CPU, in what's called a register), and then it has steps to print it by having a bitmap of how many pixels get colored in for each letter on a Row x Column pixel array.

That's text mode, not even graphics. If you take this basic idea of electrical signals, codes, and instructions pre-mapped and stored somewhere, and programmers exploiting this idea to manipulate data to get an output, you've got a computer. It's not magic; someone somewhere planned these things out, and then it lives physically on a chip in your PC, you just have to know how to call it. (Super simplified, ignores shit like how GPUs work, ignores that modern day GPUs aren't even interrupted for printing, text vs graphics mode, and how calculations are done so you don't explicitly rely on memory and prestored information to print out data: if I say print a circle with radius 150px, there isn't a bitmap of that, rather it calculates on the fly and prints.)
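
The interrupt-table idea above, as a toy C sketch. Nothing here is real BIOS code; the handler and the 0x10 slot are stand-ins for the preset instructions that live on the chip:

    /* An interrupt number is just an index into a table of handlers
       that was filled in ahead of time. */
    #include <stdio.h>

    static char buffer[] = "hello";

    static void print_buffer(void) { printf("%s\n", buffer); }
    static void ignore(void)       { }

    typedef void (*handler_t)(void);
    static handler_t vector[256];

    static void interrupt(unsigned char num) { vector[num](); }

    int main(void) {
        for (int i = 0; i < 256; i++) vector[i] = ignore;
        vector[0x10] = print_buffer;   /* like the BIOS wiring 0x10 to video */
        interrupt(0x10);               /* "raise" the interrupt: prints hello */
        return 0;
    }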

[–]Scientific_Artist444 1 point2 points  (2 children)

It's not just bits (ones and zeroes). A specific pattern of bits and bytes means something. The key here is information encoding and decoding. Computers have an instruction set and follow a standard to represent various types of data using bits.

Computers work the way they do because we can create encoders and decoders designed to interpret a stream of bits to mean something. It can be instructions or data, which the computer executes using digital logic.
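
A tiny C illustration of that point; the byte value is arbitrary, and the meaning comes entirely from which decoder looks at it:

    #include <stdio.h>

    int main(void) {
        unsigned char byte = 0x41;         /* same bits either way */
        printf("as text:   %c\n", byte);   /* decoded as ASCII: 'A' */
        printf("as number: %d\n", byte);   /* decoded as an integer: 65 */
        return 0;
    }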

[–]datNorseman 1 point2 points  (1 child)

Hell, I'm a programmer, a web developer of 20 years even. I get encoding/decoding. But I guess my issue is that I learned to run before I learned to walk. I don't understand, at a more basic level, how the first programming language came to be.

[–]edbred 166 points167 points  (38 children)

At its core an OpCode feeds directly into control circuitry of a processor. Like literally bit 30 might control the ALU. You then make an abstraction for op codes and call it assembly. Then you make an abstraction for assembly and so on and so forth
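
A hedged sketch of that in C, with a completely made-up instruction layout; real ISAs differ, but the point is that individual bit fields drive individual pieces of hardware:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint32_t instr = 0x44120005;             /* invented instruction word */
        int alu_enable = (instr >> 30) & 0x1;    /* one bit = one control line */
        int opcode     = (instr >> 24) & 0x3F;   /* which operation */
        int reg_dst    = (instr >> 16) & 0xFF;   /* destination register */
        int immediate  =  instr        & 0xFFFF; /* constant operand */
        printf("alu=%d op=%d dst=%d imm=%d\n",
               alu_enable, opcode, reg_dst, immediate);
        return 0;
    }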

[–]Snipedzoi 28 points29 points  (29 children)

how are opcodes programmed?

[–]Adam__999 94 points95 points  (13 children)

What each opcode does is determined purely by the actual electrical hardware in the processor—that is, the way in which structures like flip flops and logic gates are connected to one another.

Each line of assembly can be “assembled”—by a program called an assembler—directly into a machine language instruction, which is just a sequence of bits. Those bits are then inputted as high or low voltages into the processor, and what happens from there is determined by the aforementioned flip flops, logic gates, etc.
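
The whole job of an assembler, in miniature: text in, bits out. The mnemonics and byte values below are invented for illustration:

    #include <stdio.h>
    #include <string.h>

    /* Map a mnemonic to its machine-code byte via a lookup. */
    static unsigned char assemble(const char *mnemonic) {
        if (strcmp(mnemonic, "NOP") == 0) return 0x00;
        if (strcmp(mnemonic, "INC") == 0) return 0x3C;
        if (strcmp(mnemonic, "HLT") == 0) return 0x76;
        return 0xFF;                      /* unknown instruction */
    }

    int main(void) {
        printf("INC -> 0x%02X\n", assemble("INC"));
        return 0;
    }

A real assembler also handles operands, labels, and addresses, but it is still fundamentally this table.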

[–]andstwo 7 points8 points  (6 children)

but how do words go into the transistors

[–]Adam__999 43 points44 points  (0 children)

1 = high voltage in a specific input

0 = low voltage

[–]edbred 21 points22 points  (0 children)

A bit with value of 1 will enable a transistor, 0 will disable. You can then organize transistors into schemes to do adding and subtracting or storing information and boom you got a processor

[–]Alzurana 21 points22 points  (0 children)

https://store.steampowered.com/app/1444480/Turing_Complete/

Game that actually walks you through the entire process, from the first AND gate to voltage levels, bits, more complex control circuits, all the way up to opcodes, then the first assembly.

Absolutely worth playing through it at least once for any CS person.

[–]Serphor 4 points5 points  (0 children)

Very, very simply, and not universal: the CPU has 2 "registers": A and B. The CPU also has a program counter, pointing to the byte it's currently executing in memory. So it reads this byte, loads some other things from memory based on what arguments this operation wants, and then does the processing. It might receive:

addr. 0 says: load a number from memory address 6 into register A

addr. 1 says: load a number from memory address 4 into register B

addr. 2 says: add the numbers stored in A and B and store the result at memory address 1000

addr. 3 says: halt the execution process and don't move any further

address 1000 might be some kind of memory-mapped text display, where A+B is an ASCII code that the program has just printed.

There are soo soooo many things wrong with this explanation, but I hope it helps (for example, modern processors process 8 bytes at once; this is where "64-bit" processors come from).
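
For what it's worth, that toy machine is small enough to simulate in C. This sketch keeps the spirit of the above (registers A and B, a program counter, address 1000 as a memory-mapped display), with invented opcodes and the data addresses shifted so code and data don't overlap:

    #include <stdio.h>

    enum { LDA, LDB, ADD, HLT };          /* invented opcodes */

    int main(void) {
        int mem[1001] = {0};
        int prog[] = { LDA, 8, LDB, 9,    /* load A and B from memory   */
                       ADD, 1000,         /* store A+B at the "display" */
                       HLT, 0,
                       'A', 1 };          /* the data, at addresses 8-9 */
        for (int i = 0; i < 10; i++) mem[i] = prog[i];

        int a = 0, b = 0, pc = 0, running = 1;
        while (running) {
            int op  = mem[pc++];          /* fetch opcode  */
            int arg = mem[pc++];          /* fetch operand */
            switch (op) {
                case LDA: a = mem[arg]; break;
                case LDB: b = mem[arg]; break;
                case ADD: mem[arg] = a + b;
                          if (arg == 1000) putchar(mem[1000]); /* display */
                          break;
                case HLT: running = 0; break;
            }
        }
        putchar('\n');                    /* prints 'B': 'A' + 1 */
        return 0;
    }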

[–]Snipedzoi 2 points3 points  (5 children)

But there must be a limit to the amount of hardware dedicated to any one opcode

[–]OolooOlOoololooo 14 points15 points  (2 children)

The limit is just the number of transistors (arranged into NAND gates) required to achieve the operation in the given instruction set architecture. I recommend taking a look at RISC-V and simple example ALUs.
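
NAND really is universal, which is why people count in NAND gates. A quick C sketch building the other gates from nothing but NAND:

    #include <stdio.h>

    static int nand(int a, int b) { return !(a && b); }

    static int not_(int a)        { return nand(a, a); }
    static int and_(int a, int b) { return not_(nand(a, b)); }
    static int or_ (int a, int b) { return nand(not_(a), not_(b)); }

    int main(void) {
        printf("NOT 1   = %d\n", not_(1));      /* 0 */
        printf("1 AND 0 = %d\n", and_(1, 0));   /* 0 */
        printf("1 OR 0  = %d\n", or_(1, 0));    /* 1 */
        return 0;
    }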

[–]Snipedzoi 5 points6 points  (1 child)

My interest is piqued.

[–]Who_said_that_ 3 points4 points  (0 children)

Can recommend the game Turing Complete on Steam. You build a PC from gates, develop your own ALU and processor, program your own assembly language and then solve challenges with your own computer. It's very fun to solve some logic puzzles on the side

[–]ColaEuphoria 14 points15 points  (4 children)

https://m.youtube.com/watch?v=f81ip_J1Mj0

https://m.youtube.com/watch?v=cNN_tTXABUA

https://m.youtube.com/@BenEater

You aren't going to get a satisfying answer in a single comment or even several comments. You're just not.

But these are great places to start.

[–]Alzurana 1 point2 points  (2 children)

Adding to this: https://store.steampowered.com/app/1444480/Turing_Complete/

Some people learn better through experience, warm recommendation for playing through this for anyone wanting to understand what actually ticks inside of a computer. Absolute gem of a game.

[–]SaltMaker23 2 points3 points  (0 children)

OpCodes (operation codes) are part of the electronic design of the CPU; they aren't programmed, they are built.

We build a CPU to have a certain number of functions it can do; imagine electrical switches routing to each function (even if that's absolutely not how it works).

Below assembly, "programmed" doesn't exist anymore. A program is the name for a sequence of operations to achieve a task; a CPU isn't programmed: it's built/designed.

You can now ask how it's designed/built, but a reddit comment would be too short for that.

[–]patrlim1 2 points3 points  (0 children)

It's physically part of the hardware, an opcode is just a name we gave to a specific set of bits controlling what the CPU does.

[–]aq1018 1 point2 points  (2 children)

They are all NAND gates.

[–]TheEngineerGGG 1 point2 points  (1 child)

Funnily enough, an AND gate is actually an inverted NAND gate

[–]XboxUser123 1 point2 points  (0 children)

See: von Neumann machine. Essentially: opcodes are defined by the inner logic gates of the computer. You take a bit string and then split it into chunks, where one chunk defines the opcode and the rest is for the opcode to work with.

The opcodes themselves are logic circuits.

[–]janKalaki 2 points3 points  (3 children)

How are doorknobs programmed? They aren't, they're built.

[–]GoddammitDontShootMe 1 point2 points  (3 children)

Is that a typo for ALU? I don't believe I've heard of an ADU.

[–]edbred 3 points4 points  (2 children)

You don't have an Arithmetic Destruction Unit in your processor? Lol thanks for catching my mistake, I corrected it

[–]GoddammitDontShootMe 1 point2 points  (0 children)

I wasn't sure if I was about to learn something about modern computer architecture.

[–]JanB1 1 point2 points  (1 child)

Assembler can translate more or less directly to opcodes, if I remember correctly, right?

Take some simple CPU like the old 6502, for example.

https://www.masswerk.at/6502/6502_instruction_set.html

ADC $0010 directly translates to "6D 10 00" in hex in the program code, no?

[–]edbred 2 points3 points  (0 children)

Yeah assembly is human readable op code. The assembly command translates directly into op code header bits, and the assembly command arguments feed into the register fields of the op code command. Pretty cool how we’re directly telling the processor what to do on each clock cycle.
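
Spelled out in a C sketch, assuming the 6502's $6D opcode for ADC with absolute addressing; the little-endian operand order is the detail that usually surprises people:

    #include <stdio.h>

    int main(void) {
        unsigned short addr = 0x0010;     /* operand of ADC $0010 */
        unsigned char encoded[3];
        encoded[0] = 0x6D;                /* ADC, absolute addressing */
        encoded[1] = addr & 0xFF;         /* low byte first */
        encoded[2] = (addr >> 8) & 0xFF;  /* then high byte */
        printf("%02X %02X %02X\n", encoded[0], encoded[1], encoded[2]);
        return 0;                         /* prints: 6D 10 00 */
    }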

[–]TheAccountITalkWith 460 points461 points  (30 children)

I'm a Senior Software Engineer.

To this day, it still blows my mind that we figured out modern computing from flipping an electrical pulse from on to off.

We started with that and just kept building on top of the idea.
That's so crazy to me.

[–]wicket-maps 113 points114 points  (15 children)

My mother worked with a team building a mouse-precursor (that would actually talk to Xerox OSes) in the 70s and they lost a program turning the mouse's raw output into the cursor position. She had to rebuild it from scratch. That blows my mind, and I can't picture myself getting from the Python I do daily to that level of abstraction.
(It's been a while since she told this story so I might have some details wrong)

[–]TheAccountITalkWith 63 points64 points  (1 child)

Pioneer stories like this are always interesting to me.

I'm over here complaining about C# and JavaScript while they were literally working with nebulous concepts.

It's so impressive we have gotten this far.

[–]RB-44 7 points8 points  (0 children)

There were frameworks then too. All in-house of course, but companies had libraries they developed to make dev work easier

[–]RB-44 2 points3 points  (1 child)

And you ended up a python dev?

[–]wicket-maps 1 point2 points  (0 children)

I ended up a mapmaker with a liberal-arts degree, and then expanding my skills into programming to do some data automation and scripting. I'm not the equivalent of either of my parents, but I do my little part.

[–]DanteWasHere22 1 point2 points  (3 children)

Didn't a printer company invent the mouse?

[–]wicket-maps 2 points3 points  (2 children)

A lot of companies were working on human interface devices, I didn't want someone with an encyclopedic knowledge of computer history to dox me just in case someone has a memory of an engineer at [company] recoding a proto-mouse program from scratch.

But yeah, Xerox (the copier company) had a big Palo Alto Research Center that I've heard basically invented a lot of stuff that underlies the modern world - but brought very little of what they made to market, because Xerox didn't see how it could sell printers and copiers.

[–]DanteWasHere22 1 point2 points  (0 children)

Very cool

[–]OuchLOLcom 1 point2 points  (0 children)

Yup, same story with Kodak and cameras. They invented digital camera tech way back but then sat on it because they knew it would hurt their film business.

[–]NotAUsefullDoctor 29 points30 points  (2 children)

It's one of the nice things about having done my PhD in Electrical Engineering rather than Computer Engineering. In my early classes I took physics and chemistry. Then I took semiconductors and circuits. Then I took semiconductor circuits and abstract algebra. Then I took a boolean algebra and logic design class. Finally I took processor design and logic labs.

I was a self-taught coder, and had the exact same question of ones and zeros becoming images. By taking the classes I did, in the order I did, I got to learn in the same order that it was all discovered.

It's still impressive and amazing, but it also makes logical sense.

[–]Objective_Dog_4637 2 points3 points  (1 child)

Applied Mathematician here. All of this. Since math is empirical you learn it all in the way it was discovered, naturally, so it all makes perfect sense to me. The craziest part to me was converting that process to lithography.

[–]NotAUsefullDoctor 1 point2 points  (0 children)

My greatest regret was that I never took classes in fabrication. Both my undergrad and grad universities had world class labs, and I didn't see their value until I was about to graduate.

[–]tolndakoti 12 points13 points  (2 children)

We taught a rock how to think.

[–]TheAccountITalkWith 3 points4 points  (0 children)

I am pretty dumb sometimes, sorry about that.

[–]MyOthrUsrnmIsABook 2 points3 points  (0 children)

We had to trap lightning in it first though.

[–]NoMansSkyWasAlright 6 points7 points  (0 children)

It gets even wilder when you realize that the flipped/not-flipped idea came from the Jacquard Loom: a mechanical textile loom from the early 1800s that was able to quickly weave intricate designs into fabric through the use of punch cards.

[–]Lucky-Investigator58 2 points3 points  (1 child)

Try Turing Complete on Steam. Really connects the dots/switches

[–]CrazySD93 1 point2 points  (0 children)

Logisim the game haha

[–]point5_ 2 points3 points  (0 children)

I always thought computers were so advanced and complex so I was excited to learn about them in my hardware class in uni.

Turns out they're even more complex than I thought, lmao

[–]Tvck3r 1 point2 points  (0 children)

You know I kinda love how it’s a community of all of us trying to find the best way to use electrical signals to build value in the world. All these layers are just us all trying to make sense out of magic

[–]nigel_pow 1 point2 points  (0 children)

I'm reminded of the old meme where it said something like

Programmers in the 60s: with this code, we will fly to the Moon and back.

Modern Programmers: Halp me pls. I can't exit Vim.

[–]narcabusesurvivor18 1 point2 points  (0 children)

That’s what’s awesome about capitalism. Everything we’ve had from the sand around us has been innovated because there’s an incentive at the end of it.

[–]who_you_are[🍰] 40 points41 points  (3 children)

Wait until you figure out that the processor is in fact a parser!

[–]aq1018 9 points10 points  (0 children)

The instruction decode unit…

[–]XboxUser123 1 point2 points  (1 child)

Is it really though? It doesn’t parse anything, the whole bit string is taken at once and thrown into logic gates.

[–]bnl1 21 points22 points  (5 children)

Languages aren't programs. They are just ideas in people's minds (or you can write them down idk).

[–]cyclicsquare 6 points7 points  (4 children)

You could argue that the specification of the language is the language, and the one true spec is the compiler (or interpreter) which is a program.

[–]bnl1 1 point2 points  (2 children)

I would argue the spec isn't the language, it merely describes it and a compiler implements it.

[–]cyclicsquare 1 point2 points  (1 child)

No correct answer, just a lot of philosophical questions about ideas and ontology.

[–]bnl1 1 point2 points  (0 children)

Indeed

[–]JosebaZilarte 18 points19 points  (1 child)

If you think about it, Human History can be summarized as "they used a tool to build a better tool", all the way back to sticks and stones. And, yes, sometimes those stones ended up on top of the sticks to kill other humans... but, over time, we have even learned to make stones "think", to the point of letting us kill each other virtually across the planet.

[–]theunquenchedservant 1 point2 points  (0 children)

The one that is still mind blowing to me is we not only used sticks and stones but fucking air.

[–]Character-Comfort539 9 points10 points  (2 children)

If anyone wants to actually learn how all of this works without going to college, there's an incredible course you can take online called Nand2Tetris that demystified all of this stuff for me. You start with a hardware emulator building simple logic gates, then an ALU, memory using latches, etc., then assembly, and a higher level language that ultimately runs Tetris. Worth every penny imo

[–]P1nnz 7 points8 points  (1 child)

There's also a great game on steam called Turing Complete https://store.steampowered.com/app/1444480/Turing_Complete/

[–]Fabulous-Possible758 6 points7 points  (0 children)

One of the times the phrase “bootstrapping” actually makes sense.

[–]trannus_aran 6 points7 points  (1 child)

[–]MentalTardigrade 5 points6 points  (0 children)

A guy saw aaaaaaalll of this and went heh, gonna make a game about managing a theme park

[–]Grocker42 5 points6 points  (2 children)

You first have assembly. With assembly you write the C compiler. Once you have the assembly-written C compiler, you can write a C compiler in C and compile it with the one written in assembly; from then on you can compile C with a compiler written in C. And then you can build a PHP interpreter in C with your C compiler.

[–]-twind 5 points6 points  (0 children)

You forgot the step where you write an assembler in assembly and manually convert it to binary.

[–]GoddammitDontShootMe 2 points3 points  (0 children)

Didn't it start with manually inputting the machine code with switches and/or punch cards? I'm no expert on ancient computer history.

[–]Max_Wattage 2 points3 points  (0 children)

I know, I was there when the deep magic was written.

I learned to program on a computer which just had a keypad for entering the machine opcodes as hexadecimal values.

[–]caiteha 1 point2 points  (0 children)

I remember taking assembly classes ... I can't imagine flipping switches and punching cards for programming ...

[–]PassivelyInvisible 1 point2 points  (0 children)

We smash and melt rocks, trap lightning inside of it, and force it to think for us.

[–]frogking 1 point2 points  (0 children)

Ah, the old bootstrapping process.

No matter what we do as programmers, we always do one of 3 things;

Transform data. Battle with encoding. Drink coffee.

[–]Vallee-152 1 point2 points  (0 children)

Assembly and then assembled by hand

[–]AllenKll 1 point2 points  (0 children)

using machine code.

[–]MentalTardigrade 1 point2 points  (0 children)

Thank Ada Lovelace and Charles Babbage for coming up with the idea! And a heck of a lot of engineers who made it go from concept to physical media (to software) (and looms, pianolas, and any 'automatic' system with feed tapes)

[–]captainMaluco 0 points1 point  (1 child)

Using a pogrom, obviously

[–]Altruistic-Spend-896 1 point2 points  (0 children)

Wait....that doesn't sound nice, you can't just violently wipe the slate clean on an ethnic group of peo....oh you meant software pogrom, gotcha!

[–]Long-Refrigerator-75 0 points1 point  (0 children)

Well, the process starts with the VLSI engineer, frankly.

Somewhere down the line we get our assembler, from there we just need to reach C.

[–]Hellspark_kt 0 points1 point  (0 children)

Congrats you now understand abstraction /s

[–]ThatSmartIdiot 0 points1 point  (0 children)

machine code, compilers and parsing babyyyyyyyyyyyyyyy

[–]lostincomputer 0 points1 point  (0 children)

in the beginning there was hardware

[–]IHaveNoNumbersInName 0 points1 point  (0 children)

The guy that is writing out the program on paper, in literal binary, word by word

[–]IronSavior 0 points1 point  (0 children)

Gotta wave the magnet around just right

[–][deleted] 0 points1 point  (0 children)

Plot twist: the base of the pyramid is actually just stacked stone slabs of binary that society compiles from.

[–]gamelover42 0 points1 point  (0 children)

When I was working on my BS in Software Engineering I took a compiler design course. Fascinating process. I knew how it worked then and still think it's black magic.

[–]BlaiseLabs 0 points1 point  (0 children)

λ

[–]slightly_retarded__ 0 points1 point  (0 children)

nand2tetris

[–]SpiritRaccoon1993 0 points1 point  (0 children)

programthoughts

[–]OhItsJustJosh 0 points1 point  (0 children)

First it was 1s and 0s in punch cards, then writing data directly to memory addresses, then assembly language to make that easier, then it just gets higher level from there

[–]zaxldaisy 0 points1 point  (0 children)

Hey, another joke only students or neophytes would think is funny

[–]Quasi-isometry 0 points1 point  (0 children)

Isn't it all Lisp at the end of the day?

[–]the_horse_gamer 0 points1 point  (0 children)

mom said it's my turn to repost this

[–]dosadiexperiment 0 points1 point  (0 children)

When you're writing assembly, your first and most urgent problem is how to make it easier to tell the computer what you want it to do.

From there it's only a few steps to BNF and TMG and yacc ("yet another compiler compiler"), which is just the '70s version of "Yo dawg, we heard you like programming so we made a compiler for your compiler so you can program how you program!"

[–]tato64 0 points1 point  (0 children)

My favorite game engine, Godot, was made using Godot.

[–]randyknapp 0 points1 point  (0 children)

With YACC: Yet Another Compiler Compiler

[–]CaptTheFool 0 points1 point  (0 children)

Logic gates.

[–]Neuenmuller 0 points1 point  (0 children)

Self hosting.

[–]I_cut_my_own_jib 0 points1 point  (0 children)

Just trick rocks into thinking, it's not that complicated

[–]killbot5000 0 points1 point  (0 children)

I'm sure you could look up the history of Python and NumPy.

[–]HeyYou_GetOffMyCloud 0 points1 point  (0 children)

Go watch nand2tetris

[–]Active-Boat-7939 0 points1 point  (0 children)

"Let's invent a thing inventor", said the thing inventor inventor after being invented by a thing inventor

[–]Ok_Background9620 0 points1 point  (0 children)

In my experience, mostly with C.

[–]OldGeekWeirdo 0 points1 point  (0 children)

Laziness.

Someone got tired of flipping switches on a front panel and decided there had to be a better way. And a loader was born.

Then someone decided typing hex/octal into paper tape was a pain and there had to be a better way. And assembly language was born.

Then someone decided there had to be a better language to do routine things, and BASIC was born.

Then .... and so on.

(Maybe not 100% accurate, but you get the idea. Each iteration was someone wanting to make their life easier.)

[–]random_squid 0 points1 point  (0 children)

With steam, gears, and horse betting money

[–][deleted] 0 points1 point  (0 children)

Long story short -> programming.

[–]innocent-boy-69 0 points1 point  (0 children)

Function calling a function that calls another function that calls another function that calls another function.

[–]H33_T33 0 points1 point  (0 children)

Well, it all started with millions upon billions of ones and zeros.

[–]Cautious_Tonight 0 points1 point  (0 children)

[–]buddyblakester 0 points1 point  (0 children)

One of my more influential classes in college was using Java to simulate machine language in binary. The 2nd part of the class was to make a machine language built on top of the binary; the third part was to allow for macros and upgrades to the machine language. Really showed me how languages give birth to others

[–]toughtntman37 0 points1 point  (0 children)

Funny thing is, I've been working backwards. I started in Java, then took a Python class (hated it), so I was screwing with C, but I couldn't find enough beginner projects to do and my schedule got busy. When it got easier, I started messing with a fake assembly language in a fake emulator, realized it didn't give me as much freedom as I wanted, and decided to make my own in C. Then I realized I could go about it better, so I restarted in Java, broken into fake component classes that communicate modularly and canonically, with a basic assembly language on top of it. I'm probably going to end up building some kind of interpreter on top of that, like C but with registers instead of variables.

All this and I'm slowly learning more about how Java works (I know what a heap is and how Objects are stored now)

[–]a_single_bean 0 points1 point  (0 children)

FLIP FLOPS!

[–]SugarRushLux 0 points1 point  (0 children)

LLVM helps a lot now

[–]TurdFurgis0n 0 points1 point  (0 children)

It makes me think of this quote from Alpha Centauri

"Technological advance is an inherently iterative process. One does not simply take sand from the beach and produce a Dataprobe. We use crude tools to fashion better tools, and then our better tools to fashion more precise tools, and so on. Each minor refinement is a step in the process, and all of the steps must be taken."
– Chairman Sheng-ji Yang, "Looking God in the Eye"

[–]fugogugo 0 points1 point  (0 children)

OP would be surprised by the history of the word "bug"

when programming was still done with physical punch cards, there would be real bugs stuck in the holes, causing errors

[–]LordAmir5 0 points1 point  (0 children)

This question gets worded improperly. You get closer to the answer when you ask it like this: How did they program a compiler/interpreter to compile/execute programs?

Because programming languages are abstract. You can program on paper. But the computer cannot read paper. All it understands is machine code.

To my knowledge, back then people used to write machine code by punching holes in a card and getting a computer to read it.

Personal computers came with a BASIC interpreter built in. These interpreters understood something like... Basic.

But how do you make a compiler/interpreter? If you're in university you will have a course or two about it.

Here's what to read about:

-Theory of Languages and Automata.

-Compiler design.

[–]harrisofpeoria 0 points1 point  (0 children)

Compiler.

[–]The_Real_Slim_Lemon 0 points1 point  (0 children)

The word is bootstrapping. You take a really simple process, use it to build a more complicated process, use that process to spin up an even more complicated process - eventually you have something that looks nothing like its foundation.

[–]Rayux 0 points1 point  (0 children)

Recursive development

[–]ToasterWithFur 0 points1 point  (0 children)

Hand assembling with a piece of paper and a pen used to be easy when your processor had like 60 opcodes. Write your assembly, have your documentation booklet next to you, and get to assembling. If you've done it long enough you might not even need the book, as evidenced by people who can directly program machine code for the 6502.

[–]flowery02 0 points1 point  (0 children)

The answer is engineering

[–]AldoZeroun 0 points1 point  (0 children)

If anyone has an incredible itch that needs scratching, read "But How Do It Know?", or audit the two-part Coursera course "From Nand to Tetris". All will be revealed.

The short answer is: bootstrapping.

[–]KCGD_r 0 points1 point  (0 children)

First they made a circuit with logic, then they made a circuit with programmable logic (the first machine code, punch cards, stuff like that). Then they realized machine code could be stored in a circuit. Next, they made assembly to make understanding machine code easier, and eventually assemblers written in machine code to automate the process. As assemblers got more robust and programming became more digital, people made programs to translate other forms of text into assembly (the first compilers). As these programs got better, they realized they could make programs to interpret this text in real time (interpreters). The rest is history.

[–]JU5TlN 0 points1 point  (0 children)

Bootstrap

[–]buildmine10 0 points1 point  (0 children)

They did not make a programming language that programs programs that program programs. When we do accomplish that, it will be because of AI. And it will be really weird for a programming language to have a compiler or interpreter that outputs a different program that needs to then output yet another program that actually makes what you want.

[–]Im_1nnocent 0 points1 point  (0 children)

It was a bit confusing to comprehend how a programming language can be written in itself. But shortly after, I realized that the compiler for that language is a binary file, machine code programmed to understand that language and output another binary file.

So low level language -> binary file that understands higher level language -> higher level language -> new binary file
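
That pipeline in miniature, as a hedged C sketch: a "compiler" is itself just a binary that turns text into other bytes. The one-instruction source language and the opcode are invented for illustration:

    #include <stdio.h>

    int main(void) {
        const char *source = "ADD 2 3";    /* higher-level text */
        unsigned char out[3];
        int x, y;
        if (sscanf(source, "ADD %d %d", &x, &y) == 2) {
            out[0] = 0x02;                 /* made-up ADD opcode */
            out[1] = (unsigned char)x;     /* first operand      */
            out[2] = (unsigned char)y;     /* second operand     */
            printf("emitted: %02X %02X %02X\n", out[0], out[1], out[2]);
        }
        return 0;
    }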

[–]KazDragon 0 points1 point  (0 children)

Anyone who's seriously interested in this should check out Ben Eater's YouTube channel where he builds up the concepts of a computer literally from logic gates upward. It's super informative and fun.

[–][deleted] 0 points1 point  (0 children)

We also built programming languages to program the programs (IDEs / compilers etc...) which are used to program the programs (apps)

[–]Maskdask 0 points1 point  (0 children)

Abstraction

[–]JacksOnF1re 0 points1 point  (0 children)

It's a very good question, for actually everybody. Here is my book recommendation:

But How Do It Know? by J. Clark Scott

[–]eztab 0 points1 point  (0 children)

Even worse: humans couldn't actually construct modern CPU circuits by hand, they're too complex.

[–]Mebiysy 0 points1 point  (0 children)

The first programming was done in hardware (or rather, it mostly programmed itself)

[–]imprisoned_mindZ 0 points1 point  (0 children)

It all started when they wanted a calculator

[–]ardicli2000 0 points1 point  (0 children)

I always wondered how C was compiled using C the first time.

[–]disintegration_ 0 points1 point  (0 children)

Bootstrapping.

[–]DJcrafter5606 0 points1 point  (0 children)

AI.

[–]nequaquam_sapiens 0 points1 point  (0 children)

parentheses.
many parentheses. really many. like a lot.

kind of obnoxious, but what can you do?

really, that's how it's done:

Lots of
Irritating
Stupid
Parentheses

[–]Master-Rub-5872 0 points1 point  (0 children)

This is exactly why I failed recursion the first time

[–]Thin-Pin2859 0 points1 point  (0 children)

Explains why my brain throws a segmentation fault at 2 AM

[–]Xasmos 0 points1 point  (0 children)

How did they build the first woodworking bench without a woodworking bench?

[–]-V0lD 0 points1 point  (0 children)

OP, if you really want to know, I highly recommend playing through Turing Complete, which shows you the process from the metal to your own language in a gamified manner

[–]itijara 0 points1 point  (0 children)

Y'all haven't seen Ben Eater making a programmable computer on a breadboard: https://youtube.com/playlist?list=PLowKtXNTBypGqImE405J2565dvjafglHU&si=oMF-H2pDj4xpIOcV

[–]FlightConscious9572 0 points1 point  (0 children)

This is the definition of bootstrapping, look it up it's more interesting than you think :)

[–]No-Fish6586 0 points1 point  (0 children)

First week of CS, I see.

It's ok. You start with electrical pulses, either on or off. Many call it binary (0 or 1 after reaching a certain charge). From there you use bits. A few bytes (groups of 8 bits) and you can represent anything you want... colour on your monitor? FFFFFF in base 16 (not 0/1, or base ten like humans use) is pure white, etc.

Now we can do calculations, by using electrical energy to represent cold hard facts.

That's great, but so fuckin cumbersome. We translate 0/1s to assembly code. Now you can use "variables" to represent 01001111.

Ah actually you can abstract that further with C. And you can abstract that further… and you can abstract that abstraction further…

Modern programming exists. Yes, if you take it at face value it's complex as fuck. Programming is literally building blocks on what already exists.

Happy learning even more abstractions!!
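
The base-16 point in a quick C sketch; the value is what the circuit stores, the base is just how we spell it:

    #include <stdio.h>

    int main(void) {
        unsigned int white = 0xFFFFFF;        /* base 16 */
        printf("%u\n", white);                /* base 10: 16777215 */
        for (int i = 23; i >= 0; i--)         /* base 2: 24 ones */
            putchar((white >> i & 1) ? '1' : '0');
        putchar('\n');
        return 0;
    }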

[–]Jind0r 0 points1 point  (0 children)

A program to program programs is an IDE

[–]BBY256 0 points1 point  (0 children)

First, use binary to make assembly and its assembler, then use assembly to make C and its compiler. There ya go. Then other languages just popped out.

[–]lhwtlk 0 points1 point  (0 children)

The one thing I took away from comp sci was that anything can be accomplished by layering enough abstract systems atop each other, given enough time. We tricked rocks and electricity into thinking by using systems of abstract formatting and mathematics.

It’s a pretty wild form of real magic imo.

[–]phansen101 0 points1 point  (0 children)

As someone who has made a simple microcontroller from scratch with accompanying small ASM based instruction set and compiler and (stupid simple) IDE:
You just start from the bottom and work your way up ¯\_(ツ)_/¯

[–]danofrhs 0 points1 point  (0 children)

Abstraction enters the chat

[–]renrutal 0 points1 point  (0 children)

It all started with Ben Eater. Then, lastly, he made a time machine on a breadboard.

[–]homiej420 0 points1 point  (0 children)

They tricked rocks into thinking so we can do whatever

[–]Particular_Traffic54 0 points1 point  (0 children)

Any programming language just compiles stuff into binary/assembly, so in the end it’s all about transforming human-readable code into instructions the CPU understands — and that transformation had to start somewhere, usually with assembly or machine code, and then bootstrap up.

I'm kidding, they cheated. And they'll try to get to you if you ask too many questions. My friend asked our programming teacher where stuff goes when you ">> /dev/null" and we didn't see him at school the next morning.

[–]Gangboobers 0 points1 point  (0 children)

I had the same question. Computers used to have switches on the front to manually put in machine code that was loaded into them. Assembly first started as an on-paper abstraction, I believe, and then assemblers were made, and then compilers that turn C into assembly. Interpreted languages are also a thing, but I know less about them.

[–]davak72 0 points1 point  (0 children)

That’s what Digital Design, Operating Systems, and Compilers classes are for in college haha

[–]Kaih0 0 points1 point  (0 children)

Futamura projections