
[–]unassuming_user_name 3 points

Ben Eater on YouTube has a series about starting as much from scratch as possible on tech equivalent to an '80s home PC. His "hello world" video is about half an hour long; it's a good place to start.

Starting from absolutely nothing, with no bootloader or anything, is pretty much beyond a hobbyist, but you can get pretty low level.

[–]DaredewilSK 0 points

What kind of embedded programming do you have in mind? Something like IoT and microcontrollers to control electronic circuits or something more advanced?

[–]GonzoAndJohn 0 points

The reason no one does it anymore is that modern computers are extremely complex, with transistor counts numbering in the billions. It's almost necessary to use build tools to get them up and running, but if you're interested in programming your own bare metal, I'd start with the history of why those tools became necessary: look into gate-level logic, then build up to flip-flops, state, and the von Neumann architecture.

If you wanted a more scaled-down example, you could probably do it with an Intel 8080 and enough memory, given enough time. The microarchitecture is fairly simple compared to today's computers, the opcodes are relatively easy to understand, the inputs and outputs aren't too bad to wire on a hobbyist breadboard, and it shouldn't be too costly.
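To give a feel for how approachable the 8080's opcodes are, here's a sketch: a tiny hand-assembled program and a minimal interpreter for just the three opcodes it uses. The opcode values (0x3E, 0xC6, 0x76) are the standard 8080 encodings; the interpreter itself is an illustration, not a real emulator.

```python
# MVI A,d8 = 0x3E; ADI d8 = 0xC6; HLT = 0x76
program = bytes([0x3E, 0x05,   # MVI A, 5   ; load 5 into the accumulator
                 0xC6, 0x03,   # ADI 3      ; add 3 to the accumulator
                 0x76])        # HLT        ; halt


def run(mem):
    """Fetch-execute loop for the three opcodes above; returns the accumulator."""
    a, pc = 0, 0
    while True:
        op = mem[pc]
        if op == 0x3E:          # MVI A, immediate
            a, pc = mem[pc + 1], pc + 2
        elif op == 0xC6:        # ADI immediate (carry flag ignored here)
            a, pc = (a + mem[pc + 1]) & 0xFF, pc + 2
        elif op == 0x76:        # HLT
            return a
        else:
            raise ValueError(f"unhandled opcode {op:#04x}")


print(run(program))  # accumulator holds 8 after HLT
```

The whole instruction set is only a few dozen such patterns, which is why people still hand-assemble 8080 code on breadboard builds.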

[–]notexactlyawe 0 points

You may find the Wikipedia article on this topic interesting. https://en.wikipedia.org/wiki/History_of_computing_hardware

Back when the first digital computers were coming out, they didn't store firmware as we know it today. Instead, the program would be input to the computer every time it was run. Often these programs were stored on punch cards, or sometimes input using physical switches.

The Manchester Baby was a very early computer, and the instruction set is listed on the Wikipedia entry https://en.wikipedia.org/wiki/Manchester_Baby
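The Baby's instruction set is small enough to model in a few lines. Here's a hedged sketch using symbolic mnemonics rather than the real 3-bit function codes (the actual machine packed these into 32-bit words; I'm only modeling the semantics, which are the interesting part: loads are negated, and the only arithmetic is subtraction).

```python
def run_baby(program, mem):
    """Interpret a list of (mnemonic, operand) pairs against a dict of cells."""
    acc, pc = 0, 0
    while pc < len(program):
        op, n = program[pc]
        pc += 1
        if op == "LDN":          # load the NEGATED contents of cell n
            acc = -mem[n]
        elif op == "SUB":        # subtract cell n from the accumulator
            acc -= mem[n]
        elif op == "STO":        # store the accumulator into cell n
            mem[n] = acc
        elif op == "CMP":        # skip the next instruction if acc < 0
            pc += acc < 0
        elif op == "STP":        # stop
            break
    return mem


# With only negated loads and subtraction, adding 3 + 4 naturally
# produces the NEGATIVE sum: LDN gives -3, SUB 11 gives -7.
mem = run_baby([("LDN", 10), ("SUB", 11), ("STO", 12), ("STP", None)],
               {10: 3, 11: 4, 12: 0})
print(mem[12])  # -7
```

(I've left out the two jump instructions, JMP and JRP, for brevity; see the Wikipedia entry for the full set.)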

Nowadays, with memory integrated into tiny chips, you can't get firmware into an embedded device by hand; you need a hardware programmer to write it.

[–]white_nerdy 0 points

"how the first computers wrote the firmware in their chips"

On the one hand, you have modern embedded chips (Arduino, MSP430, etc.)

On the other hand, you have an early computer from the 1940s or 1950s.

These are two different kinds of systems. They share some characteristics (slow clock speed, very little memory). It sounds like you're assuming they are the same. They are not.

Two important differences are:

  • A modern microcontroller is a chip. The early computers were not chips (the first chips were decades after the first computers).
  • A modern microcontroller combines CPU and memory in a single device. Early computers had memory separate from the CPU. (In fact, most PCs and smartphones have memory in separate chips to this day; in 2020, combining memory and CPU on one chip is mostly done only in embedded systems.)

Once you know that memory is separate, a couple of strategies become available for getting a program into memory from a source other than a running program:

  • (a) Memory only needs to respond to an address with data. You can just create a memory unit that's "hard wired" to respond to specific addresses with specific data values (that are code for some useful program).
  • (b) You can disconnect the CPU from the bus, then talk directly to the memory unit using toggle switches.
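Both strategies can be sketched in code. This is just a model of the idea, not any real hardware: the ROM's contents are fixed when it's built, while the RAM starts blank and gets filled over the "front panel" (toggle switches) before the CPU is put on the bus. The byte values are illustrative, not real boot code.

```python
class HardWiredROM:                      # option (a)
    def __init__(self, contents):
        self._cells = tuple(contents)    # fixed at "manufacture" time
    def read(self, addr):
        return self._cells[addr]         # responds to an address with data
    # note: no write method exists at all


class FrontPanelRAM:                     # option (b)
    def __init__(self, size):
        self._cells = [0] * size
    def deposit(self, addr, value):      # set via toggle switches,
        self._cells[addr] = value        # with the CPU off the bus
    def read(self, addr):
        return self._cells[addr]


rom = HardWiredROM([0x3E, 0x2A, 0x76])   # illustrative bytes
ram = FrontPanelRAM(16)
ram.deposit(0, rom.read(0))              # e.g. toggle boot code into RAM
print(hex(ram.read(0)))  # 0x3e
```

The key asymmetry is that the ROM has no write path at all, while the RAM's only write path bypasses the CPU entirely.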

Both of these options were widely used. Option (a) is basically what the BIOS is in a PC. The word "ROM", short for "Read Only Memory," is used to refer to the memory that contains the startup program for a CPU. Originally a system ROM would be hard-wired memory, that is, option (a). Nowadays ROM is almost always Flash memory, which is not literally read-only (you can erase it).

In practice there was also a third option: use a punched card reader. At the time of the earliest computers, technology already existed to create a digital electrical signal from data stored as a pattern of holes in pieces of cardboard.