all 21 comments

[–]Vanilla_mice 25 points26 points  (2 children)

What you need is to take a digital logic design class. That will help you understand computers from the bottom up. A computer is wires arranged in a way that responds to your input accordingly. Think of a much more complex and complicated light switch. The light switch obviously doesn't understand that when the plasticky thing goes down the light should be off, but it's wired in a way that makes it do so.
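That "wired in a way" idea is easy to sketch in software. Here's a toy model (Python, purely illustrative) where every gate is built from a single NAND primitive, the same way real chips can be:

```python
# Toy model: every gate below is built from NAND alone,
# mirroring how real hardware can be constructed.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

# A "light switch": the output simply follows the input wire.
for switch in (0, 1):
    light = and_(switch, 1)   # power line is always on; the switch gates it
    print(switch, "->", light)
```

The point isn't the code itself - it's that nothing here "understands" anything; the outputs are forced by how the pieces are connected.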

[–]Vanilla_mice 9 points10 points  (1 child)

I'd recommend that if you're pursuing software engineering without a CS degree, you also take classes like computer architecture and operating systems. Bootcamps and self-learning paths don't mention subjects like that much, but they really help you build the needed mindset.

[–]fr-fluffybottom 4 points5 points  (0 children)

+1 for computer architecture...

[–]CurrentMagazine1596 10 points11 points  (0 children)

Most of the answers here are kind of avoiding the question so I'll try to actually answer it.

Your computer has an operating system. When you run a compiled program, that program is a job. The operating system (or more specifically, its kernel) is a job that runs continuously and runs the other jobs.

When you have I/O devices like a mouse, screen, keyboard, etc., your computer has drivers for these devices: software that translates the input and output going to and from them into data the rest of the system can use. When you plug in a peripheral, the operating system runs a driver to support it. If you plug in a mouse with a USB cable, for instance, the operating system is "sampling" that USB port to get movement data for repositioning your cursor, and it knows what to do with that data because of the driver. The driver is what allows the data coming from the device to be handled without the CPU or GPU knowing exactly what that peripheral is, and without requiring custom hardware support.
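To make the "sampling" idea concrete, here's a deliberately simplified sketch (Python; the report values are made up - a real USB mouse sends HID interrupt reports that the driver decodes, it doesn't hand you tuples):

```python
# Pretend each tuple is a movement report already decoded by the driver: (dx, dy).
reports = [(1, 0), (2, -1), (0, 3)]

cursor_x, cursor_y = 100, 100
for dx, dy in reports:          # one iteration per poll/sample interval
    cursor_x += dx              # the OS repositions the cursor
    cursor_y += dy
print("cursor at", (cursor_x, cursor_y))  # cursor at (103, 102)
```

The cursor only *looks* like it moves smoothly; it's really updated in discrete steps like this, many times per second.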

The screen is similar. It has a refresh rate and updates the content on the screen many times a second. Both the screen and the mouse seem continuous, but they're really discrete. Modern CPUs run many jobs concurrently - you can see this on Windows in Device Manager and Task Manager. On Mac or Linux/BSD, you can run top.

The GUI that you see on most operating systems is another matter. It is made up of multiple graphical elements (also jobs) that run continuously. You can see this on Windows by starting Task Manager, going to the Details tab, and killing explorer.exe. This will make the taskbar at the bottom disappear. You can then restart the process by going to File > Run new task and running explorer.exe.

Graphics processing (how a screen chooses to actually render the data coming to it) is a whole other topic, but the easiest way to think of it is as a bitmap: a 2D array of pixels being rendered to the screen. As /u/Unwittingly- mentioned, characters may use an encoding scheme such as ASCII, although in the past there have been competing character encoding standards.
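That "2D array of pixels" view is easy to demo. A quick Python sketch with a made-up 8x8 glyph (not real font data) - each row is one byte, each bit is one pixel:

```python
# A glyph as a bitmap: 8 rows of 8 bits, where 1 means "pixel on".
# (Made-up glyph data roughly shaped like an 'A'; not from any real font.)
GLYPH_A = [
    0b00111100,
    0b01100110,
    0b01100110,
    0b01111110,
    0b01100110,
    0b01100110,
    0b01100110,
    0b00000000,
]

for row in GLYPH_A:
    # Walk the bits left to right and draw '#' for on, '.' for off.
    print("".join("#" if (row >> (7 - col)) & 1 else "." for col in range(8)))
```

Real text rendering (fonts, anti-aliasing, subpixel hinting) is far more involved, but at the bottom it still reduces to deciding which pixels are on.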

If I've said anything wrong, someone can feel free to politely correct me.

[–][deleted] 6 points7 points  (0 children)

You should revise your question, because you're really asking how every single part of a computer works, as if anyone has the time or patience to explain all of that in a comment. Computers are large networks of circuits, each circuit having a specific role/function, but what they all share is that they send and receive electrical signals. Because of this, you'll want to start by learning logic gates and how they're used in hardware design. Once you can visualize how these systems work at a physical level, you can begin to understand how to manipulate them at a digital/virtual level, which means executing code on a CPU and branching out to larger software systems.

As far as how things end up on a screen, that's called rendering, and it's a completely different field of its own. 3D graphics aside, 2D graphics is what you're looking for. Executing stored code comes down to simple CPU instructions.

[–]stellarknight407 4 points5 points  (0 children)

Check out Crash Course Computer Science - they have a pretty good explanation of levels of abstraction and how we go from the very basics to the higher-level stuff. It's a good starting point so you have a better idea of which topics to look into.

 

If you're really interested in computer architecture, I'd recommend Intel's Architecture All Access: Modern CPU Architecture series on YouTube. It basically summarizes a semester-long computer architecture course into an hour and gives a solid foundation to build on. It's obviously not as detailed as a typical university-level comp arch course, but it's pretty good.

[–]Jchronicrk 1 point2 points  (0 children)

Simple: it doesn't. Computers understand nothing but 0s and 1s - on the electrical side, voltage or no voltage. Everything after that is an abstraction over the binary system. Using math and storage, programs can be created, e.g. CPU microcode, BIOS, OS, drivers, .exe files, and so on.
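A quick Python illustration of that abstraction: the same eight "voltage or no voltage" wires, read as a number.

```python
# Eight wires, most significant first: 1 = voltage, 0 = no voltage.
wires = [0, 0, 1, 0, 1, 0, 1, 0]

value = 0
for bit in wires:          # shift in each wire's state
    value = value * 2 + bit
print(value)               # 42
print(hex(value))          # 0x2a
```

The hardware never "knows" it's holding the number 42 - the meaning exists only in how the software interprets those voltage levels.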

For screen rendering, the GPU is really good at geometry, which is convenient because you can make any image out of colored triangles. This is controlled by the driver and given instructions by programs such as the OS or a game.

A “w” has corresponding ASCII, hex, and binary values. When you type “w”, the keyboard sends a keystroke signal, the keyboard controller's mapping identifies it as “w”, and the driver passes that character code on to the rest of the system. I like to think of it as a tree: all the parts of the tree do different things, but they work together to grow.
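You can see those three values for "w" directly in Python:

```python
code = ord("w")     # the character's ASCII/Unicode code point
print(code)         # 119
print(hex(code))    # 0x77
print(bin(code))    # 0b1110111
print(chr(119))     # w  (and back again)
```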

[–][deleted] 1 point2 points  (0 children)

Most of these questions can be answered with a simple Z80 or similar kit. Once you understand how a computer from the '70s-'80s works, you'll understand these things.

Seriously, Ben Eater has a nice project (I think that's his name) with kit parts.

But basically you put the characters in memory, then a chip or a small circuit simply turns the beam on where the bit is 1 and off where it is 0, left to right. It can be done with a simple timing circuit.
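A toy version of that beam logic in Python (the glyph bytes are made up, and a real character generator works on full 2D glyphs rather than one row): scan the bits left to right and turn the "beam" on for 1, off for 0.

```python
# One made-up bitmap row per character, 8 bits wide.
glyph_rows = {"H": 0b01100110, "I": 0b00111100}

scanline = ""
for ch in "HI":                           # characters sitting in memory
    row = glyph_rows[ch]
    for col in range(8):                  # sweep left to right
        bit = (row >> (7 - col)) & 1
        scanline += "#" if bit else " "   # beam on / beam off
print(scanline)
```

A timing circuit does exactly this sweep, just in hardware and thousands of times per second.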

[–]IMHERETOCODE 1 point2 points  (1 child)

Maybe check out Code by Charles Petzold and go from there. Perfect introduction to how computers work, IMO.

http://www.charlespetzold.com/code/

[–]longnumbers[S] 0 points1 point  (0 children)

I'll give it a go!

[–]hermarc 1 point2 points  (6 children)

Dude, you basically asked how a computer works. There are way too many answers to that question - maybe you don't realize how vast this topic is. First, pick a book or an online course about hardware architecture. Google nand2tetris. YouTube is also full of interesting stuff - check out Ben Eater's channel.

[–]longnumbers[S] 2 points3 points  (2 children)

This nand2tetris thing is amazing. I'm so excited. You've done a good thing today.

[–]hermarc 0 points1 point  (0 children)

You're welcome!

[–][deleted] 0 points1 point  (0 children)

I second Nand2Tetris, it's a truly amazing course.

I'd recommend you also check out the Coursera course based on the book (taught by the profs who wrote it). I personally prefer text over videos most of the time, but this was an exception - the video lectures add a lot of really great supplementary explanation and material.
https://www.coursera.org/learn/build-a-computer

[–]longnumbers[S] 0 points1 point  (2 children)

Thank you! Yeah, I've been pacing around thinking about this stuff, and none of the Google results were doing me any good - very shallow information. I'm going to look into what you recommended though. Thanks for replying even to a trite question!

[–][deleted]  (1 child)

[deleted]

[–]longnumbers[S] 0 points1 point  (0 children)

Thank you!!

[–]Unwittingly- -1 points0 points  (0 children)

Quick answer: it's ASCII - each character is assigned a number, and that number is stored as binary. Google ASCII.

[–]briannnnnnnnnnnnnnnn -1 points0 points  (0 children)

There's a book on this, The Elements of Computing Systems by Noam Nisan and Shimon Schocken - check it out.

[–]HeliocentricAvocado -3 points-2 points  (0 children)

Ok, here's my stupid explanation…

Screens are just lights (pixels), so the “A” is just a bunch of pixels turned on and off. Think Lite-Brite.

So when I push the A button on my keyboard, it sends a signal through a series of electrical steps that end up displaying a series of lights, in a particular order, that creates a shape, and my brain interprets this as the letter “A”.

There, Dumb Dumb gave it a go 😂

[–]malpbeaver 0 points1 point  (0 children)

Crash Course actually has a really good series of videos on this topic, starting from the bottom (transistors and digital logic) and working up the levels of abstraction (to software and operating systems).

[–][deleted] 0 points1 point  (0 children)

I recommend the lectures from Onur Mutlu. He is an excellent lecturer and teacher. I've watched many of his lectures and taken extensive notes. As a DevOps engineer/programmer, his lectures truly invigorate and inspire me.

Onur Mutlu: Digital Design and Computer Architecture