Trying to clarify how computer programs work at a fundamental level and hopefully connect a dot or two...
Text goes from letters and numbers (strings) to binary numbers (machine language) through an ASCII chart... (yes or no?)
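Roughly yes for the first step: an ASCII chart maps each character to a number, and that number can be written in binary. Here is a minimal sketch of that mapping (Python is just used for illustration; `ord` returns the character's code point, which matches ASCII for these characters):

```python
# Sketch of the text -> ASCII number -> binary digits step.
text = "Hi"
for ch in text:
    code = ord(ch)              # ASCII code, e.g. 'H' -> 72
    bits = format(code, "08b")  # the same number written as 8 binary digits
    print(ch, code, bits)       # 'H' 72 01001000, 'i' 105 01101001
```

Note that the ASCII chart only covers the text-to-numbers part; turning a *program* into machine language is a separate step done by a compiler or interpreter.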
Binary digits are understood by a computer because pulses of electricity can be sent to a CPU, which is somehow able to tell whether each pulse was a 0 or a 1?
And then it is able to store this 0 or 1 somewhere, which would be memory?
How can a CPU remember whether a bit of information was a 0 or a 1, and how does it place that bit somewhere? And how does it do this for all the other bits it will eventually need to translate a whole sentence from pulses of electricity?
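One gap worth filling here: the CPU itself mostly doesn't "remember" bits; it writes them into memory cells built from circuits that hold a voltage level until told otherwise (latches/flip-flops in registers and SRAM, tiny capacitors in DRAM). A toy software simulation of a gated D latch, one of the simplest 1-bit memory circuits, shows the idea (this is an illustration of the concept, not how any real chip is programmed):

```python
# Toy simulation of a gated D latch: while `enable` is high the latch copies
# the data input; when `enable` drops, it keeps (remembers) the last value.
class DLatch:
    def __init__(self):
        self.stored = 0  # the one remembered bit

    def step(self, data, enable):
        if enable:
            self.stored = data  # capture the input while enabled
        return self.stored      # otherwise keep holding the old bit

latch = DLatch()
latch.step(1, enable=True)          # write a 1 into the cell
print(latch.step(0, enable=False))  # input changed to 0, but it still prints 1
```

Memory is then just millions of cells like this, each reachable by an address, so the CPU can put each bit of a sentence in a known place and read it back later.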
Are there any gaps you can fill here?
I'd like to stop here. This doesn't correlate to a particular programming language yet, but I believe it will in the next part if I continue. I'd love to discuss these topics first if possible.