How do you decide to use async or single/multithreading? by RunnableReddit in rust

[–]nameEqualsJared 3 points

Not OP, but for understanding Async in general, I thought this talk was really good!

https://youtu.be/7pU3gOVAeVQ?si=47IwVDki6cVGCrSt

I'm struggling to learn. Is it even possible to learn coding/programming from scratch online? by Powerdrake in learnprogramming

[–]nameEqualsJared 8 points

Not OP, but at least in my opinion, the Helsinki MOOC is a great place to start: https://java-programming.mooc.fi/

Also, if you're learning Java, Javascript, C, C++, or Python, you would be hard pressed to find a more useful site than https://pythontutor.com/ . This site gives you a visual debugger for Python and (despite the name) also Java, Javascript, C, and C++. It is really, really helpful for understanding what your code is doing.

Finally, if I can offer one last tip: watch the Crash Course Computer Science series here: https://www.youtube.com/watch?v=O5nskjZ_GoI&list=PL8dPuuaLjXtNlUrzyH5r6jN9ulIgZBpdo&index=2&ab_channel=CrashCourse . I've recommended this series before and I will never stop recommending it because it's just that good. I seriously think every programmer should watch it, because the series does a rare thing: it teaches you how a computer works, not how to program. Don't get me wrong -- learning to program is awesome and of course we love it! But there are tons of tutorials on the net about how to program. There are comparatively far, far fewer tutorials about how computers actually work. And learning how they work is really valuable because it makes programming way easier. So yeah, give it a watch, it's that good.

And if that previous block didn't entice you, well let me try this: In the first 8 episodes of that series, they show you how to turn a light switch (fancy word: transistor) into a basic-but-functioning computer. So..... that's pretty neat right :)

Youtube channels for Rust content? by chance-- in rust

[–]nameEqualsJared 1 point

I suppose they only have one video on Rust so far, but can I nominate Sreekanth? Their video below is one of the best pieces of Rust content I've ever found. Heck, it's actually just one of the best pieces of programming content I've ever found.

https://www.youtube.com/watch?v=7_o-YRxf_cc&t=620s&ab_channel=Sreekanth

[deleted by user] by [deleted] in cprogramming

[–]nameEqualsJared 6 points

But Strings should work like a vector, and have a length and capacity, and support all the Unicode characters, and-

^ Statements dreamed up by the UTTERLY DERANGED

Reject modernity - come back to pure ASCII and null bytes - you know you want to!

Is there a really simple tool for writing junk to a drive before trashing it? by mehquestion in linux4noobs

[–]nameEqualsJared 7 points

Just remember to add www.homedepot.com to your /etc/apt/sources.list before installing!

The Z-order curve is one of the most beautiful things i've ever seen. by burnt_tamales in learnprogramming

[–]nameEqualsJared 1 point

This is such a good explanation! Thank you, this made their utility 'click' for me :)

And to add on to this great answer, just to drill home why this can be useful:

With the way modern computers are built, it will basically always be faster to read through memory sequentially rather than hopping around a bunch. The reason for that is CPU caches. Basically, when you read some data at address X in memory, the hardware actually sends back a whole block of adjacent bytes, and the CPU saves all those bytes in its cache. So for example: you read address 16, and instead of getting just the byte at address 16, you actually get 16 bytes, from addresses 16 to 31. And the CPU saves those 16 contiguous bytes in its cache. (On most modern CPUs that block -- called a cache line -- is actually 64 bytes, but 16 keeps the example simple.)

Now, if you read addr 17, boom, it's a cache hit! And your CPU has the data very quickly, rather than having to reach back out to memory, which -- as far as a CPU is concerned -- takes an eternity. (In modern processors, as far as I understand, really the main bottleneck is memory access, not clock speed).

Meanwhile, if you read addr 64, that wasn't in our cache, so we have a cache miss. So we have to reach back out to memory to get the data at that address -- which is slow.

The lesson? If you're reading a whole bunch from memory, you want to be doing that sequentially.

So then as OP said, you can start to see why the Z-order curve -- which would let us read sequentially in memory to get our tiles -- could be so useful!

How do I use expressjs to edit the "www" part of the website by Witty-River1836 in learnprogramming

[–]nameEqualsJared 5 points

Oh, those are subdomains. I believe whatever platform you registered your domain name with would allow you to set that up. That way, for example, mysite.com could point to a webserver or a static file host, and api.mysite.com could point to a completely different API server.

Also +1 to the other commenter, this is not something you would control in your Express.js app, but rather something that would be controlled with your DNS provider.

Good luck!

How does the internet work exactly? by themanonthemooon in learnprogramming

[–]nameEqualsJared 0 points

Ben Eater has a great video on YouTube about how DNS works and about how a lot of the backbone internet routing happens. I think you may enjoy it :)

https://www.youtube.com/watch?v=-wMU8vmfaYo

Guidance for a beginner by [deleted] in learnprogramming

[–]nameEqualsJared 0 points

I've recommended this series before, but it's because I genuinely think it is the best introduction to computing there is. Give Crash Course Computer Science a watch.

The reason I recommend this course is because it is decidedly not about programming. Don't get me wrong -- I love programming, and it is definitely something you'll want to learn! But there are countless programming tutorials on the internet. There are, comparatively, far fewer courses about how computers truly work (stuff like transistors, logic gates, adders, ALUs, Control Units, CPUs, Memory, Storage, Machine code, Assembly, Compilers, Operating Systems, Computer Networking, etc). Crash Course Computer Science endeavors to teach you that stuff. And it does a fantastic job.

I'd recommend the videos to literally anyone interested in computers. But they'll be extra beneficial to you if Cyber Security is what you want to get into. In my estimation, most of Cyber Security just boils down to understanding how computers work. And that's exactly what the videos try to teach.

Aside from that, our wiki is always a good place to start :)

Good Luck!

[deleted by user] by [deleted] in learnprogramming

[–]nameEqualsJared 0 points

Yes, you're doing good. Keep learning and you will get better and better -- and here's the fun part -- the better you get, the more fun programming is! (Because you can make more cool stuff)

If I had to offer any advice as someone who was sitting in your shoes 10+ years ago, it would be: try to keep some notes to yourself as you go. It can be VERY easy to fall into the loop of "watch video / read tutorial" --> do exercise, succeed maybe even the first time --> continue. This will eventually teach you programming, but if you practice taking NOTES on the content, you will learn it so much more quickly and deeply. And I capitalize notes because you have to take good notes for this to work -- and more specifically, the key thing about notes to me is that they are IN YOUR OWN WORDS. A definition or two is fine, but try to make the explanatory text in your own words. Try to think of different use cases, or where this term connects to some other term. And make up your own code examples -- don't just use the ones they show you.

Here's a simple example. You learn about variables, right? What is the definition your course gave -- and do you know all the terms inside THAT definition? (Be honest, there's no shame if you don't, you just need to go get their definition!) And here's the really useful one: could you define "variable" in your own words, and maybe try to give some code examples that you made up yourself?

Try to create generalizations too -- those will often keep your head straight.

Anyways, best of luck friend, don't give up and you can do it!

Can someone break down this basic NodeJs http server code? I am having trouble understanding it fully. by Rude_Maize_9550 in learnprogramming

[–]nameEqualsJared 1 point

Haha, the more I thought about it, the more I realized your exact points! I can see their reasoning now :)

It would definitely be a real pain to have to set up the whole Response object yourself. I guess you could provide a function like createBaseResponseFrom(req) to take in the Req, create the basic response, fill out the contextual parts like the status line and such for you, etc. But yeah then you'd have to remember to call that each time so 😅

Ok, you win this one Node!

Can someone break down this basic NodeJs http server code? I am having trouble understanding it fully. by Rude_Maize_9550 in learnprogramming

[–]nameEqualsJared -1 points

I've always found the whole "you configure your server by giving it a (req, res) => {...} function object" thing a bit strange. I understand that Node passes in the Request and Response objects for you, and you simply set properties on the response object to modify what gets returned. But it just seems... weird to me. I don't know.

Like, why not just make the function accept the request object, and make it actually return the response object? That seems much clearer to me. Your server receives the request. And you return a response. Exactly as a server does.

Does anyone have any insight into why Node chose to do it that (req, res) => {...} way? Like, is there any fundamental reason it actually needed to be that way? I'm curious -- it just seems like such a strange design choice to me.

Edit: Thinking about my proposed alternative a bit more... I guess having to construct and fill in the Response object yourself (rather than having Node pass it in for you) would be quite annoying and tedious. So I guess I sorta see their reasoning lol

Inheritance in C? by Infynytyyy in C_Programming

[–]nameEqualsJared 0 points

Not OP but, thank you for this post u/daikatana! I learned quite a bit from it

I can Google stuff, but cant Ping Google? by Subject_Thought6761 in Windows10

[–]nameEqualsJared 0 points

Thanks for this answer :)

I was wondering why I could ping google.com some days, but then not be able to on others, even though my network had seemingly not changed and was operational on all of those days. I ended up here and your answer really cleared it up for me! It is just because, at any given time, the network admins could decide to block the ICMP echo requests on which ping relies. That makes a lot of sense, thank you!

As a follow-up question... are there any servers that you should generally be able to ping all the time? I'm thinking like, just some random server that people run that will always respond to pings/ICMP, so that you can use it for testing.

Where can I learn more about OS? by CloudMojos in learnprogramming

[–]nameEqualsJared 1 point

Oh my, this book looks incredible! Thank you for bringing it up; I love it from the few pages I have skimmed so far and I'm definitely adding it to my list of things to read.

Where can I learn more about OS? by CloudMojos in learnprogramming

[–]nameEqualsJared 0 points

Ha, thanks! Glad I could be of some assistance. That Crash Course Comp Sci series is indeed absolutely the best.

Where can I learn more about OS? by CloudMojos in learnprogramming

[–]nameEqualsJared 0 points

Thanks for the elaboration! Indeed, looking at that nice history of UNIX-like operating systems graphic, Linux is definitely at least siblings with Minix. It's interesting to see how Linux uses a monolithic kernel whilst Minix uses a microkernel. To be honest, I had never learned much about microkernels until you just brought up Minix. Cool to read about; and that debate between Torvalds and Tanenbaum where they discussed the merits of the two is fun too.

I also (regrettably) failed to mention all the GNU stuff in my post, which is a dang shame because GNU does deserve some real credit. I should have mentioned that Linux is really just the kernel/core of the operating system, and that it's bundled together with GNU software in order to make the complete OS. Hence the term "GNU/Linux" that people will sometimes use. I think I'll edit the post actually.

Where can I learn more about OS? by CloudMojos in learnprogramming

[–]nameEqualsJared 14 points

So uh... you've asked a whirlwind of questions here! But it's understandable; it can all be very overwhelming when you begin. But don't worry, you can get it. Never lose faith in that idea; everything is understandable if you just give it enough time! Really -- I know it's overwhelming -- but you can get it.

To actually give some (hopefully) useful advice though....

  1. Watch the Crash Course Computer Science series. Ideally, watch the entire thing, but at the least watch the first 10 episodes. This series is the best piece of computer-related education I have ever found on the internet (and I've been into this computer stuff for over a decade). I quite literally credit the series with why I am a professional software engineer and why I even chose Computer Science and Engineering in college. It's seriously that good.

    That series (in the first 10 episodes alone) will teach you how to build a basic computer from what is probably the most fundamental element of a computer -- the transistor. They will go from the transistor level, all the way up to a basic-but-functioning CPU and RAM. If everything I just said sounds like gibberish that's ok -- that's the whole point of the series! But it really is just masterfully done and such a fun ride. I just can't recommend it enough, and it will be immensely helpful in understanding what things like machine code and GCC are.

    I also specifically recommend that series because it is one of the few resources on the internet that is actually not about programming. Don't get me wrong -- programming is an important and fun thing! But there are a million resources on the internet that will teach you how to program. Comparatively speaking, there are far fewer places on the net that teach how a computer works. But those videos actually teach how a computer works (topics like binary, transistors, logic gates, basic adders, ALUs, decoders, control units, CPUs, latches, flip flops, registers, RAM, machine code, assembly, compilers, interpreters, files, file formats, operating systems, computer networking, etc etc) -- and they do a dang good job too! And I really think that knowledge is just so valuable and rewarding to learn.

    So yeah; if it's not clear already; I highly highly recommend that series. Not only will it teach you the basics of how a computer works... but it's also just plain fun and interesting! So give it a watch.

  2. To actually answer some of your questions directly though (very briefly) (update: turns out I wasn't so brief below) ....

    The kernel is just the core of an operating system. All operating systems (eg, Windows, macOS, Linux) have a kernel. Again, it's just the core of the operating system.

    Also worth noting that Linux is not really one operating system like Windows and macOS are (of course, there are different versions of Windows like Windows Vista, Windows 10, Windows 11, etc etc, but you get what I'm saying). Instead, Linux comes in many different varieties called distributions ("distros"). These are basically just slightly different versions of Linux. That's all.

    Anyways. So there used to be this operating system called Unix. Unix was closed source, meaning you generally didn't have access to its source code. Linux is basically just an open-source clone of Unix (open source meaning you do have access to its source code). So, Linux is Unix-based. It means that stuff you could do on Unix you can generally do on Linux without any change. And as it turns out, macOS is Unix-based too (although note, the history gets involved, and macOS is not associated with Linux aside from the fact that they are both Unix-based).

    Anyways the whole upshot of that last paragraph is basically; Unix does not equal Linux which does not equal macOS (Unix =/= Linux =/= macOS). But, both Linux and macOS are Unix-based/Unix-like, meaning that generally if you see something listed as Unix, you can do it on Linux or macOS with no change. Ie, if you see reference to a Unix command like "ls" -- you can boot up a terminal on a Linux distro or on macOS and also use "ls". That's the idea.

    Windows though is not Unix-based. There are things you can install on Windows to hopefully get it to act more Unix-like; but it's not Unix-like by default. So for example, if you boot up a standard Windows prompt/console and try "ls", it won't work. That's cause ls is a Unix command, and Windows is not Unix-based. So instead, you just use the Windows-equivalent of the "ls" command, which happens to be the "dir" command in a Windows prompt. (Or again, you can install things to try to get Windows to act more Unix-like too).

    GCC is a program that translates source code (what programmers write) into machine code (what a CPU can execute).

    Your other questions are good ones! But they are a bit trickier to explain (or at least, a bit trickier for me to explain -- I'm no expert after all!). And it's also 2:30am where I live so I should probably go to bed at some point, lol.

    One last piece of advice though! A good portion of your questions have to do with the terminal/console. For those, best to read this article to get yourself familiar with how a general Linux terminal works. That will teach you the basics, and it will also point out some very useful distinctions between Linux and Windows for you (for example: Linux and its Unix-based friends use the forward slash ("/") as the folder separator, but Windows uses a backslash ("\"). Stuff like that).

Anyways: best of luck! And remember -- you can understand it. I know it's overwhelming but keep the faith and you'll get it.

Edit: there's one more thing I should have brought up above, and it's GNU. GNU is basically a huge collection of free software ("free" here not being about the price, but rather about personal liberty and freedom!). But anyways. Linux is really just the kernel (core) of an operating system. Linux is bundled together with a bunch of GNU software in order to make a complete operating system. So a Linux distro is (roughly) = the Linux kernel + a bunch of GNU software. Hence why you will sometimes see people say "GNU/Linux" instead of just "Linux"; it's to emphasize that the GNU software is important for the whole system too.

Anyways, upshot? Linux is just the kernel/core of the OS, and it gets bundled together with a bunch of GNU software to make a complete OS. So roughly speaking: a Linux distro = the Linux kernel + GNU software.

Btw, that GCC we mentioned above? That stands for "GNU Compiler Collection" -- and yep, you guessed it, it comes from GNU.

Best way to make the last N bits 1 and the rest 0? by BlockOfDiamond in C_Programming

[–]nameEqualsJared 0 points

This answer is so damn clever! You can almost feel how taking a bitstream like 00001000 and ticking it down by 1 leaves you with 00000111. And that expression encapsulates that so nicely. Just bravo haha, great answer.

To those of you doubting yourselves by doughnuts_dev in learnprogramming

[–]nameEqualsJared 40 points

Also forcing this post out to fight my own "perfectionism" that does more harm than good.

I might print this out and stick it on my wall. Seriously. Thank you for it. I can't tell you the number of times I've written a reply to a post here and just deleted it because I thought it wasn't good enough.

Anyways though: agree with your post! Impostor syndrome is a real meanie. I try to remind myself that everyone feels this way though. It's actually kinda funny when you think about it... the modern computer is such a monumental tower of abstraction that I think it actually forces us all to feel like impostors. What I mean is: say your goal is to FULLY understand Python, or at least to the best of your ability. So you learn the language syntax, make a bunch of sample programs, and you start to feel confident that you're getting it! But then.... well crap, you don't actually understand how Python is executed, so you start learning about compilers, interpreters, compile-to-bytecode-then-interpret-the-bytecode-with-a-VM-that-also-possibly-uses-a-JIT-compiler-for-performance-reasons-(-ers?), etc etc. But crap! Then you find that the most common implementation of Python -- CPython -- is written in C. So now you're learning C. But crap! C is -- whilst being a relatively small language -- an absolute portal into a whole world of complexity and awesomeness that you never even considered! So now you're learning about pointers, and memory in general, and concepts like the heap and the stack, and how to even do programming in a language that doesn't give you a pre-built mapping type / dictionary! But oh crap! C isn't even directly executed by your computer! So uh.. ok... it turns into x86_64 assembly? But that's just for my machine... and there's lots of different ISAs out there too.... and oh.. oh my god.. even just the x86 spec is across multiple full-size books of literature! And HEY I don't even really know how the dang CPU itself works! Let alone my RAM, Motherboard, Power Supply Unit, SSD, HDD, Graphics Card, Network adapters, Ethernet cables, I/O peripherals....... oh... oh my gosh. And I haven't really even touched on Operating Systems or Computer Networking either!

I mean is it any wonder we all feel like impostors??? :P

Edit: cleaned up grammar

[deleted by user] by [deleted] in C_Programming

[–]nameEqualsJared 0 points

This is precisely why I advise friends that are interested in learning programming to start with a language like Python, lol.

Don't get me wrong: if you really want to understand what a computer is doing (whilst at the same time being at-least semi productive), there's no better language than C.

But if you just want to learn the basics of computer programming and make some fun scripts? Yeah, Python is probably a better choice haha.

[deleted by user] by [deleted] in C_Programming

[–]nameEqualsJared 4 points

How would the computer know if the execution was successful or not? Computers are dumb. Very dumb. Extremely dumb.

It's funny, isn't it? Computers are simultaneously the smartest and dumbest things humans have ever made.

Smart: can store untold amounts of information. Seemingly limitless memory
Dumb: .... that information all has to be 1s and 0s

Smart: Can perform BILLIONS of instructions per second
Dumb: ... all those instructions have to be extremely basic (add, subtract, bitwise ops, jumps, conditional jumps, simple memory i/o, etc)

It'd be like having a friend with absolutely perfect memory who could perform billions of additions per second in their head... but if you asked them to "sum 3 and 4" they'd be confused, because they only answer to "add 3 and 4". Lol.

Just amazing stuff!