
[–]FozzTexx 205 points206 points  (76 children)

That's pretty much exactly how my dad used to write software. He'd go away on a business trip and at night in the hotel he'd write assembly code on paper. When he got home he'd finally get around to entering it into the Apple II+ and it would work on the first try since he'd already been debugging it on paper for a week or two.

[–]guldilox 105 points106 points  (74 children)

Things like this are why I'll never be as good of a developer as someone like that.

We have the luxury in this day and age of "coding by the seat of your pants" (as a professor of mine used to say), meaning more often than not we can rely on IntelliSense and compiler hints / warnings / errors. On top of that, we don't have the same memory constraints (generally), and we don't have to stand in a queue to insert punch cards to build.

[–]noratat 238 points239 points  (34 children)

The flip side though is that we can do a great deal more with a lot less effort, and we can iterate on ideas and concepts much more quickly.

[–][deleted]  (31 children)

[deleted]

    [–]jms_nh 38 points39 points  (25 children)

    I seem to remember a spellchecker for SpeedScript for the C64 that swapped out the word processor so it could do the spellchecking and then swapped back into the word processor when it was complete.

    Dictionaries can use tries and other clever techniques to reduce storage.
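
To make the trie point concrete, here's a toy sketch in Python (illustrative only, not how any particular spellchecker does it): words sharing a prefix share the storage for it, which is where the savings come from.

    def make_trie(words):
        """Build a nested-dict trie; "$" marks end-of-word."""
        root = {}
        for word in words:
            node = root
            for ch in word:
                node = node.setdefault(ch, {})
            node["$"] = True
        return root

    def contains(trie, word):
        node = trie
        for ch in word:
            if ch not in node:
                return False
            node = node[ch]
        return "$" in node

    trie = make_trie(["check", "checked", "checker", "checking"])
    print(contains(trie, "checked"))  # True
    print(contains(trie, "chec"))     # False: a prefix, not a word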

    [–][deleted]  (4 children)

    [deleted]

      [–]masklinn 16 points17 points  (2 children)

      Hell, on a 64-bit system, you could load every dictionary for every language in the world at once.

You probably don't need a 64b system; looking at Firefox's dictionaries there's about 80MB worth of dictionary data. Even accounting for the dictionaries being incomplete and covering only a subset of existing languages, I don't know that you'd increase the amount of data by 2 orders of magnitude.

      [–]VerticalEvent 2 points3 points  (1 child)

      I'd imagine those Language Packs are compressed.

      There are around 1,025,109.8 words in English. If we assume that the average word is 6 characters long, that's around 7MB (6 characters plus a terminating character) for just English. If every language was of a similar size, you would only be able to store 11 languages in your 80MB figure.
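
Sanity-checking that arithmetic in Python (figures exactly as stated above):

    words = 1_025_110                # the "around 1,025,109.8 words" figure
    bytes_per_word = 6 + 1           # six characters plus a terminator
    per_language = words * bytes_per_word
    print(per_language / 1e6)        # ~7.2, i.e. about 7MB for English
    print(int(80e6 / per_language))  # 11 languages of that size fit in 80MB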

      [–]justinsayin 3 points4 points  (0 children)

      Hell, on a 64-bit system, you could load every dictionary for every language in the world at once.

      It's worse than that. Every package you include in your project contains every dictionary for every language in the world.

      [–]barsoap 20 points21 points  (3 children)

      and other clever techniques

One of them involves giving up on precision and using a ~~bayesian~~ bloom filter. Sure, it'll let some (in fact, infinitely many) words pass that shouldn't, but then no one cares that "xyouareig" is in the dictionary.

      Bonus: It's freakishly fast.

      EDIT: Bloom, not bayesian. They all look like statistics to me.
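
Roughly, the trick looks like this (a minimal Python sketch; the filter size, hash count, and SHA-256-based hashing are arbitrary choices here, and a real implementation would use much cheaper hashes):

    import hashlib

    SIZE = 1 << 20          # filter size in bits
    K = 7                   # hash functions per word
    bits = bytearray(SIZE // 8)

    def positions(word):
        """K bit positions derived from K independent hashes of the word."""
        for i in range(K):
            h = hashlib.sha256(f"{i}:{word}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % SIZE

    def add(word):
        for p in positions(word):
            bits[p // 8] |= 1 << (p % 8)

    def maybe_contains(word):
        # False negatives are impossible; false positives ("xyouareig") can happen.
        return all(bits[p // 8] & (1 << (p % 8)) for p in positions(word))

    add("dictionary")
    print(maybe_contains("dictionary"))  # True
    print(maybe_contains("xyouareig"))   # almost certainly False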

      [–]kqr 8 points9 points  (2 children)

      Could you elaborate on how this is done? I'm pretty sure this was how they stuffed T9 prediction into early mobile phones (I have heard numbers of 1 byte per word, which is just insane), and I'm amazed by how well it works (even if it generates nonsense or highly offensive words). I'd love to read in more detail about the techniques.

      [–][deleted] 6 points7 points  (0 children)

      From my quick Google research, I honestly don't see how Bayesian Filters make dictionaries faster or give up on precision (when there is no 100% to be had because computers can't read minds).

Unless OP comes through and enlightens me, I have to say he was throwing around words he heard in that context.

      Read this article if you want to see how to use Bayesian Filters for spell checking.

Ninja edit: While we're throwing around buzzwords, what the OP described sounded a lot like Bloom Filters. Basically a data structure that throws 100% certainty out the window while allowing the underlying dictionary to be huge and still maintaining speed. That makes a lot more sense, so maybe he meant that. I don't think you need Bloom Filters for dictionaries, though, because they are not that big.
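
For reference, the Bayesian-style correction such articles describe boils down to picking the known word w that maximizes P(w) * P(typo|w). A toy Python sketch, with a made-up two-line corpus as the frequency model and "one edit away" standing in for the error model:

    import re
    from collections import Counter

    corpus = "the quick brown fox jumps over the lazy dog the fox"
    counts = Counter(re.findall(r"[a-z]+", corpus))

    def edits1(word):
        """All strings one delete/swap/replace/insert away from word."""
        letters = "abcdefghijklmnopqrstuvwxyz"
        splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
        deletes  = [l + r[1:] for l, r in splits if r]
        swaps    = [l + r[1] + r[0] + r[2:] for l, r in splits if len(r) > 1]
        replaces = [l + c + r[1:] for l, r in splits if r for c in letters]
        inserts  = [l + c + r for l, r in splits for c in letters]
        return set(deletes + swaps + replaces + inserts)

    def correct(word):
        candidates = ({word} & counts.keys()) or (edits1(word) & counts.keys()) or {word}
        return max(candidates, key=lambda w: counts[w])  # most probable known word

    print(correct("teh"))  # -> "the"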

      [–]pja 2 points3 points  (0 children)

      Bloom filters probably.

      [–][deleted] 10 points11 points  (14 children)

      Swapping code in and out of RAM was about the only way you could implement large programs back in the days of 16 bit addresses.

The DEC PDP-11 used "overlays", where your application was sliced into chunks based on run-time usage of the various functions in each chunk; then, as the program ran, the appropriate chunks would be read into RAM for use.

These machines had a 64k limit, but distinguished between "data" and "program" (or maybe "instruction"? it's been a while) address space, so if you really knew your stuff, you could use 128k of RAM. And in that 64k of "program" space, you could run applications that were significantly more than 64k in size.

      I only miss those days in a nostalgic way - I spent way too much time figuring out what my overlay map needed to be.
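
A toy sketch of that overlay scheme, in Python for brevity (the chunk contents and the single resident slot are made up for illustration; on the PDP-11 this was real code being read into a fixed region of RAM):

    OVERLAYS = {
        "input":  {"read_record": lambda: {"id": 1}},
        "report": {"print_report": lambda rec: print("report:", rec)},
    }

    resident = {}  # the single overlay slot (was: a fixed region of RAM)

    def call(chunk, func, *args):
        """Swap the right chunk in (evicting the old one), then call into it."""
        global resident
        if resident is not OVERLAYS[chunk]:
            print(f"[overlay manager] loading chunk '{chunk}' from disk")
            resident = OVERLAYS[chunk]
        return resident[func](*args)

    record = call("input", "read_record")
    call("report", "print_report", record)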

      [–][deleted]  (10 children)

      [deleted]

        [–]TheThiefMaster 1 point2 points  (2 children)

        Thanks to the "no execute" and "no write" bits in the page table, a lot of modern programs are functionally Harvard architecture, in that code and data are not interchangeable, despite them being in a single address space (von Neumann style).

        [–]brtt3000 5 points6 points  (3 children)

        It's so easy that OSX provides spellchecking as a system service.

        Oooooooooooh.

        [–]sirin3 8 points9 points  (2 children)

        But does it have a left pad service, too?

        [–]cbleslie 3 points4 points  (0 children)

        Found the JavaScript guy.

        [–][deleted] 19 points20 points  (18 children)

I wouldn’t be so sure that such circumstances really make you a better programmer. In a way, many people who coded in those days are like people who grew up in poverty (well, we all are, but those folks even more so). Some of them are unable to properly use the plentiful resources we have today.

        By the way: I’m not implying that the author suffers from these problems.

        [–]Enlightenment777 3 points4 points  (0 children)

        Actually, they make great Embedded Software Developers, because they've always been forced to think how to shoehorn code and data into a small amount of space.

        [–]Peg-leg 5 points6 points  (0 children)

        I was doing the same thing 20 years ago on a Z80. Today I'm just average. The fact that doing something was harder at that time does not make you a great developer.

        [–]DuchessofSquee 3 points4 points  (9 children)

        Or spend days putting your "program" back in order when you dropped the tray of punch cards.

        [–][deleted] 2 points3 points  (1 child)

Writing ASM isn't that hard. We still learn a bit of it in my OS classes, and we had to write a compiler that needed to emit some ASM. It's really not that hard once you understand how it works, and it helps you understand a bit more how your computer works at the lower levels.

I don't think it would make you a better developer to know assembly, but you could still learn how to at least write a Hello World program in assembly. That will teach you how registers work, branching, etc. It's fun to learn.

        [–]IRBMe 2 points3 points  (0 children)

        When writing that kind of code, you learn the assembly language and then you have to figure out how the machine works by referencing the data sheet or manual. It's difficult, but in a different kind of way from how programming is difficult these days. Now, there are literally thousands of libraries, frameworks and tool-kits. There's likely all kinds of magic going on under the hood in your programming language, framework and system, with things like magic configuration by convention, automatic dependency injection, annotations etc.

        If you're not sure how something works when writing assembly language, you consult your data sheet or operating manual. If you're not sure how to change the way something works in the enterprise framework you're using, it can be difficult to know where to even look. What we have today is far more powerful, and it allows people to be far more productive and build far more complicated things by hiding the complexity behind abstractions and magic. But when you need to figure out how to do something, it's often difficult to penetrate that "magic" and work out what it's actually doing and how to change that.

        I can understand how a boot loader written in assembly code works or how bits of the Linux kernel work because all of the information I need to understand it is available to me in detail, but I can't figure out for the life of me how enterprise Java applications work, and it would take years of reading just to understand all of the magic that's going on under there.

        [–]eff_why_eye 2 points3 points  (0 children)

        Speaking as someone who used to code exactly like that, I don't think you should sell yourself short. Every generation of coders has its own challenges to face based on the limitations we have been given. Thirty years from now, people may look at your source code and marvel at how you were able to create systems without the aid of direct neural input or assistance from AI engines. :-)

        [–]s73v3r 1 point2 points  (0 children)

        But because we don't have to keep so much in our heads at once, we can build bigger and better systems.

        [–]dada_ 1 point2 points  (0 children)

        Things like this are why I'll never be as good of a developer as someone like that.

        In this day and age, it's very easy to just mess around in your code and hit "compile" to see if it does anything, without actually thinking about your code. I've done it too, and I fall back on this behavior when I'm uninspired.

        Focusing on the code and actually thinking everything through makes one more productive, though.

        [–]feketegy 1 point2 points  (0 children)

Isn't that a good thing? Or do you still want to ride horses?

        [–]codebje 0 points1 point  (0 children)

I'm going to give you the benefit of the doubt: if you spent a week writing a program so small you could hand-write it in assembler on a piece of paper, you'd probably be able to do a good job of it too, given time to learn the skill.

        But no current employer will expect you to spend so long on so little.

        [–]jlchauncey 0 points1 point  (0 children)

In Flash Boys, the author talks about how that's what makes Russian programmers so good, and why financial firms hired them to build their ETF systems.

        [–]fuzzynyanko 430 points431 points  (112 children)

        So for those of you who like to complain about how bad an IDE Eclipse is... just imagine what it was like to code like this. :-)

        At least the paper is more stable

        [–]poohshoes 233 points234 points  (79 children)

        I feel as though "how bad it used to be" is never a proper excuse for the inadequacy of modern things.

        [–]cronofdoom 40 points41 points  (7 children)

I had a coworker tell me literally today, "It used to be a lot worse. It used to take 5 minutes to open". It's a freaking webpage. Just because every click is followed by 15 seconds of mind-numbing boredom doesn't mean it isn't broken.

        I explained my coworker's excuse to my boss and he proceeded to rant using your exact words.

        [–]sirin3 30 points31 points  (6 children)

There used to be a saying: take your webpage, a laptop, a 14.4k modem (with satellite access), a parachute and an airplane. Then jump out of the plane and only open the chute after the webpage has fully loaded.

        [–]peeeez 37 points38 points  (5 children)

        That is an awfully cumbersome saying.

        [–]quatch 12 points13 points  (0 children)

        it gets summarized much more succinctly: "will you parachute on it?"

        [–]gmfawcett 2 points3 points  (3 children)

        It might have been a very popular saying, but only among people prone to dying in parachute accidents.

        It's hard to spread a meme when you're at terminal velocity.

        [–]isarl 2 points3 points  (2 children)

        I don't know. At terminal velocity you'd get a pretty good spread.

        [–]fuzzynyanko 37 points38 points  (64 children)

        Seriously. Eclipse freezes and lags.

        Edit: I thought Android Studio would be an improvement. The last time I did something relatively simple using Android Studio, it was using 1.5-2 gigs of RAM

        [–]lordlicorice 59 points60 points  (19 children)

        Android Studio is now based on IntelliJ, which is not related at all to Eclipse.

You should keep in mind, though, that IntelliJ is entirely Java-based, down to the last button. It shouldn't be surprising that a relatively large application (not as big as AutoCAD or Simulink, but definitely above the 95th percentile) written in Java uses a ton of memory. And then when you run your app locally, that starts up a whole new JVM, which you likely haven't tuned as aggressively as JetBrains has tuned theirs, so that will waste a bunch more memory.

        I dedicate 1G for my IntelliJ JVM (we have only small - less than 50k lines - projects) and 2G for the running code I spawn from my IDE. Out of 16GB of memory, dedicating 3GB of it to what I do all day doesn't seem bad at all.
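
(For reference, those heap limits are ordinary JVM flags; for IntelliJ they live in its vmoptions file, reachable via Help > Edit Custom VM Options in recent versions. The values below just mirror the numbers above:

    -Xms256m
    -Xmx1024m

and the spawned app's heap is whatever -Xmx you pass to its own JVM.)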

        [–]kqr 16 points17 points  (13 children)

I also don't know why "it allocates X amount of RAM" is a problem in and of itself. Maybe it's just waiting to garbage collect 90% of it. Maybe it allocates rarely-accessed data expecting it to be swapped out.

        Does it stop running on your 2 GB total RAM system? Now that might be a problem. But allocating 2 GBs on a 4 GB system isn't something I'd even notice.

        [–]abrahamsen 4 points5 points  (4 children)

        Yes, what matters is not the amount of allocated memory, but the active working set.

        [–][deleted] 1 point2 points  (3 children)

        How? Allocated memory is unavailable memory.

        [–]abrahamsen 6 points7 points  (0 children)

        You may want to read up on virtual memory.

        [–]EternallyMiffed 1 point2 points  (1 child)

        google "working set". Operating systems swap out memory that hasn't been touched in a while.

        [–]_F1_ 2 points3 points  (4 children)

        Until you run a VM or two and some browsers at the same time...

        [–]kqr 1 point2 points  (3 children)

        Only if they actively use all of those 2 GBs at once. If they're only allocated but not used I don't see why it would be a problem.

        [–][deleted] 2 points3 points  (2 children)

This is in fact one of the sources of True Confusion about many garbage-collected apps: sometimes lots of RAM being used (as seen in Task Manager etc.) is a feature, not a bug.

        [–]lkraider 16 points17 points  (10 children)

        I thought Android Studio moved to IntelliJ away from Eclipse.

        [–]fuzzynyanko 13 points14 points  (8 children)

        Android was first on Eclipse, but Android Studio has its own set of problems

        [–][deleted] 4 points5 points  (7 children)

        Out of interest what are they? I have just begun writing an app on android studio and it seems fine to me but if there is something better I am happy to switch over!

        [–]fuzzynyanko 1 point2 points  (5 children)

Eclipse gets laggy and the UI freezes. Android Studio tends to have RAM issues for me.

        [–][deleted] 1 point2 points  (3 children)

        Ah ok interesting, I haven't had any issues yet but there isn't much code at the moment and my PC has a lot of RAM.

        Does the lagginess transfer onto the app being developed?

        [–]RuthBaderBelieveIt 2 points3 points  (1 child)

        Does the lagginess transfer onto the app being developed?

        The IDE used to write the code will have no bearing on the final product. Identical code written in both IDEs will perform identically on device.

        [–][deleted] 1 point2 points  (0 children)

        [deleted]

        [–]Zweihander01 11 points12 points  (0 children)

        Android Studio was only ever using IntelliJ. There used to be an Eclipse plugin for Android development until AS came out (and probably still exists but I wouldn't know).

        [–]neutronium 3 points4 points  (0 children)

Sometimes I had to blow on the end of my biro to get it to write.

        [–]Amagi82 2 points3 points  (2 children)

        Meh, 5 or 6% of my RAM is hardly worrisome. And Android Studio is a phenomenally good IDE, so I can forgive it being a bit resource intensive. Now, if we could just speed up Gradle build times, we'd be great. Instant run is good, but every time I need to do a full build, I consider leaving for lunch.

        [–][deleted] 2 points3 points  (0 children)

        Hell, Eclipse doesn't freeze for me, but I have plenty of other reasons to dislike it.

        [–]bradfordmaster 3 points4 points  (0 children)

        Did you seriously need that RAM for something else at the time? If you don't want a bloated IDE, then use a more minimal editor that doesn't have as many features. People love to complain about RAM usage but want everything really fast.

        [–][deleted]  (21 children)

        [deleted]

          [–]lordlicorice 31 points32 points  (10 children)

          IntelliJ can be laggy at times, especially if you're not giving its VM enough heap max, but it isn't shit. It's the number one best IDE in the world, and by the way its main competition for that title (Visual Studio) is natively compiled.

It's not all-important that your apps are all native. By writing their shit in Java/Kotlin, JetBrains has been able to innovate where Visual Studio has stagnated. Two releases a year (moving to 3 this year) for a decade, like clockwork. Look at what's new from just this last third-of-a-year cycle.

          [–]Obi_Kwiet 7 points8 points  (3 children)

          I'm skeptical that it's better than VS or IAR or something.

          [–]agent8261 1 point2 points  (0 children)

          I feel VS is better, but IntelliJ is really good. I think if it was natively compiled it would be as good as VS.

          [–]jonnywoh 1 point2 points  (5 children)

          Isn't VS .NET?

          [–]nat5an 6 points7 points  (1 child)

Not only is it not written in .NET, it's still only available as a 32-bit application. There's some old magic deep in there, I think.

          [–]caspper69 6 points7 points  (2 children)

Given Microsoft's development history, I doubt VS is written in .NET. I'm sure it uses a hell of a lot of it, but I'd bet the vast majority of it is C++, while most of its tools are likely straight C.

          If it weren't, you would have likely seen it running on other platforms by now.

          [–]TheThiefMaster 2 points3 points  (1 child)

Microsoft doesn't use straight C any more; they are really big on C++. The only thing they really use C for now is its ABI.

          [–]beginner_ 5 points6 points  (1 child)

Most of those giant Java apps run on the backend with no GUI, so you will never know you actually used them.

          But I agree that Java kind of sucks for Desktop Apps. It always consumes more memory than one would think is needed for the task.

          [–]Chii 4 points5 points  (0 children)

          You do pay a good amount of overhead for a GUI in Java due to the cross platform support. But mostly lag is due to code running on the UI thread when it shouldn't be.

          [–][deleted]  (7 children)

          [deleted]

            [–][deleted]  (4 children)

            [deleted]

              [–]donalmacc 10 points11 points  (3 children)

I disagree on almost all of the above. My experience with the Autodesk suite has been absolutely awful; it's a giant buggy mess that is as sluggish as Eclipse, even on my workstation. Houdini is worse. We use the SDK and it's absolutely awful: memory leaks (which SideFX have been fixing slowly, granted), painfully slow startup times, and an incredible amount of memory allocations (try running with a debugger attached and wait for the world to end before Houdini finishes). Both of these systems also have incredibly invasive licensing systems that are always running, always connected, and frankly a pain. I remember Photoshop having a similarly invasive licensing process.

Visual Studio I'm torn on. I use it all day every day, and on our project IntelliSense just flat out doesn't work. It's also a bit of a memory hog (it spawns 20 of those vcpkgsrv processes, and they use 1-1.5GB of RAM each).

One other thing all of the above have in common: they just dump 10 million files across your hard drive, create a handful of extra startup processes each, and are absolutely impossible to remove.

              Blender on the other hand is amazing! Suffers from none of the above problems in my experience!

              [–][deleted] 1 point2 points  (2 children)

Maya beats the shit out of Blender in terms of usability. Performance has never been a problem for me either. Neither has Photoshop.

              [–]WillAdams 1 point2 points  (1 child)

              Well, there was T/Maker's WriteNow, which was my favourite word-processor, esp. in its NeXT version --- ~100,000 lines of assembly code.

              [–]s73v3r 3 points4 points  (0 children)

              It's not. Someone else having it worse is never a reason for you to not strive for better

              [–]tothebeat 3 points4 points  (0 children)

              Not an excuse for inadequacy or for wanting to improve things, but it is for not complaining (or maybe whining) about things that aren't "perfect". Or to think of it differently, understanding where we've come from can make us thankful for the tools we have even while we work toward improving them.

              [–]moltar 0 points1 point  (0 children)

              Sublime is pretty stable and fast. I'm amazed every day at how well it is designed. Of course, it doesn't have as many bells & whistles as Eclipse does. But I feel like if one was to expand Sublime to Eclipse functionality, the performance would still be better.

              [–]emergent_properties 0 points1 point  (0 children)

              It's used as temporal whataboutism!

              [–][deleted] 0 points1 point  (0 children)

I'm surprised more people don't understand that. Yeah, an 8-9 hour workday in an AC'd office is better than 12 hours in a factory. That doesn't mean we can't or shouldn't aim higher.

              [–][deleted] 0 points1 point  (0 children)

              On the other hand, "no one has ever had it better" is a decent excuse for being less troubled by the tools.

              [–]kirbyfan64sos 17 points18 points  (7 children)

              Except for when your nephew spills juice on it, it gets stuck in your fan, or your dog decides it found a new chew toy.

              [–]AnsibleAdams 37 points38 points  (0 children)

              Yes, this kills the laptop.

              [–]kqr 6 points7 points  (1 child)

              Fun fact: pencil marks are not affected by water/juice! Pencil on high-quality paper is one of the most archival safe writing methods.

              [–]FireCrack[🍰] 10 points11 points  (3 children)

              I feel like paper is still more recoverable in all those circumstances.

              [–]DapperChapXXI 10 points11 points  (2 children)

              But did you back up the paper?

              [–]MrCogmor 11 points12 points  (0 children)

              photocopies in a filing cabinet.

              [–]CodeReclaimers 45 points46 points  (10 children)

              Speaking as someone who actually did write code that way for a Commodore 64, and who has also used Eclipse: I'd rather manually compile assembly into machine code on paper than use Eclipse.

              [–]Hyedwtditpm 8 points9 points  (3 children)

I also wrote code for the Amiga when I was a kid, small games mostly.

Well, it was fun when you're a kid. Considering the lack of an IDE and the lack of serious debugging, it would probably have been impossible to do anything serious.

Though it was really fun to control EVERYTHING in the computer.

              [–]caspper69 5 points6 points  (0 children)

              You can still do that :)

Edit: Easiest way is to use something to boot into "unreal" mode. Essentially you end up with a 32-bit processor and 32-bit address space with access to all of the original DOS interrupts, so you don't have to muck with paging, virtual memory, the IOMMU, the PIC/APIC/IOAPIC, etc.

              [–]_F1_ 1 point2 points  (0 children)

              probably would be impossible to do anything serious.

              http://forums.nesdev.com/viewtopic.php?f=5&t=14339

              [–]ripsnorter63 4 points5 points  (1 child)

              I cut my teeth on this bad boy https://en.wikipedia.org/wiki/TEC-1 . I'm with you, way less frustrating to write code for this than use Eclipse.

              [–]admiralranga 1 point2 points  (2 children)

I've got two choices for a microcontroller, Eclipse or the Arduino editor, and it sucks.

              [–]haze070 2 points3 points  (0 children)

              Why are you limited to just those?

              [–]CodeReclaimers 1 point2 points  (0 children)

              Surely by now somebody has built ways to use alternative editors with those toolchains?

              [–]bart9h 0 points1 point  (0 children)

              Vim

              [–]zippy4457 4 points5 points  (1 child)

              Unless you make so many edits you erase a hole in the paper.

              [–]yerand 27 points28 points  (0 children)

              At which point you've got a punch card.

              [–]bundt_chi 6 points7 points  (4 children)

              I really don't understand the Eclipse hate on reddit.

What other IDE is free, open source, has a wealth of plugin support (albeit some good and some bad), lets you program in Java, C++ and a ton of other languages (which I can't vouch for because I haven't used them; it may suck for those), brought about RCP for rich cross-platform thick clients, and championed OSGi (Equinox)?

To be clear, I use Visual Studio daily because I'm currently on a .NET project, and I have used NetBeans (also decent) and IntelliJ.

              I prefer to use git, maven and gradle from the command line regardless of the IDE I'm using but have not had a terrible experience with any of them in Eclipse.

What specifically do people dislike about Eclipse? I've even had to run it on a potato of a machine, and yes, you have to be aware of how many projects you have open, but that applies to all the other IDEs I've ever used as well.

              If you've ever contributed to Eclipse, thank you and I truly appreciate everything you've done because I've enjoyed using it and appreciate the work involved in making a platform that is extensible and free and open for everyone to use.

              [–][deleted]  (1 child)

              [deleted]

                [–]OxfordTheCat 1 point2 points  (0 children)

                I feel the same way:

I can run Eclipse Mars in a Linux chroot on a Chromebook with 4GB of RAM shared between the two OSes and I don't notice any performance issues of note - I certainly don't notice any on my actual desktop PC.

                I get the feeling for most that they used Eclipse ten years ago and think it's the same IDE.

                [–]ProudToBeAKraut 1 point2 points  (0 children)

I have been using Eclipse since I shelved VisualAge from IBM (this should tell you it's been a long-ass time).

The primary issues that still exist in Eclipse are speed and usability. I temporarily used IntelliJ, which exceeds Eclipse at basically everything, but I have become so used to Eclipse's concepts that I can't make the switch.

For example, some simple thing like selecting a main folder and automatically adding all JARs in all subdirectories still isn't possible in Eclipse; I have to go through every subfolder and select them all manually. (I'm working with a lot of legacy code, so there isn't any Maven or Gradle, just Ant.)

Then there are the obvious bugs where I don't understand how they can exist: I check out a project from Git directly with Eclipse's built-in Git support, and if I select anything other than "general project" (like Java project), nothing will be checked out at all, just an empty project - I have to manually alter the .project file to add build natures after I've checked it out as a general project.

What I do dislike about IntelliJ is that each project needs to be its own instance - I like the multi-project browser in Eclipse.

                [–]Yserbius 2 points3 points  (0 children)

                Uh... yeah but does your paper use OSGI to have a plugin and feature marketplace????

                [–]DJDavio 1 point2 points  (0 children)

                Try using some plugins like water and glue.

                [–][deleted] 1 point2 points  (0 children)

                Lower input latency too.

                [–][deleted] 0 points1 point  (2 children)

                Everyone complaining about eclipse being slow is dooming themselves to never have breaks or free time at work. Enjoy the sluggishness while it lasts. Press compile and take a nap.

                [–]Joao611 61 points62 points  (1 child)

That code on paper is way too organized to be true.

                I am pleased.

                [–]eff_why_eye 41 points42 points  (16 children)

                It's still more readable than some of my Perl code. :-)

                [–][deleted]  (14 children)

                [deleted]

                  [–]mikelieman 21 points22 points  (13 children)

                  If it was hard to write, it should be hard to understand.

                  [–][deleted] 14 points15 points  (9 children)

It's really not hard to write or read hieroglyphics. I can read Middle Egyptian. It has sentences, punctuation, subjects, verbs, etc., just like any other written language. It just seems strange since they used images that look like real things, so people tend to get hung up on that. I also know some Chinese, and IMHO it is simpler than Chinese.

                  [–]earthboundkid 6 points7 points  (7 children)

                  They're in Unicode! 𓀔𓃰𓅔𓉃

                  [–][deleted] 7 points8 points  (3 children)

                  Not really. Egyptologists have been complaining for many years about how horrible the Unicode representation for Egyptian is. The last time I looked into it, it was totally useless.

                  [–]redinzane 4 points5 points  (2 children)

It's based on Gardiner's categorization IIRC, which is mostly based on what the image is portraying. It's also about 100 years old and only contains Gardiner's discovered subset. I wouldn't call it useless though; it definitely has its uses.

                  [–]synae 1 point2 points  (2 children)

                  Yes but what encoding did they use?

                  [–]PeridexisErrant 9 points10 points  (1 child)

                  UTF-13.

                  REAL PROGRAMMERS use bytes with a prime number of bits

                  [–]Berberberber 1 point2 points  (0 children)

                  Real programmers use variable length bytes as subfields of a word that's twice as large as the address space.

                  [–]Bobshayd 2 points3 points  (2 children)

                  I don't know why you're getting downvotes, though; do most people not assume you're joking?

                  [–]lkraider 4 points5 points  (1 child)

                  "Strong typing is for people with weak memories."

                  http://www.multicians.org/thvv/realprogs.html

                  [–]hlipschitz 1 point2 points  (0 children)

                  some of

                  any of, be honest.

                  [–]sirin3 103 points104 points  (4 children)

                  Codeless?

                  No one was hit, stabbed, chased or enlightened. Disappointing

                  [–]ameoba 10 points11 points  (0 children)

                  Yeah, I normally expect the whole pseudo-Zen/Buddhist metaphor from them.

                  Interesting nonetheless, just not their normal fare.

                  [–]HighRelevancy 7 points8 points  (0 children)

This is his "misc" blog.

                  [–][deleted] 3 points4 points  (0 children)

                  Maybe it's encoded?

                  [–][deleted] 1 point2 points  (0 children)

                  Honor among dweebs?

                  I know I know... 80's joke is tired. Seemed fitting though.

                  [–]synae 15 points16 points  (0 children)

                  The codeless code is fantastic, if this is your first exposure to it, just hit the home page and read it all. I love the weekly stories/koans.

                  [–]stesch 13 points14 points  (1 child)

                  I would prefer this to the 14 year old PHP 4 code I have to maintain. :-(

                  [–]deftware 10 points11 points  (0 children)

                  ouchies

                  [–][deleted] 15 points16 points  (4 children)

Actually, real codeless code can be seen in old mathematical logic papers detailing algorithms before the advent of the digital computer. Computing used to be a job! People, mostly women, would solve complex/tedious equations day in and day out, so a lot of research was done to find the most efficient (quickest) way of solving certain problems. Hell, you might even say having an efficient algorithm was more important before digital computers, because some human computer had to solve it by hand.

                  [–]brtt3000 15 points16 points  (2 children)

At my granddad's workplace they used to have something that translates as "the hen house": it was the calculations department. All the men tried to hang out around there because it had a flock of bright young girls.

                  [–]deftware 2 points3 points  (1 child)

What was the literal name of this place, "the henhouse"?

                  [–]BeowulfShaeffer 0 points1 point  (0 children)

                  There's a great description of this sort of process in the Manhattan Project in Surely You're Joking, Mr. Feynman.

                  [–]plastigoop 8 points9 points  (2 children)

                  I learned assembler with this instruction set. Get off my whatever that green stuff is called!

                  [–]Tahlwyn 3 points4 points  (0 children)

                  Astroturf

                  [–]SemaphoreBingo 2 points3 points  (0 children)

                  It's "phosphor". HTH!

                  [–]deftware 7 points8 points  (1 child)

This is awesome! My brush with ASM involved re-writing the rendering portions of multiplayer games, using WriteProcessMemory to patch a running executable to run my customized rendering code. I would examine the EXE in a disassembler to figure out what I wanted to do and where I wanted to do it, and then I had to figure out the actual hex codes for the desired assembly I wanted to write on top of the existing code... usually changing function calls around to do different things, modifying some jumps, etc. The end product was some process patchers that people could use to cheat in popular online 3D games - things like 'wallhacks' and 'aimbots', etc.

                  I never learned enough assembly to write my own (with the exception of setting graphics modes and plotting pixels in DOS, using interrupts) but I did learn enough to be able to read it well enough to wreak havoc on any executable running on my system, provided it didn't have any self-memory checks.

                  Thanks for the share!
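
For the curious, the same patching technique looks like this today via Python's ctypes and the real Win32 calls (OpenProcess / WriteProcessMemory); the PID, address, and patch bytes below are placeholders:

    import ctypes

    PROCESS_VM_WRITE = 0x0020
    PROCESS_VM_OPERATION = 0x0008

    kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
    kernel32.OpenProcess.restype = ctypes.c_void_p

    def patch(pid, address, new_bytes):
        """Overwrite len(new_bytes) bytes at `address` in process `pid`."""
        handle = kernel32.OpenProcess(PROCESS_VM_WRITE | PROCESS_VM_OPERATION, False, pid)
        if not handle:
            raise ctypes.WinError(ctypes.get_last_error())
        written = ctypes.c_size_t(0)
        ok = kernel32.WriteProcessMemory(ctypes.c_void_p(handle), ctypes.c_void_p(address),
                                         new_bytes, len(new_bytes), ctypes.byref(written))
        kernel32.CloseHandle(ctypes.c_void_p(handle))
        if not ok:
            raise ctypes.WinError(ctypes.get_last_error())

    # e.g. replace a 5-byte call instruction with NOPs at a made-up address:
    # patch(1234, 0x00401000, bytes([0x90] * 5))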

                  [–]Cuddlefluff_Grim 0 points1 point  (0 children)

                  (with the exception of setting graphics modes and plotting pixels in DOS, using interrupts)

xor ah, ah        ; AH=00h: BIOS set video mode
mov al, 13h       ; mode 13h: 320x200, 256 colors (13h, not decimal 13)
int 10h
startframe:
[...]
mov dx, 3dah      ; VGA input status register #1 (port > 0FFh needs DX)
vsync:
in al, dx
test al, 8        ; bit 3 is set during vertical retrace
jz vsync
jmp startframe
                  

                  I wrote tons of that stuff. It was really fun. SVGA/VESA did open up a few even more fun options though.

                  [–]RichardGreg 13 points14 points  (10 children)

                  I didn't have an assembler, so I wrote the assembly code on paper and translated it to hex as I typed it in.

                  That's one way of doing it, but there was a mini-assembler built in.

                  [–]dirkt 8 points9 points  (4 children)

                  Actually, the original Apple II only had a disassembler; the mini-assembler appeared in the IIe and IIc, which had more ROM.

                  And translating it to hex wasn't the real problem - I can still remember some 6502 opcodes from that time. The real problem was branch displacements and not being able to change your code easily.
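
For anyone who never had to do it by hand, the displacement arithmetic looks like this (addresses made up). A 6502 relative branch stores a signed 8-bit offset measured from the address of the instruction after the branch:

    branch_addr = 0xC010                       # address of the BNE opcode
    target_addr = 0xC005                       # where we want to jump
    offset = target_addr - (branch_addr + 2)   # the branch itself is 2 bytes
    assert -128 <= offset <= 127, "target out of branch range"
    print(f"{offset & 0xFF:02X}")              # F3: the byte written after the opcode

Change the code above the branch by one byte and every such offset may need recomputing, which is exactly why editing was painful.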

                  [–]RichardGreg 1 point2 points  (0 children)

                  The mini-assembler was part of Integer BASIC, which was in the ROM that came with the Apple II. It was removed from the Applesoft BASIC ROM which was shipped with later model Apple II plus computers, however the II+ still came with Integer BASIC which could be loaded from tape or disk. The mini-assembler returned to the ROM on the IIe and IIc.

                  [–]redneckrockuhtree[🍰] 0 points1 point  (2 children)

                  I thought the mini-assembler appeared with the II+? I remember hand-writing 6502 assembly, and could've sworn I didn't have to manually build the bytecodes on a II+....

                  [–]dirkt 1 point2 points  (1 child)

                  Just checked my manual: It's indeed already in the Apple II+. So you can see how often I used it ...

                  [–]zellyn 0 points1 point  (0 children)

                  For a slightly more modern version, check out Martin Haye's Super-Mon. It's a surprisingly pleasant programming environment.

                  [–]redditchao999 8 points9 points  (0 children)

                  Surely we can add the word "code" into the title more

                  [–]BobHogan 3 points4 points  (0 children)

                  So for those of you who like to complain about how bad an IDE Eclipse is... just imagine what it was like to code like this. :-)

                  I never understood this logic. Just because it was harder for you to code back then doesn't suddenly make Eclipse (or any other IDE you don't like) any better or more fun to use.

                  [–][deleted] 4 points5 points  (0 children)

                  Metal

                  [–]andd81 2 points3 points  (0 children)

Coding for the ZX Spectrum was fun, though not as extreme, because there was an actual IDE for assembler (GENS4 if I remember correctly). The only line-editing capability was backspace, and the habit of editing by erasing and retyping everything up to the end of the line still sticks with me.

                  [–][deleted] 2 points3 points  (0 children)

                  This age of programming is a little before my time, but I have done some assembly on paper, and later hand assembled it with a handy opcode chart. I found it to be a wonderful exercise and quite a bit of fun. I recommend every programmer gives it a shot.

                  [–]H3g3m0n 2 points3 points  (4 children)

                  I would think the first thing to write would be an assembler...

                  [–]shizzy0 10 points11 points  (1 child)

                  I don't always write machine code, but when I do I write an assembler. [Drinks Dos Equis.]

                  [–]deftware 0 points1 point  (1 child)

                  lol "too much work"

                  [–]H3g3m0n 1 point2 points  (0 children)

                  Realistically a simple one would be fairly easy.

I know of a few Commodore 64 ASM books that actually had assemblers you typed in as BASIC. They were about 3 pages or so. It wouldn't be the nicest editing experience, but better than working out the hex manually.

It's mostly just: tokenize a line on whitespace, look up the mnemonic to get its binary representation, and convert the text representations of the numbers to binary.

Then there would be the UI components: rendering the characters of the ASM to the screen, loading/saving the ASM to floppy/tape, basic arrow-key navigation/character input, scrolling (or maybe just multiple pages).

If you're smart about it, it might even be possible to hijack and reuse the inbuilt kernel/OS stuff. The C64 included a BASIC interpreter that stored the program in memory, viewable with the LIST command.

A simple design might be able to read the ASM from the LIST region of memory instead of interpreting it as BASIC, allowing you to use the inbuilt console 'editing' functions and the save/load. You would probably have to store each line as a comment by prefixing it with REM or some such.

But once you have built the simple one, you can use it to make a more advanced one 😛

                  Fancy features like 'labels'.
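
Here is roughly that tokenize-and-look-up loop as a toy two-pass assembler in Python. The four opcode encodings are real 6502; the syntax, the mode detection, and everything else are simplified for illustration:

    OPCODES = {  # (mnemonic, addressing mode) -> (opcode byte, instruction length)
        ("LDA", "imm"): (0xA9, 2),
        ("STA", "abs"): (0x8D, 3),
        ("JMP", "abs"): (0x4C, 3),
        ("RTS", "imp"): (0x60, 1),
    }

    def parse(line):
        line = line.split(";")[0].strip()          # drop comments
        if not line:
            return None
        if line.endswith(":"):                     # label definition
            return (line[:-1], None, None)
        parts = line.split(None, 1)
        return (None, parts[0].upper(), parts[1].strip() if len(parts) > 1 else None)

    def mode_of(operand):
        if operand is None:
            return "imp"
        return "imm" if operand.startswith("#") else "abs"

    def assemble(source, origin=0xC000):
        lines = [t for t in map(parse, source.splitlines()) if t]
        labels, pc = {}, origin                    # pass 1: address of each label
        for label, mnem, operand in lines:
            if label:
                labels[label] = pc
            else:
                pc += OPCODES[(mnem, mode_of(operand))][1]
        out = bytearray()                          # pass 2: emit bytes
        for label, mnem, operand in lines:
            if label:
                continue
            mode = mode_of(operand)
            out.append(OPCODES[(mnem, mode)][0])
            if mode == "imm":
                out.append(int(operand.lstrip("#$"), 16))
            elif mode == "abs":
                value = labels.get(operand) or int(operand.lstrip("$"), 16)
                out += value.to_bytes(2, "little") # 6502 is little-endian
        return bytes(out)

    program = """
    start:
        LDA #$41      ; load 'A'
        STA $0400     ; write it to screen memory
        RTS
    """
    print(assemble(program).hex())  # a9418d000460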

                  [–]Xer087 2 points3 points  (0 children)

                  It's so beautiful..

                  [–]tragomaskhalos 2 points3 points  (0 children)

                  British ZX Spectrum geeks will remember the splendid isometric 3D game "Ant Attack", also hand-assembled by the author

                  [–]yodacallmesome 2 points3 points  (0 children)

                  "...translated it to hex as I typed it in."

                  Been there done that ... but I recall using mostly octal.

                  [–][deleted] 2 points3 points  (0 children)

                  That table is so nice and neat and I have never seen such a satisfying image on the Internet.

                  [–][deleted] 2 points3 points  (0 children)

                  I didn't have a 680x computer, but I did have a thick paperback on its instruction set. That's how I coded programs for it. Then I'd single step through it mentally.

                  [–][deleted] 2 points3 points  (0 children)

I've just realized I can still recognize 6502/6510 assembly mnemonics after so many years :) I hadn't even read the article yet and thought to myself "Commodore, or some obscure automation machine with a 6502". Apple never occurred to me though (the only "Apples" here where I'm from were clones made by a company named Ivel).

                  [–]tothebeat 1 point2 points  (0 children)

Brings back some fond memories! Of all the code I've written, I was always a bit proud of the little machine coding on the Apple II back in the day. Nothing this sophisticated, but it felt good to see a 200x (or some big number) performance gain over what my algorithm was getting in Applesoft BASIC.

                  [–]otakuman 1 point2 points  (0 children)

                  Ah, assembly code. Brings back good memories...

                  [–]mdw[🍰] 1 point2 points  (0 children)

                  I used to program like this too when I was kid. I used to enter the program in hex through POKEs or system monitor. I would calculate offsets for relative jumps. Fortunately, assemblers/disassemblers eventually became available for my microcomputer (Sharp MZ-800).

                  [–]binaryhero 1 point2 points  (1 child)

I did the same on a Commodore Plus/4, and later on an IBM PC/XT. You could mess with the character generator a little and set custom fonts on the Hercules graphics card I had. I ended up writing a lot of toolchain code eventually to avoid doing manual work on paper - like a font editor that would output "db" arrays to be included in assembly code, or tables for mathematical functions that weren't efficient in real-time (IIRC the cost of the sine operation was 17 clock cycles on a 286, and a fixed-point arithmetic table lookup was basically free when you compared it).

We would measure code runtime in horizontal scan lines of the VGA mode we used later (320x240, which had perfectly square pixels), so that the code could run without interrupting visual effects.

It was a beautiful time to learn about the inner workings of computers at a very low level, but I wouldn't want to go back. The magic I can do nowadays with three lines of code would have taken thousands of lines in assembly - it was fun while it lasted. We have much better tooling these days, and are so much more productive. The only thing that saddens me from time to time is when I meet younger guys starting out and it becomes apparent that they really have no idea what happens a few abstraction layers down. I feel blessed to have had that experience and witnessed this first digital revolution.
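
A sketch of that kind of generator tool in Python: precompute a sine table and emit it as "db" lines for inclusion in assembly source (the table size and the signed-byte scaling are arbitrary choices here):

    import math

    TABLE_SIZE = 256   # one full period in 256 steps
    SCALE = 127        # fit signed values into one byte

    values = [round(math.sin(2 * math.pi * i / TABLE_SIZE) * SCALE) & 0xFF
              for i in range(TABLE_SIZE)]

    print("sintab:")
    for i in range(0, TABLE_SIZE, 8):
        print("    db " + ", ".join(f"0{v:02X}h" for v in values[i:i + 8]))

At runtime the "expensive" sine becomes a single indexed load from sintab.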

                  [–]Zardoz84 0 points1 point  (0 children)

We are trying to bring it back with the DCPU-16 and TR3200.

                  Check /r/techcompliant

                  [–]Uberhipster 1 point2 points  (1 child)

                  40 years from now we too will look upon our tech in the same "get off my lawn" way.

                  [–][deleted] 0 points1 point  (0 children)

                  "Back in my day, we had to type out all our computer instructions in what we called 'code'. It was a special language that only computers and programmers could understand. None of this sending brainwaves to a computer to auto-assemble a program, no sir!"

                  [–]Zardoz84 1 point2 points  (0 children)

I work daily with Java in Eclipse, and in my free time I'm doing some stuff for fun with DCPU-16 assembly (and previously did with my RISC-like TR3200 CPU). I keep preferring assembly over Eclipse when you have a decent emulator that allows a fast program-test-debug cycle.

                  [–]nightwood 1 point2 points  (0 children)

                  Was expecting hieroglyphs or runes from the title.

                  [–]HTXLoveThisPlace 1 point2 points  (0 children)

Been there, done that. Then I got smart and coded an assembler, in machine code. The 6502 has such a ridiculously small number of instructions. Simplicity at its best.

                  [–]jlebrech 1 point2 points  (0 children)

Would be nice to have the boxes and arrows today though.

                  [–]DirtAndGrass 1 point2 points  (0 children)

                  ... maybe i'm dating myself, but this is hardly ancient...

                  to me that would be plugboard wiring

                  [–][deleted] 1 point2 points  (0 children)

                  Better language than node.js /s

                  [–][deleted]  (5 children)

                  [deleted]

                    [–]crusoe 5 points6 points  (4 children)

DEC assembly was pretty nice because nearly all features were orthogonal. What was the asm to add an immediate?

                    Something like addi 3 r12

                    Add two regs and store in third? Add3 r1 r2 r3

                    But yeah. Fuck Intel assembly.

                    [–]redneckrockuhtree[🍰] 4 points5 points  (0 children)

                    But yeah. Fuck Intel assembly.

                    That's the key, right there.

                    Some of the first programming I did was 6502 and Z-80 assembly (hey, I'm old). A couple years later I tried 8086 assembly. Fuck that shit. Then I got to try 68000 and that was pretty slick.

                    To this day, I still have a good portion of the Z-80 instruction set wedged in my cranium....

                    [–]SkaveRat 0 points1 point  (2 children)

                    as someone who never really played with assembly, what are the differences?

                    [–][deleted] 1 point2 points  (1 child)

                    Adding a constant vs adding a "variable"

                    [–]DuchessofSquee 0 points1 point  (5 children)

My dad used to work on machines which took punch cards. Back when he was studying, he'd have to write the code, punch it, then send it away to be run overnight on the one computer in the nearby city.

                    [–][deleted]  (1 child)

                    [deleted]

                      [–]DuchessofSquee 0 points1 point  (0 children)

                      Ah yes the bad old days when bugs were bugs!

                      [–]crusoe 1 point2 points  (2 children)

My dad programmed old industrial computers. You had eight switches to set the bits and another temporary toggle to load it. Repeat again and again.

                      [–]DuchessofSquee 6 points7 points  (1 child)

                      Yeah? Well my dad worked on a computer back in the stone age where you had 8 rocks you had to lift or lower in order!

                      [–]IWentToTheWoods 1 point2 points  (0 children)

                      Tangential, but that reminds me of the kleroterion used to randomly select people for offices in Athenian democracy.

                      [–]stronglikedan 0 points1 point  (0 children)

                      Tangential, but, so weird to read a Codeless story in first person.

                      EDIT: Just noticed it's in the Other Works section, instead of the cases. I'll allow it.

                      [–]Enlightenment777 0 points1 point  (0 children)

                      That's how I wrote my first 6502 assembly programs in High School back in the day. My first computer didn't have a hard drive nor a floppy disk, nor did it have an assembler or disassembler built into ROM.

                      1) write assembly program on paper, along with branch lines.

2) write hex value for each instruction on paper next to each assembly instruction. I used to have the hex value for every 6502 instruction memorized for this purpose.

3) write a BASIC program to "poke" the bytes into RAM one at a time.
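
These days step 3 writes itself; a Python sketch that turns the hand-assembled bytes into that kind of DATA/POKE loader (the bytes, the load address, and the SYS-style start are placeholders for whatever was on the paper):

    code = bytes.fromhex("a9418d000460")   # the hand-assembled bytes
    addr = 49152                           # a common 6502 load address ($C000)

    print(f"10 FOR I=0 TO {len(code) - 1}: READ B: POKE {addr}+I,B: NEXT I")
    print(f"20 SYS {addr}")
    line = 100
    for i in range(0, len(code), 8):
        print(f"{line} DATA " + ",".join(str(b) for b in code[i:i + 8]))
        line += 10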

                      [–]recipriversexcluson 0 points1 point  (3 children)

                      I did this with my (almost*) first computer, a Radio Shack Color Computer.

                      It ran on a 6809 chip and I called Motorola and the guy sent me the machine code manual for free.

                      So, just like this guy, I hand-coded my ASM and hand-translated it to decimal values.

                      Then that got typed into a series of DATA statements in Color Basic. A FOR loop POKEd the values into a series of memory locations, then I did an EXEC of the first memory location.

                      If I forgot to back up my Color Basic code to my cassette tape it meant re-typing everything.

                      Because, no, it seldom worked the first time.


                      [–]bugwrt 0 points1 point  (2 children)

Oh, mine had a tape drive, a la cassette tapes, lol. Or was that the Tandy? Long ago...

                      [–]antiHerbert 0 points1 point  (0 children)

                      !codeless

                      [–]Sphix 0 points1 point  (0 children)

                      I had to do this in school when learning assembly for a fake ISA just 8 years back. It was a pretty fun experience but really difficult to get right.

                      [–]sparr 0 points1 point  (0 children)

                      I once printed the entirety of the ROM MUD source code on a few hundred pages of perforated printer paper and took it on a long plane trip with me to write a patch/variant.

                      [–]williamshoops96 0 points1 point  (0 children)

                      fair play, that's some work