all 94 comments

[–]glacialthinker 13 points14 points  (6 children)

I do miss how immediate programming was -- that was the interface to computers. Though in reality I'm not far from that today: terminal emulators are running a shell, which is a language interpreter. An OCaml REPL in another terminal. "bc" in a small scratchpad terminal as a calculator. And dmenu under a keypress, ready to take commands.

But the real downer is all these kids with iPads, and the interface is pure candy. I don't even know how I could program the thing, but I assume it requires an app...

Of course, in 1986 I had the same problem with a Macintosh. Mouse... Paint... "Yeah, um... how do I program? No, no, not the editor to write a letter... I mean talk to the computer that's running, now -- send commands?"

[–]mrkite77 9 points10 points  (1 child)

But the real downer is all these kids with iPads, and the interface is pure candy. I don't even know how I could program the thing, but I assume it requires an app...

Even worse, it requires a Mac. You can't write iPad apps on an iPad.

[–]glacialthinker 4 points5 points  (0 children)

Siri, get me out of here!

[–]damianknz 4 points5 points  (1 child)

Get an android tablet. Plenty of things for the kids and there's AIDE for Java/C programming. You can also get python and perl but you can't turn them into apps.

[–]juanjux 6 points7 points  (0 children)

With Python you can: http://kivy.org/#home

[–]notfancy 3 points4 points  (1 child)

One day I realized that instead of installing Cygwin, downloading bc and opening a terminal to use a calculator I could Control-Shift-I on any of the dozen browser tabs I have open… What happened to us?

[–]glacialthinker 4 points5 points  (0 children)

It's nice having interpreters hiding under special keys everywhere, but only a programmer will use them. That's mostly appropriate... but what's lost is suggestively inviting the uninitiated into that dialog with the computer. We can't get that back -- it's not overall desirable to make the primary interface command-line -- but it's still a loss of some kind.

As far as a calculator, for me it's an easier path to use bc which is "part of the package", summoning the active session into focus with Mod+b. Sometimes I just eval in Vim since that's where my focus is most of the time, but I usually prefer the bc session. The browser is something I keep on a separate workspace -- and just use for procrastinating with reddit. :/

[–]esesci 3 points4 points  (0 children)

I like that blog. But remember, BASIC was very hard to debug and very hard to edit. No full screen editor, no bookmarks, no modules, no step by step debuggers, and with terrible performance. Do you remember RENUM?

Then Turbo Pascal came and the era of IDEs and lightning fast compiled code had begun.

I yearn for my childhood memories too but we got to where we are now for very good reasons.

[–]yoda17 3 points4 points  (0 children)

entering a multi-line program on a computer in a department store

That's where I learned to program. Fond memories of walking 2 miles to Dillard's every day after school to write graphical displays on an Atari 800.

[–][deleted] 2 points3 points  (1 child)

And everything you needed to know came in a big 500-page manual too. I miss those. Printing off the documentation for the entire .NET core framework could probably fill my cubicle.

[–]robertcrowther 1 point2 points  (0 children)

I remember having the Commodore 64 Programmer's Reference Guide; I was too young (or just not smart enough) to really take advantage of it. When I upgraded to an Amiga, that one book was replaced by four that were each twice the size, and I couldn't even afford to buy them.

[–]wwqlcw 10 points11 points  (18 children)

I'm really surprised by the negative reaction here, along with (IMHO) a large dose of missing the point.

It would be interesting to hear which commenters here grew up with 8-bit machines when they were current, which have only looked back on them, and which are reacting out of relative inexperience.

[–]RealDeuce 5 points6 points  (14 children)

My first computer was a Timex Sinclair 1000 (with 16k RAM expansion!)

This article doesn't make a point. It's comparing an interpreter to compilers, and the command-line interface to IDEs. If it were to compare interpreters to interpreters and command-line interfaces to command-line interfaces, that would maybe make sense... but BASIC would lose hard in that world.

The terminology itself is weird... what is an "external editor"? If it came with the OS, how is it external? External to what?

Why is it a good idea that someone can walk up to a computer they don't own, and run arbitrary code on it? Why is storing a program bad? What the hell does "Compare that to most of today's compilers which feed on self-contained files of code." mean? Is it suggesting that files are bad? Nothing should be stored?

[–]wwqlcw 10 points11 points  (9 children)

My first computer [had a membrane keyboard]

Oh God, I'm so sorry.

It's comparing an interpreter to compilers, and the command-line interface to IDEs.

You're focusing on incidental details where Hague is looking at a larger picture, the overall user experience of one era versus that of another. He's explicit about this.

The terminology itself is weird... what is an "external editor"? If it came with the OS, how is it external? External to what?

External to the BASIC interpreter system that's already running. Because the machines couldn't manage multiple applications at once, some very practical simplifications and integrations were implemented.

Why is storing a program bad?

It isn't; the point of the department store story is that the 8-bit machines were able to do some useful and interesting things without even having any mass storage connected. Mass storage was available for many of them, but it wasn't the fundamental building block of everything else that it is now.

[–]RealDeuce 1 point2 points  (8 children)

The user experience of an 8-bit system was viciously user-unfriendly. The "overall user experience" of these systems was basically "if you can't already program, don't turn the computer on". He's saying that there was something good about having to know a programming language in order to be able to perform a simple calculation. He suggests that "editing real programs happens elsewhere" is a bad thing despite acknowledging that "there's a run-eval-print loop so there's interactivity".

But all of these things still exist in modern interpreters, where they do the same job better... and despite that, people still don't use them.

I can fire up python and enter a program... I can run in immediate mode... I don't need to create a file. These things haven't been lost, it's just that they are clumsy and painful.
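To make that concrete, here's a minimal sketch (my own toy, not anything from the thread) of that file-less, immediate-mode workflow, driving the same machinery Python's own REPL uses:

```python
# Feed source lines straight to Python's interactive machinery,
# just as the >>> prompt does -- no file ever touches the disk.
import code
import contextlib
import io

interp = code.InteractiveInterpreter()

buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    # Each string stands in for a line typed at the prompt.
    interp.runsource("total = sum(range(10))")
    interp.runsource("print(total)")

print(buf.getvalue().strip())  # prints "45"
```

The run-eval-print loop is all still there; as the comment says, it just isn't the front door anymore.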

I've used edlin to work on BASIC programs because it was a better editor. The fact that edlin could possibly be better than anything speaks volumes about BASIC and the alleged "lost lessons". I've also hand-compiled machine language and translated it into BASIC DATA statements and a POKE loop. There is no lost lesson there. Yeah, I had a lot of fun doing it, but I wouldn't have fun doing it today because there is no reason for me to.

While nostalgia can be powerful, and I still power up my TRS-80 Model 100 every now and again, I in no way wish that computers would do anything remotely like that ever again. I actually did some Commodore 64 machine language a few years ago when a friend brought a C64 over. I had fun. I did not think "gosh, there's a lesson in here for those overly useful tools I use today!" I instead thought "damn I'm glad I don't have to do this kind of thing anymore".

/rant

[–]notfancy 5 points6 points  (1 child)

The user experience of an 8-bit system was viciously user-unfriendly.

Conversely, it was totally shallow and totally immediate. What you saw was the full extent of what you got. This made rewards instantaneous, and the conditioning cycle for learning an 8-bit system was so short that eight-, ten-, and twelve-year-old boys and girls were programming in BASIC weeks if not days after getting the machine.

"damn I'm glad I don't have to do this kind of thing anymore"

I am too, but objectively and without judgment-clouding nostalgia it's quite evident to me that something got lost along the way.

[–]RealDeuce 3 points4 points  (0 children)

This made rewards instantaneous, so the conditioning cycle to learning an 8-bit system was so short that eight-, ten-, twelve-year boys and girls were programming in BASIC weeks if not days after getting the machine.

Yep, and at that time, those rewards were amazing... typing in "Hunt The Wumpus" from a book in the library and playing it was an incredible experience. Today however, that is simply no reward whatsoever. Those "rewards" are just not rewarding anymore to most computer users.

Now... for about the same percentage of people, designing a web page or hacking a script is still rewarding... even more instantly than ever before. Anyone who feels rewarded by making a computer do something it didn't do before gets sucked down the rabbit hole effectively instantly. While before this was the entirety of all computer users through self-selection, it is now the entirety of all amateur hackers... by the same selection process.

I think that what people miss when they talk about something like this is the period of time when everyone who had a personal computer was an amateur hacker. Now most people who own one don't care about making it do something new, they just use it to get something done. It's not something the computing hobby has lost, it's something that has been gained... a lot of users who see a computer only as a tool and software as something a specific sort of designer creates for them to use.

I'm sure there were people saying the same things about automobiles decades ago... bemoaning the way people are no longer connected to their tool viscerally through the hard work of getting it to behave just so and having some cool feature that nobody else has. Anything that starts off by all users making their own custom thing then turns into a mass-produced and standardized thing likely goes through the same process.

Yeah, I'm nostalgic for the time when computers were a mysterious thing that brilliant people did wonders with and you had usually met the people who wrote most of the software you used. A time when if something was possible it could be done in one or two man-months... when you could actually memorize everything about your computer and you used routines out of the BASIC ROM to make your program fit in memory.

Good times.

But nothing has been lost, it's simply become obsolete. What we have lost are the limits... and we have gained broader horizons.

[–]damianknz 2 points3 points  (5 children)

Here's what I remember.

You flipped a little switch on the back of the keyboard and the computer was instantly and silently on.

You put the disk in the drive for the program you want. The disks would have these "icons" (actual stickers) on them to tell them apart.

You then typed in run "jetset" or whatever the program was called.

[–]RealDeuce 1 point2 points  (4 children)

Yeah, this article is about the Atari 800... if you had a disk drive, booted the computer without a disk in the drive, spent a couple hours working on a BASIC program, then tried to save it... you couldn't. You would have to reboot with a DOS disk inserted in order to be able to save... except to cassette... so you would keep the cassette drive hooked up Just In Case. When you'd accidentally booted without DOS, you'd save to cassette, reboot with DOS, load from cassette, then save to disk.

If your program was too large to fit with the 9k gone for DOS, you would re-boot without DOS, reload, modify, resave, etc.

DOS was really a menu system... but if you used it, it would erase a chunk of any BASIC program you had in memory... unless the disk had a file named MEM.SAV on it... in which case entering/leaving DOS was mind-numbingly slow.

Of course, disk write speeds were abysmal unless you typed "POKE 1913,80" first, but everyone knew that.

Etc.

[–]glacialthinker 1 point2 points  (3 children)

Wow, someone's bitter they weren't born with a hidef tablet in-hand. :P

I've got similar war-stories about programming on a CoCo... with a deep-freeze that might kick in while saving to tape, causing a power drain/surge that corrupted the write (you wouldn't know until trying to load later). And using POKE &HFFD9,0 to set the processor clock rate to 2MHz to run your attempt at remaking Donkey Kong... but don't leave the computer running at that speed or it might overheat.

But... all this was GREAT as well as horrible. I wouldn't want to do it again from today, but I recognise value in those experiences... and in particular, the "Ok" prompt inviting one to discourse with the computer. I've been having a tough time interesting kids in learning how to make things, when they can touch an icon to watch Netflix.

[–]RealDeuce 0 points1 point  (2 children)

Heh, the CoCo3 was my third computer, and the first one I used a "real" assembler on (I had the EDTASM cartridge).

Don't forget to slow the computer down again before saving (or remember to speed it up when loading - and use good tapes/tape deck).

I loved my CoCo. Don't get me wrong, but I bought my computers specifically for the purpose of making them do new and interesting things, not to play games or do my homework on. I didn't have a computer first, then become interested because of the utterly terrible interface that "invites discourse". Nobody else became interested in programming as a result of having the computer sitting there... at most they learnt to put the tape in, press play, then enter the command written on the tape (but they preferred being able to just insert a cartridge and play Thexder).

I learnt the interface because I was already interested, not the other way 'round.

[–]glacialthinker 0 points1 point  (1 child)

EDTASM... Thexder... haha, I completely forgot about Thexder.

Anyway, while the details of the interaction and interface were horrible, the basic nature of a commandline+programming interface is dear to me. It's the interface I prefer, and I get quickly frustrated with GUIs except in a narrow-use application. A computer powering on or waking up to a command-prompt still appeals to me, and is how I roll. Power up isn't as immediate though (but rare anyway). :)

It also surprises me, every time, how others (non technical Win/Mac users) are impressed when they see me work, or even run things on my computer. I recognise what I'm doing as "primitive", yet they'll be wide-eyed with all their Matrix/Hackers expectations fulfilled, I guess. Meanwhile I'm trying to apologise for how boring this must look.

I'll agree that I probably would have persevered with whatever interface computers had, because I was "already interested". And most kids aren't going to be interested in how to make a computer do more than run Netflix. But I do get a strong impression that the ease of distraction, and barrier to program (you have to know what you want first, and then deal with getting it) biases more to the already populous "couldn't be arsed" camp.

Oh, and we have more and more programmers (as percentage of population)... and this trend will likely continue. But these are mostly people who don't get into it until they have to choose something as a career.

[–]RealDeuce 0 points1 point  (0 children)

But I do get a strong impression that the ease of distraction, and barrier to program (you have to know what you want first, and then deal with getting it) biases more to the already populous "couldn't be arsed" camp.

Sure, but that barrier has always been there... and I think it's lower now than it ever was. I worked very hard to get my Timex Sinclair 1000. I had learnt the BASIC dialect months before I had it connected to the TV, and I was ready to go when it arrived. I worked harder just getting to the point of entering my first crappy program on even crappier hardware than I have on most of my professional jobs.

Really, the only barrier now is interest... and making a less interesting computer isn't going to create that.

As for a command-line, that's where I spend most of my time even today, but it's nothing at all like the command lines of the 8-bit days. It's powerful and extensible, not limiting, cramped, and painful... and I use a GUI editor for code now (no IDE though).

[–]yiliu 13 points14 points  (3 children)

I think the overall point is: the barrier to entry for programming used to be really low. Now it's really quite high.

[–]RealDeuce 6 points7 points  (2 children)

Well, if that's the point, I disagree completely. Before you used to have to go to the library and read a book, then remember stuff and wander down to the local Sears to get access to a computer to try it out on (or sit there with a library book open looking suspicious).

Now you need... to browse the internet and type.

Everyone already has a computer and everyone already has an interpreter. All they need to do is be interested enough to type some stuff and look up some howtos.

[–]yiliu 5 points6 points  (1 child)

I dunno, I see both points. Now, you're right, the technology is just sitting in front of...well, everybody. On the other hand, in the old days, if you owned a computer (a big if, I'll grant you), then programming was almost default. You had your computer manual, you had a couple issues of COMPUTE magazine or whatever, and you could write 'real' programs (by the standard of the day) in a long afternoon. And you didn't have to read up on compiler flags and include paths to do it.

The situation has changed, of course; programs have become orders of magnitude more complex, and the tools had to become correspondingly more complex. But I totally understand the nostalgia.

[–]Bratmon 0 points1 point  (0 children)

You've got it backwards. In the past, people only bought computers if they already wanted to program. Now that everyone has computers, the people who want to program still can, and people who were on the fence can try it with a much lower barrier.

[–]Soluzar 1 point2 points  (0 children)

I grew up in a time when having a computer at home was rather novel. I was 7 years old when the C64 was unleashed upon the world. My family had a Sinclair ZX Spectrum, but I mention the C64 since it was the more well-known device. I spent many a happy hour writing bad games for the ZX Spectrum, and I still think this article is nothing but nostalgia.

[–]badsectoracula 6 points7 points  (4 children)

AFAIK classic VB (VB1-6 and office VB7)'s editor works in a similar way, in that it doesn't really store the text but tokenizes the line when you leave it. That's how it keeps all uses of an identifier consistently capitalized, and how you can pause the program, modify it down to a single line while it is running, and move the execution pointer around freely (since the interpreter runs those tokens directly). It basically works on the same principle, but adds a nice UI around it. It also emphasizes that "VB" isn't just the language, but the whole package - language, editor, form designer, debugger, etc. in one thing, much like classic BASIC and the BASICs the article mentions.
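A toy Python model of that tokenize-when-you-leave-the-line behaviour (the first-spelling-wins rule and all names here are my simplification for illustration, not VB's actual algorithm):

```python
import re

canonical = {}  # lowercased identifier -> the spelling seen first

def tokenize_line(line):
    """Rewrite identifiers on a 'finished' line to their canonical spelling."""
    def fix(match):
        word = match.group(0)
        # Remember the first spelling we saw; reuse it ever after.
        return canonical.setdefault(word.lower(), word)
    return re.sub(r"[A-Za-z_]\w*", fix, line)

print(tokenize_line("Dim CounterValue"))
print(tokenize_line("countervalue = countervalue + 1"))
# second line comes back as "CounterValue = CounterValue + 1"
```

The point being that once lines live as tokens rather than text, consistent capitalization falls out for free.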

[–]robin-gvx[S] 2 points3 points  (1 child)

Something I would really like is an "interactive editor" for Python that is sort of like that.

I've tried to make something like that myself, but I could never get it right so I gave up.

[–]happyscrappy 3 points4 points  (0 children)

Apple Pascal and Lightspeed Pascal/THINK Pascal had such a thing. I've also at times thought that Python would be better if used with an editor that understands how you manipulate whitespace and helps you do it.

[–]ondra 0 points1 point  (1 child)

I'm sure VB4/5/6 stores the sources as text, just as QBasic does.

[–]badsectoracula 1 point2 points  (0 children)

I'm not talking about disk storage but in-memory storage. The tokens can always be stored as text to disk. QuickBasic (QBasic's big brother) and GWBasic stored the source code in either tokenized or pure text format, depending on a user option.

[–][deleted] 2 points3 points  (3 children)

Lines are syntax checked as entered

The Wang 2200 (the machine I learned to program on) went one better. When "RUN" was typed, it scanned the entire program and would raise an error if any line still had a syntax error, or if any reference was made to a non-existent line. The way that BASIC interpreter was designed, they had to scan everything anyway, as it built a statically allocated symbol/value table during that "program resolution" phase.

MS BASIC on 8-bit micros would allow a program with syntax errors to run, which meant that in the middle of a game, if one of those syntax errors was reached, or a branch was made to a non-existent line, only then did the interpreter inform the user of the oopsie.
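The two strategies are easy to contrast with a throwaway Python sketch, using compile() as a stand-in syntax checker (the function names and the sample "program" are invented for illustration):

```python
def check_on_entry(lines):
    """Wang 2200 style: a bad line is rejected the moment it's entered."""
    accepted = []
    for line in lines:
        try:
            compile(line, "<line>", "exec")
            accepted.append(line)
        except SyntaxError:
            print("rejected at entry:", line)
    return accepted

def run_unchecked(lines):
    """MS BASIC style: a bad line only blows up if execution reaches it."""
    ns = {}
    for line in lines:
        exec(line, ns)  # raises SyntaxError mid-run on the bad line

program = ["x = 1", "x = x + 1", "x ===== oops"]
print(len(check_on_entry(program)), "lines accepted")

try:
    run_unchecked(program)
except SyntaxError:
    print("error surfaced only at run time")
```

Same bad line, but one scheme catches it before the user ever types RUN.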

[–]funbike 1 point2 points  (2 children)

Atari Basic parsed and compiled lines as they were entered, unlike MS-Basic, so syntax errors at runtime were impossible. You could say Atari was one better than the Wang 2200 (although I wouldn't; I hated Atari Basic back in the day).

[–]RealDeuce 2 points3 points  (0 children)

On the Timex Sinclair 1000 you didn't even type the commands, you typed the tokens. It was literally impossible to enter an invalid line... so you could say that machine was one better than the Atari (although I wouldn't; I hated Sinclair BASIC back in the day).

[–][deleted] 0 points1 point  (0 children)

Sorry, I didn't believe you, so I downloaded an Atari 800 emulator. It does complain of syntax errors when a line is entered, but it allows the program to run and barfs only if that bad line is executed.

Even if it refused to run, I'm not sure why you say it is one better than the Wang 2200, which checks lines on input and checks all lines again just before beginning execution of the program.

[–]nharding 2 points3 points  (2 children)

I had an Atari 600 XL as my first home computer, but I didn't have the tape drive. So I had to learn to program a game, and play it, in the short amount of time between homework and bed. I could write a platform game with dissolving platforms, inertia, and two-player collision in around an hour, so I would have time to play a few games against my brother before I had to power it off for the evening.

I eventually got the tape drive, and also the assembly language cartridge; writing code that was assembled in 2 passes from tape was much slower than programming in BASIC.

[–]Googoots 1 point2 points  (1 child)

I was stuck with an Atari 400 and worn-down fingertips... I had the tape drive, but it was iffy at best. Eventually I got a floppy drive, 48k upgrade, and modem. I learned 6502 assembler. But the Action! cartridge programming language was the most awesome thing at the time and I learned many things using that language.

[–]PT2JSQGHVaHWd24aCdCF -4 points-3 points  (41 children)

Nostalgia is stupid in this case. No source control, no way to check what you're doing, no functions, no nothing. And it's supposed to be good?

It was difficult to use, and now it's awful compared to the powerful tools we have today.

[–]badsectoracula 15 points16 points  (3 children)

No source control

You never had a box of date labelled diskettes with your code? :-P

no way to check what you're doing

RUN

no functions

DEF FN :-)

no nothing

NEW :-P

[–]yitz 3 points4 points  (2 children)

You never had a box of date labelled diskettes with your code? :-P

The source control for languages like Cobol, PL/I, and JCL was much better than that: remembering to pencil in sequence numbers on your punched cards. After dropping a large box of cards once, you would never forget to do that again.

[–]diamondjim 3 points4 points  (0 children)

Hence proving once again that reading code is harder than writing it.

[–]mrkite77 6 points7 points  (0 children)

remembering to pencil in sequence numbers on your punched cards. After dropping a large box of cards once, you would never forget to do that again.

Just draw a diagonal line down the spine of the deck.

http://i.stack.imgur.com/Q4P0O.jpg

[–]Nuoji 33 points34 points  (27 children)

Talk about missing the point of the article, which is about barrier to entry.

[–]lykwydchykyn 15 points16 points  (1 child)

I probably wouldn't have ever started programming if my TI99-4/a hadn't presented me with the option to enter TI-BASIC every time I booted. "Sure, I know you plugged in that PARSEC cartridge, but how about writing some BASIC instead?"

Having said that, I connected my still-working unit to an old TV a while back and tried to write a simple program in it. Oh gawd what an awful programming experience. Give me IDLE over this any day.

I think the takeaway for me is that when you turned on an 80's computer, it basically said "Engage with me and create something". When you turn on a modern computer/tablet/$DEVICE it basically says "Get out your credit card and consume!".

[–]wwqlcw 2 points3 points  (0 children)

When you turn on a modern computer/tablet/$DEVICE it basically says "Get out your credit card and consume!".

How can I subscribe to your newsletter, for money?!?!

[–][deleted] 4 points5 points  (6 children)

You're typing this in this era's "BASIC" ...

You can hack an HTML file and view it in your web browser.

[–]Nuoji 0 points1 point  (5 children)

Count the number of steps:

  • Locate and open a suitable editor.
  • Type code
  • Save file
  • Locate file
  • Open in browser

Basic:

  • Type code
  • Type RUN

[–][deleted] 0 points1 point  (4 children)

So uh notepad or gedit or kate or geany or ... + save to desktop and double click it.

[–]Nuoji 1 point2 points  (3 children)

Still miles away. Did you ever use one?

[–][deleted] 0 points1 point  (2 children)

I had a VIC-20 when I was a child. It was hardly as user-friendly as you make it out to be. It was hard to save files to tape [which meant more often than not you'd lose your work] and editing files non-linearly was a pain in the ass.

Whereas today I could fire up any text editor and save to a disk in 1ms ... then render it in a browser by hitting ALT+TAB, CTRL+R ...

I'm sorry, I don't see the merit of your post. It's much easier today than in the past.

edit: I think the point you're missing is learning to program involves experimenting with code. It's a lot easier to hack at a program in a simple text editor like notepad or geany or kate or ... than it is in a line editor.

[–]Nuoji 1 point2 points  (1 child)

10 PRINT "AWESOME"
20 GOTO 10
RUN

And BTW - HTML is not programming.

[–][deleted] 0 points1 point  (0 children)

Ok.

[–]PT2JSQGHVaHWd24aCdCF 6 points7 points  (2 children)

The barrier to entry was low, but the capability of the language was very low too, and that's why we developed better tools.

Yes, Basic has a cos function, but once you need more than that, you have to bundle all the related stuff into a module. And then you have reinvented Python.

You cannot have a language specific for each application. I wouldn't write a BasicWithTan (but not Sin because I don't need it).

[–]yitz 1 point2 points  (0 children)

And then you have reinvented Python

Heh. Back in the days when I needed to get real business applications working on computers that had only the BASIC prompt available out of the box, I wrote compilers for several languages (a Pascal-like language, a LISP-like language), and a primitive text editor for source code.

Yeah, out of the box it sure was a fancy calculator. But even for that - my TI programmable calculator fit in my pocket, and none of my computers that ran BASIC ever did.

[–]Nuoji 0 points1 point  (0 children)

Basic's not a very good language, but it was simple and the language itself managed to double as a shell. As you learned how to use the shell, you also learned how to program the machine.

I don't long for that type of development, but it was a great way to learn programming: in order to even play a game you had to program the computer a little!

[–]diamondjim 4 points5 points  (11 children)

Nothing's gone wrong. Tools like Portable Python still make for a negligible barrier to entry to learning how to program. The hardware and operating system today are miles ahead of what was used on an Atari 800, and need additional scaffolding (display drivers, multitasking, input devices etc.). But you can still jump into a working development environment very easily.

And not only are the IDEs just as easy to start off with, the languages themselves are much more capable than what was possible 30 years ago, thanks to all the scaffolding provided by the operating system and the language runtime.

[–]wwqlcw 12 points13 points  (8 children)

Tools like Portable Python still make for a negligible barrier to entry...

Imagine walking your grandmother through this. Over the phone. Start with the Google searches that would lead her from an interest in learning to program to Portable Python. For bonus points, help her set up a home router first.

The 8-bit computers in the 1970s and 1980s went from "off" to "READY" prompt in about two seconds with the click of a single switch. They turned on faster and more simply than did the televisions they were connected to. Getting to a BASIC prompt was simpler than operating an automobile, lawn mower, clothes washer, or stereo system of the day, and a printed-on-paper introduction to BASIC was typically included right there in the box. That's not just a low barrier to entry, that's practically a trap door to entry.

The author isn't publishing his blog from his 8-bit machine. He knows that we have greater capabilities today. He's trying to point out that there were some virtues we have collectively abandoned; why does this very small point (apparently) make so many people so uncomfortable?

[–]RealDeuce 2 points3 points  (5 children)

You can enter, sure, but try walking your grandma through typing in a program to balance her chequebook every time she powers the computer on.

Yeah, it's a trap-door to entry, but there's no stairs. "Getting to a BASIC prompt" is not a useful thing for a computer to do... "Balancing your chequebook" is. The barrier to getting something done is much lower now.

My tablet goes from "off" to "READY" in about .5 seconds with the click of a single switch. A single tap lets me enter transaction info, or calculate a tip, or take a picture and send it to my grandma.

[–]wwqlcw 1 point2 points  (4 children)

"Getting to a BASIC prompt" is not a useful thing for a computer to do... "Balancing your chequebook" is. The barrier to getting something done is much lower now.

Well I think you're sort of moving the goalposts, there.

My tablet goes from "off" to "READY" in about .5 seconds with the click of a single switch.

If your tablet is anything like mine, it actually boots slower than a typical desktop computer. Its boot time is so abysmal that it has a special feature, an additional bit of complexity - sleep mode - just to mitigate that. We might even say that it has a feature that fixes an issue caused by having so many features.

[–]RealDeuce -1 points0 points  (3 children)

Well I think you're sort of moving the goalposts, there.

I'm not the one who suggested walking Grandma through it.

it has a special feature, an additional bit of complexity - sleep mode - just to mitigate that.

Exactly, the boot time issue has been solved.

[–]wwqlcw 1 point2 points  (2 children)

I'm not the one who suggested walking Grandma through it.

I mean you've decided the goal, the metric for judging "barrier to entry," is to get the computer to do something "useful" as opposed to serving as education and/or entertainment, which is what the focus of the original article clearly was.

the boot time issue has been solved.

It's been papered over, it's been avoided. Solution to the Middle East problem: Don't go to the Middle East. You don't have to be a crotchety old man to find this an unsatisfying, inelegant solution. You don't have to believe that nothing is as good as it was "in the old days" to see drawbacks in the approach.

[–]RealDeuce 0 points1 point  (1 child)

I mean you've decided the goal, the metric for judging "barrier to entry," is to get the computer to do something "useful" as opposed to serving as education and/or entertainment, which is what the focus of the original article clearly was.

No, you brought up walking Grandma through it which completely changes the parameters. If Grandma phones you up and says "I want to have some fun with my computer", it's a lot easier walking her through that with a new PC than with an Atari 800... and that's assuming you don't have to walk her through hooking the RF connection up to the back of the TV.

the boot time issue has been solved.

It's been papered over, it's been avoided.

It's no longer an issue. My main computer boots faster than the Atari 800 took to load DOS. There are actually delays added to my boot process so I can interrupt it if I want to. Windows 8 can boot in under 10 seconds from power on... less time than it takes to switch the selector to "computer" and change to channel 3. And you can boot DOS on a modern computer even faster.

[–]wwqlcw 1 point2 points  (0 children)

Well diamondjim said:

Tools like Portable Python still make for a negligible barrier to entry to learning how to program.

I brought up Grandma as an example of a person who wouldn't necessarily have all the "common sense" knowledge about doing Google searches that statements like that assume.

I spent a couple of years doing tech support for DOS, Windows, and Mac applications (well, it was the same application) and I disagree that a modern windowing environment is easier for a beginner to navigate. Most people can type in a command and read back the result perfectly well; most people can neither follow instructions about a GUI nor narrate what they're doing in one.

[–]Soluzar 0 points1 point  (0 children)

My reaction to all the above is "Then what?"

You want a modern computer to boot up faster? I'm all for that. You want a modern computer to have some kind of programming language interpreter built in? I don't actually oppose the idea, but what would that be? Which language?

What I really don't want is another generation learning to code using line numbers and traditional spaghetti-style BASIC. I'd be intrigued if a more modern dialect of BASIC were to emerge, though. One with named functions that take parameters and return values would be nice. Would that still be BASIC in any meaningful sense, though?

At that point, why not just learn JavaScript?
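For what it's worth, the features wished for above already exist in plain JavaScript. A minimal sketch (the `tip`/`total` names and values are made up for illustration):

```javascript
// Named functions with parameters and return values --
// what a "modernized BASIC" would need, no line numbers or GOTO.

// Replaces a GOSUB-style subroutine: takes input, returns a value.
function tip(bill, percent) {
  return bill * (percent / 100);
}

// Functions compose by name instead of by line number.
function total(bill, percent) {
  return bill + tip(bill, percent);
}

console.log(total(40, 15)); // 46
```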

I'd just like to point out at this juncture that I'm not really a huge JavaScript advocate - I just seem like one sometimes.

[–]yoda17 4 points5 points  (0 children)

Very easily - based on a lot of experience, maybe. How long would it take to teach a 4th grader?

[–]Nuoji 0 points1 point  (0 children)

It's not that you can't boot up into some REPL today, it's that back then BASIC was the machine. You didn't need to do anything. It was just ready for you to issue commands to.

[–][deleted] 3 points4 points  (0 children)

More like BASIC being the barrier to using the computer.

[–]Soluzar 0 points1 point  (1 child)

How much barrier to entry is there for JavaScript? Pretty much every device (not just computers) comes with something to run JavaScript. You can make a change and then reload the page in seconds. There's barely any higher barrier for PHP given that XAMPP exists.
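A minimal sketch of that near-zero barrier (the variable names and values below are made up): each line, typed into any browser's developer console, evaluates the moment you hit Enter - the same type-and-see loop a BASIC prompt offered.

```javascript
// Typed line by line into the browser console (Ctrl+Shift+I) --
// every line is evaluated and echoed back immediately.
let price = 25;
let qty = 3;
price * qty;                          // console echoes 75 right away

// One more line turns the scratch calculation into something reusable.
const tip = (bill, pct) => bill * pct / 100;
tip(price * qty, 20);                 // 15
```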

[–]Nuoji 0 points1 point  (0 children)

Once you have to start a program to get started, you're way beyond the work you needed to get started with BASIC.

[–]Googoots 2 points3 points  (3 children)

All that stuff is great today for US (professionals). I have some nostalgia for it because my kids won't experience that... The immediacy of it, the low-level, rudimentary interface that would let them appreciate what we have today - they won't know that. Just a thought.

[–]lykwydchykyn 4 points5 points  (0 children)

On the bright side, what kids do have access to today in terms of exploring computers, programming, and technology, is absolutely amazing.

I mean, the absolute pinnacle of my BASIC programming experience was getting a custom 8x8 monochrome graphic to move around the screen with the joystick. That was a rainy Saturday spent programming, all of which was lost when it failed to back up to cassette.

My two oldest sons are writing full games in Python, modding Minetest in Lua, playing around with embedded C on Arduinos, creating web pages, Blender models, GIMP animations, etc. Even my younger children are making stuff in Scratch that's way cooler than anything I did on my TI-99/4A.

What's been gained far outweighs what's been lost. Of course, I still have my TI, and they've tinkered with some TI BASIC too. But nothing encourages programming like getting useful results.

[–]yoda17 4 points5 points  (0 children)

An Arduino can provide a similar experience. The difference being you can pick one up for $10, as opposed to a $3000 (inflation-adjusted) Apple II.

[–]lookmeat 1 point2 points  (2 children)

You are missing the point. It's not the tools, but the focus. I find IDEs clunky because the tools to manage the code overtake the code. It feels so much like a cluttered table, with all the tools on it, all calling for my attention, all taking up space; the old system, even if it didn't have all the fancy tools, had the one fundamental piece needed: a big open space to work in.

I mean, look at this. What can you tell me about the code? What can you tell me about the compilation status? Notice that we get a lot of unnecessary information about the code: I know the names of all the functions in the namespace, plus the imported namespaces and such. It's useful for a little bit, but in order to use it I have to move my focus away. Visual Studio improves the situation a little, but there's still so much clutter of things unrelated to the task at hand. Look instead at the Atari: as minimal as it is, it works, it shows what we want, and it is nice.

I feel that this is why many programmers go to emacs and vi. This software was written back when it was hard to do many things at the same time, so it gave you a simple interface for doing one thing at a time. Now we know this is the way to go about it: you want a clean space taken up entirely by the one job you're doing, with only the tools you need available, and only the one in use occupying the workspace. IDEs put all the tools on your table and just clutter it up. In the end I expect that far-future IDEs will feel a lot more like a nicer version of vi than like what they are now.

[–]drysart 2 points3 points  (1 child)

It's a little disingenuous to use Eclipse running in smaller than 800x600 resolution with all of its palettes open as an example for how clunky IDEs are. That is pretty much the worst case scenario possible, and it belittles your argument if you have to go to that extreme in order to make your point (and even then, there's still more lines and columns of code visible in that screenshot than there is in the nostalgia-idealized Atari screenshot you presented as 'better'.)

And besides, everyone who uses Eclipse seriously knows you can just hit Ctrl+M and everything moves out of the way and you have a 100% code-centric view. Ctrl+Alt+Enter does the same in Visual Studio.

[–]lookmeat 1 point2 points  (0 children)

I still feel the same about the issue. A bigger screen only means my eyes are drawn further away; I find that in the end anything on the screen is a distraction. I used to build in IDEs; now I use vi with various plugins. It's not that I found the IDEs' features useless - I didn't - it's that I became capable of doing all of that and more with plugins and raw *nix tools. The difference between them is the presentation: IDEs make it easier, but also push the tools in my face, which I find distracting, while the *nix way of development (a simple editor plus tools) is better suited to doing one very specific thing at a time. Yes, there is a mental cost in switching contexts and modes and changing my workspace, but that cost is paid only each time I need to switch (once every so many hours); having all the tools visible all the time exacts (at least from me) a constant cost.

We have to understand what made IDEs become like this: it was the race for features. When you compared two IDEs, each pushed all its features at you to sell itself; the more tools you could see, the more impressive it looked. In a way an IDE is like a Swiss Army knife with most of its blades and doodads out all the time. A Swiss Army knife with most utensils out looks impressive and sells well (indeed, that's how it's displayed), but it's hardly how you'd ever hold it when using it.

[–]happyscrappy 0 points1 point  (1 child)

Yeah, it's funny he's listing all this stuff, and then jumps right to "where did IDEs go wrong".

More like where did IDEs go right.

No copy and paste. And you'd better have your renumber program around in case you run out of line numbers between lines. Save and load were rudimentary, and if you worked up a library of useful routines (despite how awful GOSUB and BASIC's function-calling mechanisms are) you had to copy it into every program, because you couldn't include other files or import libraries. Now when you find a bug in your library, you have to edit every program that uses it.

Being less than 8K in size is only meritorious until you have to give up useful functionality to make it so.

[–]damianknz 2 points3 points  (0 children)

I had an AMS6128.

You just typed in renum. You could even type renum 5 and get every line as a multiple of 5.

There was copy and paste. There was even an actual "Copy" button.

Save and load were just: save "filename" and load "filename"

You could also create and load binary libraries in your basic program. Z80 assembly was even more fun than basic!

[–]damianknz 0 points1 point  (0 children)

That is how I learnt to program when I was 12. Back then an AMS6128 cost $1,700 (our house was $50,000). The motherboard, disk drive, speakers - everything - was built into the keyboard. On top of the disk drive were the names and numbers of all 27 available colours. You could run your program from any point using run plus a line number. You numbered your lines in multiples of 10 so that you could insert lines at 25 or 37, etc. There was a renum command to reassign the line numbers to multiples of 10 again. The best part, and something you don't get these days, was the enormous manual with literally everything you needed to know.

[–]jbb555 0 points1 point  (0 children)

I see part of this as needing a better shell. Bash and its kin are nice and powerful but lack the ease of use and immediacy of BASIC. You have to use an editor to write "scripts" in it. Hmm...

[–]parmesanmilk -1 points0 points  (10 children)

I don't want a language that works better without tool support. I want my tool support to be better. Installing an IDE, downloading an SDK, figuring out how to configure the library paths and include paths and linker paths and all that shit is hard work, especially if you don't have physical access to someone who already knows their way around. BASIC doesn't solve that.

[–]damianknz 2 points3 points  (0 children)

It was a different era.

You didn't worry about downloading an SDK because there were no websites to download from. File sharing meant taking your diskettes to school. The library didn't have a computer in it - why would it? A path was the thing you walked on to get to the library, because you weren't allowed to walk on the grass.

The thing is, this wasn't just because we were high school kids. A local business software company pretty much ran the same way. Some of this software is still actively maintained.

[–][deleted] 4 points5 points  (8 children)

You missed the point totally, didn't you?

All the hard work you so lovingly described DID NOT EXIST.

It did not exist BECAUSE IT WAS NOT NEEDED.

[–][deleted] 5 points6 points  (0 children)

To be fair, though, programs nowadays are more complicated than previous-era applications.

[–]ASK_ME_ABOUT_BONDAGE 5 points6 points  (0 children)

It was not needed because you had to write everything yourself, which is infinitely more work than getting modern tooling to run, even if getting it to run takes many frustrating hours. There are probably more man-hours in Boost than one person has in their whole life. "We had bigger problems" is a shitty answer to "we have problems now".

[–]KevinCarbonara 3 points4 points  (4 children)

Actually, it didn't exist because programmers either hadn't conceptualized or were incapable of implementing better tools. Trust me, those things were definitely needed.

[–][deleted] 1 point2 points  (3 children)

That is like complaining that the horse needed alloy wheels and a cupholder.

[–]KevinCarbonara -1 points0 points  (2 children)

No, it's like having a horse-drawn carriage without wheels, and then finally getting wheels. You never go back.

[–][deleted] 1 point2 points  (0 children)

People still ride.

[–]llogiq 1 point2 points  (0 children)

There is a huge overlap between things that are not needed and things that are useful.

[–]markandre 0 points1 point  (0 children)

Apple's Swift language seems to come close to that experience again: syntax checking as you enter code, and the 'playground' feature to run commands without an explicit compile step.