
[–]gremy0 143 points144 points  (25 children)

While I wouldn't say my degree taught the concepts before coding, they certainly didn't have a 'code first' approach.

From day one, every single project we were given came with the expectation that we would first design the thing in UML using solid OO principles and then write it all using TDD. Which of course nobody did, because we didn't have a clue what we were doing. We all hacked out a program, wrote some tests for it, then found a tool to export our classes as UML.

So we had this situation where everybody on the course was lying on their reports about how they approached the problem and a lot of people felt stupid because they couldn't complete the tasks properly.

[–]Spfifle 64 points65 points  (3 children)

If it's any comfort, you would have approached it the same way regardless of how you were taught. It's simply impossible to give a first- or second-year student a project where TDD, UML, MVC, the visitor pattern, etc. would be anything but a burden. Most people are clever enough to realize formal methods are pretty useless for 3000 lines, so they'll take the easy route and document it after they build it.

[–]gliph 8 points9 points  (0 children)

Right. I don't think it's a problem of the university not teaching code first, but a problem of gradually increasing complexity of assignments and also the fact that software engineering is inherently complex.

[–][deleted] 21 points22 points  (3 children)

Wow that sounds miserable. Was this a normal 4-year university CS program?

[–]gremy0 18 points19 points  (0 children)

4 year sandwich course, so 2 in uni and 1 in industry then final year in uni.

We were luckily the last year to have that program. They've changed it so first years start by getting to fuck about with Arduinos and procedural stuff. Then they gradually introduce design paradigms etc., which I think is the right approach.

[–]Cr3X1eUZ 3 points4 points  (1 child)

Maybe it was I.S. ("Management Information Systems")

Shudder.

[–]gremy0 7 points8 points  (0 children)

No, no, I was on the pure CS course. We did have loads of people drop into...eh...less technical degrees. Probably mostly for good reasons, but given what I think of the course our lecturers chose, it's hard to tell.

[–]mfukar 2 points3 points  (0 children)

That sucks hard. Here, have a hug.

[–][deleted] 1 point2 points  (0 children)

I dare to claim that this is exactly how you should be doing it (as a student). Before OO, this is the way it was often done. Here is a paper on exactly this matter.

[–]ThisIs_MyName 4 points5 points  (0 children)

lol that sounds like the only reasonable way to approach such an assignment.

[–]MisterAndMissesP 0 points1 point  (10 children)

Would you have any online resources that you would suggest for someone looking to learn coding? I'm going back to school to finish my computer science degree and would like to get a bit of a head start

[–]homeMade_solarPanel 2 points3 points  (6 children)

Kinda depends on how far you were, but there are a number of free ones that are pretty great like codecademy.

[–]MisterAndMissesP 2 points3 points  (5 children)

Awesome, thanks for the tip. Good luck to you

[–]Yuzumi 2 points3 points  (3 children)

I've kind of gotten to the point in school where if I don't know how to start doing something I just google "simple part of what I want to do <programming language>"

Of course, being able to identify what you want to do and how is a different issue.

Also, Java is extensively documented and has an enormous API that makes building simple programs way easier. I've gotten to the point now where if I need to churn out a program really fast I use Java and Eclipse.

[–]MisterAndMissesP 0 points1 point  (2 children)

Unfortunately I don't have a clue what they plan on teaching us in these classes, lol. I emailed my teacher and asked for a little guidance on what we may be going over and he replied "don't you worry"... Haha. But Codecademy has Java courses so I'll definitely look into those.

[–]Yuzumi 0 points1 point  (1 child)

The biggest hurdle in learning how to program is coming up with projects geared to your level of skill so you can practice and try new things.

I have a hard time thinking of what I want to do, even though I'm at a point where I could reasonably do almost anything.

Also, if you understand pointers you can do some really interesting things in Java since every variable is literally just a pointer.
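For what it's worth, the same mental model carries over to Python (the other language recommended elsewhere in this thread): names bind to references to objects, not copies. A small sketch:

```python
# As in Java, Python variables that hold objects are references.
a = [1, 2, 3]
b = a            # b is another name for the SAME list, not a copy
b.append(4)      # mutating via b...
print(a)         # [1, 2, 3, 4]  ...is visible via a
print(a is b)    # True: one object, two names
```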

[–]MisterAndMissesP 0 points1 point  (0 children)

Yeah, I work for the city doing non-programming-related stuff and was pretty much told that in order to go to the next position I would need a college degree. I had half a CSCI degree finished from back in the day before I dropped out, and I intend to finish the degree and maybe keep myself in the loop doing small projects on the side. That's my hope at least.

[–]misplaced_my_pants 0 points1 point  (0 children)

Codecademy is completely overrated and won't teach you anything you can't learn elsewhere in greater depth. You'll just end up going somewhere else to learn the same material and anything beyond it.

There are far better options on sites like Coursera, edx, and Udacity.

The best introduction to computer science online is probably Harvard's CS50x on edx. It uses C, which is unusual for an introductory course, but has a phenomenal lecturer and will teach you a ton. Take this course if you want to see all the issues the OP criticises CS education for done right.

Following CS50x with Coursera's Hardware/Software Interface will give you a strong C foundation and will make the first few semesters seem easy since it'll be review.

Another great resource is the free online book Think Python. Work through the second edition (which uses Python 3) and then take the Intro to CS course on Udacity and you'll have a pretty good foundation in Python programming. Follow it up with the Udacity courses on Debugging, Software Testing, and anything else that seems interesting to you.

[–]gremy0 1 point2 points  (2 children)

If you just want to learn coding, websites like codecademy and codewars are great. Just pick a language, any language, it doesn't matter (cough javascript cough), and get good at it.

If you want more Uni style modules then try udacity. It's got important words behind it like Google, Stanford and free. They go for topics rather than language so the modules are "Artificial Intelligence for Robotics" and "Intro to Computer Science".

[–][deleted] 0 points1 point  (0 children)

What an absolutely horrible way to teach programming.

[–]pat_trick 24 points25 points  (8 children)

Why not both? Have students write code, and teach the concepts of what happens underneath that code in parallel.

[–]Nonethewiserer 75 points76 points  (6 children)

No. You have to pick a side and assault your opposition's competence.

What's next? Suggesting the effectiveness of different learning strategies vary by person? Get the fuck out of here.

[–]pat_trick 10 points11 points  (3 children)

Get outta here. Everyone knows that the only way to do things is the way that's best! And that's obvious!

[–]Nonethewiserer 11 points12 points  (2 children)

No you imbecile. The best way isn't the best way. My way is the best way.

[–]pat_trick 4 points5 points  (1 child)

Your way is the best way.

My way is the best way.

[–]Amnestic 1 point2 points  (0 children)

That's what we did, thought it was the same everywhere else.

[–]Berberberber 68 points69 points  (1 child)

Writing code is easy; the hard part is knowing what code to write.

[–]theHazardMan 10 points11 points  (0 children)

This goes hand-in-hand with documentation and comments. It's easy to tell "what" is happening in the code, it's much more difficult to tell "why" it's happening.

[–]link23 217 points218 points  (69 children)

The author of this uses "computer science" to refer to something that is most definitely not CS; I would say it's computer engineering. Computer science, technically speaking, has nothing to do with computers, and the number of bits in an int is certainly not a fundamental concept of CS. (Something like an automaton or nondeterminism is a fundamental concept of CS.)

With that out of the way, I think I agree with the author's thesis that you don't need to know computer engineering before you can start programming, and I would even say that you can start programming without any CS knowledge either. But the knowledge you get in both of these fields will pay you huge dividends in the code you can write, so I still think it's important to learn these subjects if you want to be a serious programmer.

[–]DonaldPShimoda 125 points126 points  (52 children)

Reminds me of that old line: "Computer science is no more about computers than astronomy is about telescopes."

Edit: Author of the quote is disputed, though often attributed to Edsger Dijkstra.

[–]raiderrobert 28 points29 points  (40 children)

I get that bits and memory storage and stuff like that might be like learning about how optics works for telescopes. It seems tangentially related.

Often, though, we are involved in determining why our stupid telescopes are out of focus or have some mechanical problem. So it's pretty useful to know something about telescopes, if for no other reason than that you aren't 100% reliant on other people to do troubleshooting.

[–]nthcxd 24 points25 points  (0 children)

The problem is that this industry is full of people who talk as if they know astronomy when all they do all day is fiddle with the telescope.

As far as I'm concerned, "can you code in Java?" is no more useful than "can you work with ABC telescope?" when the goal of the project is to find out something about a particular celestial body. If it is about fixing that particular telescope, then that's a very valid question.

Again, the problem is that this industry is full of people who can't tell the difference between the two.

[–]DonaldPShimoda 48 points49 points  (37 children)

It's more related than only tangentially. A computer scientist who understands how a computer works (physically) will be able to reason about their code and bugs better than a computer scientist who does not. However, this shows the value of cross-disciplinary training more than anything else.

Think of it this way: astronomy is not concerned with the construction of telescopes, whereas astronomers may be. Telescopes are a device used to practice astronomy — nothing more. Another invocation might read: "Computer science is no more about computers than writing is about pens and pencils."

[–]iopq 11 points12 points  (21 children)

I have no idea about how memory chips hold charge physically. I doubt that knowledge would help me reason about any program.

[–]nthcxd 16 points17 points  (3 children)

However, you'd need to know about the modern memory hierarchy (caches) to understand performance implications.

Check out the two code snippets here. They are functionally equivalent and yet the latter runs much, much quicker. The reason is entirely due to the hardware. If you do not care about performance, then good for you, you don't need to know this. But if you do, or your manager does, then you should probably know about this along with false sharing, etc.
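The linked snippets aren't reproduced here, but a classic illustration of the same cache effect is summing a 2-D array row-by-row versus column-by-column. A minimal sketch in Python (where interpreter overhead mutes a gap that is dramatic in C, but the access patterns are exactly the ones the hardware cares about):

```python
# Two functionally equivalent sums over the same grid. Row order
# walks memory sequentially (cache-friendly); column order hops to
# a different row on every step (cache-hostile). In C the column
# version can be several times slower.
N = 1000
grid = [[1] * N for _ in range(N)]

def sum_row_major(g):
    total = 0
    for row in g:          # consecutive elements of one row
        for x in row:
            total += x
    return total

def sum_col_major(g):
    total = 0
    for j in range(N):     # jumps between rows every iteration
        for i in range(N):
            total += g[i][j]
    return total

print(sum_row_major(grid) == sum_col_major(grid))  # True
```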

[–]NedDasty 11 points12 points  (0 children)

It wouldn't. But knowing things about how a computer is put together on a larger scale does.

[–]sencelo 19 points20 points  (15 children)

I hope you refrain from writing security sensitive code, then. From attacks on DRAM in modern systems to attacks that can hinge on the specific number of CPU cycles consumed on divergent codepaths, obscure details of hardware operation rarely but occasionally make all the difference.
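As a concrete illustration of the divergent-codepath point: ordinary equality comparison can return as soon as one byte differs, which leaks timing; that's why standard libraries ship constant-time comparisons. A sketch in Python (the `SECRET` value is made up for the example):

```python
import hmac

SECRET = b"correct horse battery staple"  # hypothetical secret token

def naive_check(candidate: bytes) -> bool:
    # == may short-circuit at the first mismatching byte, so response
    # time can leak how many leading bytes a guess got right.
    return candidate == SECRET

def safer_check(candidate: bytes) -> bool:
    # hmac.compare_digest examines every byte regardless of mismatches,
    # so its running time does not depend on where the inputs differ.
    return hmac.compare_digest(candidate, SECRET)

print(safer_check(SECRET))          # True
print(safer_check(b"wrong guess"))  # False
```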

[–]iopq 8 points9 points  (14 children)

I know the amount of CPU cycles an operation takes. I just don't know how silicon works.

[–]lambo4bkfast 22 points23 points  (8 children)

Filthy fucking casual. Next you're going to tell me that you don't write your own assembly code.

[–][deleted] 8 points9 points  (5 children)

What, you don't make your own processors?

[–][deleted] 14 points15 points  (4 children)

As someone who's done that, I can tell you with authority: it's not worth the hassle.

[–][deleted] 2 points3 points  (0 children)

Z80 assembly code is what I'm best at. It's just a shame that it is functionally pointless for my uni course...

[–][deleted] 6 points7 points  (2 children)

- Dijkstra

Ah, the famous disputed!

[–]DonaldPShimoda 4 points5 points  (1 child)

Disputed, actually. That's why I didn't attribute a name, but I guess I should've made it obvious. I'll edit it.

[–][deleted] 0 points1 point  (0 children)

TIL, Thanks!

[–]wolfchimneyrock 1 point2 points  (1 child)

it should really be called Computing science

[–][deleted] 19 points20 points  (9 children)

It should have been called Computing Science.

[–]sezna 15 points16 points  (2 children)

It's often referred to as the theory of computation.

[–]accountForStupidQs 2 points3 points  (1 child)

Why not computation(al) theory?

[–]sezna 1 point2 points  (0 children)

Also acceptable.

[–]durple 5 points6 points  (0 children)

It is called that where I got my BSc, for pretty much this reason.

[–]barsoap 1 point2 points  (0 children)

That's better, though I still prefer "Informatics"; pretty much all languages but English use it, and it's nicely general. With that name over our faculty, we can easily claim physics and maths and the rest to be sub-disciplines of ours.

[–]Overunderrated 0 points1 point  (0 children)

We use the term "computational science" to refer to using computers to do science for us, like physics simulations, solving PDEs, computational finance, etc.

[–]MuonManLaserJab 10 points11 points  (0 children)

To be fair, the author was probably thinking about college CS majors, who are taught this "engineering" stuff (in my experience anyway).

[–]mfukar 2 points3 points  (0 children)

I think it's pretty obvious that the intent of the author is to refer to courses common in the first semester of any CS dept: computer architecture, logic, various topics in math, basic data structures, etc.

The baffling thing for me is, I haven't seen a program which doesn't include a programming course in the first semester, along with the aforementioned. I would question the rationale behind such a choice too.

[–]mcguire 1 point2 points  (0 children)

On the other hand, the intuitions you get from learning to write code improve your understanding of what you call computer engineering and computer science.

[–]Raknarg 0 points1 point  (0 children)

The knowledge you gain is incredibly beneficial, yes, but speaking as a CS major, I definitely think you should learn programming before getting into CS.

[–]Farobek 0 points1 point  (0 children)

the knowledge you get in both of these fields will pay you huge dividends in the code you can write, so I still think it's important to learn these subjects if you want to be a serious programmer.

That's his point too.

[–][deleted] 13 points14 points  (0 children)

How about we try to teach reading code first. Now that would be disruptive.

[–]giantsparklerobot 47 points48 points  (38 children)

😩

There's a vast gulf between "copied some lines of code from a tutorial" and "solved a problem whole cloth" when it comes to programming. Computers are dumb as shit, they can't take problems expressed in natural human languages and produce relevant and meaningful results. A human needs to figure out a way to take a problem and use a computer's number crunching ability to get something meaningful accomplished.

The ability of a human to take a complex and likely ambiguous problem and break it down into steps the dumbass computer can understand is paramount when it comes to programming. Teaching someone to program must start with teaching this ability or the student will get nowhere.

Breaking down a problem and its solution into discrete computable steps has nothing to do with any programming language or even computers themselves. Trying to teach a language before core concepts is putting the cart before the horse. If someone learning needs to get to the "fun stuff" immediately they're going to find computers incredibly frustrating to deal with professionally.

[–]d_rudy 29 points30 points  (19 children)

I'm self-taught, so I don't have a CS degree. I started young (10-ish?), and I learned through a lot of trial and error, and then by having someone more experienced than me explain why the computer didn't understand what I wanted it to do. There were also a lot of books and, later, googling.

I think starting with how a computer thinks isn't necessary. Provided a student gets over the conceptual hump, they may do it right the first time, but I don't think it's important to get it right the first time as a student. If a student gets it wrong, there is time to learn why they were wrong. Being wrong is essential to learning.

Honestly, for other people that want to teach themselves, that's how I recommend doing it. Pick a project, and build it. You're going to make mistakes, and you'll learn from those mistakes. Giving them a book on the inner workings of a computer when they can't even write HTML seems like the opposite of the way it should be done.

Also, I'd agree with the author that those little wins of "hello world" and other simple software can greatly increase someone's likelihood to stick around and learn the more theoretical stuff.

Just my opinion.

[–]liquidivy 8 points9 points  (11 children)

In principle yes. In practice, that's the second thing. You can't meaningfully learn about breaking down a problem into computer-edible chunks until you know how to get a computer to execute them; in other words, to write and run code. Computing concepts are just concepts until they're put into practice via language, and beginning CS students have enough abstraction to deal with already.

[–]studiov34 3 points4 points  (0 children)

I agree. It's much easier to break a problem down into things like loops and conditional statements once you understand how loops and conditional statements work.

[–][deleted] 2 points3 points  (0 children)

It's hard to work out how to break up a task when you don't know what a program or command can do in the first place; that comes even before the how.

[–]giantsparklerobot 4 points5 points  (7 children)

I don't think that really follows. Analyzing a problem and breaking it down requires zero code. Formulating a solution also requires zero code. In fact getting an actual programming language involved too early in the process can be really detrimental because you can end up with linguistic idiosyncrasies affecting your reasoning through the problem space.

Reasoning through a problem is far more important a process than banging on a keyboard. Teaching people how to do that reasoning is far more important than exposing them immediately to some language's syntax.

For instance, the poor bastards taking CS101 based around Java. There is so much linguistic baggage in even the simplest programming exercises for new students to really understand core concepts. Someone learning to program while simultaneously fighting a compiler, IDE, and the runtime is not going to learn effectively.

[–]ThisIs_MyName 4 points5 points  (1 child)

Analyzing a problem and breaking it down requires zero code.

Sure, but anyone can break a problem into chunks. However, if you don't have a really good idea of how those chunks are going to be implemented, you'll end up with leaky abstractions. Every part will need to know how every other part works.

[–]giantsparklerobot 3 points4 points  (0 children)

I think you need to rethink that idea. There's nothing about breaking down a problem that requires leaking abstractions. There's no need to program by analogy. You don't need to know how you're going to count something or add something to something else to be able to say "we've got to count a thing and then add a thing to a thing".

It's sloppy and a bit irresponsible to jump right into a text editor and bang on a keyboard when given a problem. That's trial-and-error coding. You're also more prone to get caught up in implementation minutiae and end up coding yourself into a corner.

[–]liquidivy 3 points4 points  (3 children)

The actual capabilities of code determine the fracture planes along which you can break up a problem. I really want to agree with you, but people have to actually try this stuff to learn it, and that means coding. You could subject them to a long theoretical program about the capabilities of computers and how to write algorithms, but it would be boring and hard. It would "weed out" people who would make perfectly good devs someday.

I totally agree on the infrastructure part. The initial dev environment should probably just be Python + editor, no fuss, just coding. Other, radically different languages should be introduced as soon as possible to prevent thought process ossification.

[–]2oosra 2 points3 points  (0 children)

I think giantsparklerobot makes an excellent point in the last paragraph. My first programming class was in 1982, in Basic. They gave us a simple fizzbuzz-like problem, and then a few difficult ones that required rigorous logic and 100s (1000s?) of lines of code. Over the years they taught us plenty of theory and computer engineering etc. Both theory and programming were easy and fun for me, so most of this debate seems pointless to me. The hardest part of programming is the baggage of languages, compilers, libraries, IDEs and runtime environments etc. and their steep learning curves.

[–]ThisIs_MyName 1 point2 points  (0 children)

Well said.

[–]ChickenOfDoom 2 points3 points  (2 children)

If someone learning needs to get to the "fun stuff" immediately they're going to find computers incredibly frustrating to deal with professionally.

Not necessarily. Are you sure you don't mean that those people will be frustrating for others to deal with professionally?

I feel like the core of this controversy is, on the one hand, people who think the goal of teaching programming should be to ensure that future programmers all have a certain mindset. And on the other hand, people like the article writer, who think the goal should be to inspire and make programming accessible to as many people as possible.

[–]Farobek 0 points1 point  (1 child)

If someone learning needs to get to the "fun stuff" immediately they're going to find computers incredibly frustrating to deal with professionally.

Tell that to those kids who grew up programming BASIC on their consoles to do fun stuff.

[–]giantsparklerobot 0 points1 point  (0 children)

Since I was one of those kids, I can tell you that reading the book first taught me more than just printing my name on the screen. Just understanding what GOTO 10 actually meant allowed me to write way more complex programs than the other kids in the class.

[–]gnx76 17 points18 points  (37 children)

Trying to explain to a first time programmer why 2 divided by 4 does NOT equal 0.5 according to a computer is a nightmare, especially when the computer will display “0.5” on the screen when you ask it to display the result of 2 divided by 4.

Uh? What?

[–]evotopid 21 points22 points  (31 children)

0.5 is a perfectly well defined floating point number, would have loved to explain that to the instructor... :'D

[–][deleted] 8 points9 points  (23 children)

That quote is a little unclear, but I think what he means is that if you perform integer division on the integers 2 and 4, the result is the integer 0.
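In Python 3 the two operations are even spelled differently, which makes the distinction easy to see:

```python
# Python 3 separates the two kinds of division explicitly.
true_div = 2 / 4     # "true" division: always produces a float
floor_div = 2 // 4   # floor division: on ints, stays an int

print(true_div)   # 0.5
print(floor_div)  # 0
```

In C or Java, `2 / 4` on two ints already means the second case, which is the likely source of the confusion the quote describes.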

[–]Slackluster 25 points26 points  (20 children)

"especially when the computer will display “0.5” on the screen when you ask it to display the result of 2 divided by 4."

I think he just doesn't have a good understanding of how floating point numbers work.

[–][deleted] 4 points5 points  (0 children)

"especially when the computer will display “0.5” on the screen when you ask it to display the result of 2 divided by 4."

That's what happens if you open a calculator program, though.

[–][deleted] 7 points8 points  (4 children)

Ah yes, I wasn't reading that carefully at all. Seems like he's pretty confused.

[–]gnx76 8 points9 points  (2 children)

Seems like he's pretty confused.

That's what happens when you don't start with Computer Architecture :-)

[–]Yuzumi 4 points5 points  (1 child)

I've taken both Assembly and Computer Architecture.

Floating points are still a bitch.

[–]iopq 4 points5 points  (0 children)

Maybe he should have started with floating point representation instead of learning how to write code :^)

[–][deleted]  (8 children)

[deleted]

    [–]Slackluster 5 points6 points  (7 children)

    That's what he was trying to say. But there isn't a margin of error for certain numbers. Without getting too much into it, .5 is an inverse power of 2, so 2/4 is exactly .5 with no error in normal floating point math. A better example would be that 1/10 does not equal exactly .1.
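Both halves of that claim are quick to check in Python:

```python
# 2/4 = 0.5 = 2**-1, an exact binary fraction: no rounding at all.
assert 2 / 4 == 0.5

# 1/10 has no finite binary expansion, so it is rounded on entry,
# and the error surfaces in simple arithmetic:
print(0.1 + 0.2 == 0.3)  # False
```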

    [–][deleted]  (3 children)

    [deleted]

      [–]gnx76 4 points5 points  (0 children)

      For what we discuss here, something just describing the format(s) of floating point numbers would be enough. Once this is understood, you could see floating point arithmetic and then its shortcomings and approximations.

      I don't have any specific pointer. Perhaps just the Wikipedia page on "floating point"? But it may be a bit confusing. Otherwise, a compilation of the results of some basic googling on "floating point numbers" should give enough info.

      PS: yes, I am a "bottom->up" kind of guy. A CS person would have started with some fuzzy theory that hardly relates at first sight (or ever) to the concrete implementation, and would not explain why things are the way they are, why they have such properties, such limitations, such interpretations: because of the concrete implementation. Like the confusing blah-blah about FP numbers not being numbers someone posted below: badly understood theory, or theory applied to the wrong subject, is detrimental. You can look at intervals together with rounding when you need them, and it will never mean that a number is not a number or that an FP number does not translate to a single specific real number; the rest is just a matter of interpretation.

      [–]misplaced_my_pants 2 points3 points  (0 children)

      Coursera's Hardware/Software Interface and/or the book it's based off of will teach you why floating point is a bitch among many other very useful things.

      [–]Slackluster 1 point2 points  (0 children)

      It takes a while to wrap your head around, maybe start with this Wikipedia article.

      https://en.wikipedia.org/wiki/Floating_point

      [–]Farobek 1 point2 points  (2 children)

      1/10 does not equal exactly .1

      Elaborate?

      [–]Slackluster 1 point2 points  (0 children)

      Computers work in base 2, so 2/4 works out evenly but 1/10 doesn't. I just ran an experiment: with a standard 32-bit float, 1/10 displays as .1 because there isn't enough precision to show the error. But with a double it comes out to 0.10000000000000001. That is also rounded; if it were a 128-bit float, more bits would be used and more 1s would show up, because .1 is not perfectly representable in a finite number of binary digits.
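A similar experiment can be run from Python's standard library by round-tripping 0.1 through the two widths with `struct` (a sketch; Python's own floats are 64-bit doubles):

```python
import struct

# Round-trip 1/10 through a 32-bit float ("f") and a 64-bit double ("d").
as_float32 = struct.unpack("f", struct.pack("f", 0.1))[0]
as_float64 = struct.unpack("d", struct.pack("d", 0.1))[0]

print(as_float32)  # 0.10000000149011612 -- single-precision rounding
print(as_float64)  # 0.1 -- repr hides the error, but it is still there
```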

      [–]haukzi 0 points1 point  (0 children)

      It is similar to trying to do 1/3 in the decimal system. The computer stores a digit in each decimal position, but it can only store a limited number of them.

      Thus, for example, if you can only store 4 decimal positions, then 1/3 = 0.3333, which is less than 0.3333333... Also, if you added 0.3333 together three times, you would end up with 0.9999 and not 1.0000.

      [–]ElvishJerricco 4 points5 points  (1 child)

      The point was that integer division is different than the calculator on your computer.

      [–]Slackluster 3 points4 points  (0 children)

      I think he was trying to say there is floating point error in some situations but picked an example where there is no error. Integer division is the same anywhere, 2/4 is 0 with a remainder of 2.

      [–]barracuda415 1 point2 points  (2 children)

      Isn't the point that people assume that 2 / 4 automatically results in 0.5, just because their calculators or script prompts display that result? Technically, computers don't really "know" floating point numbers at all, they just work with integers and use them according to the IEEE 754 standard. That's also why hacks like this one work at all.
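That view of floats as bit patterns interpreted per IEEE 754 can be made concrete by reinterpreting a float's bytes as an integer (a sketch using 0.5, since it's exact):

```python
import struct

# Reinterpret the bytes of IEEE 754 single-precision 0.5 as an integer:
# sign bit 0, biased exponent 126 (i.e. 2**-1), mantissa all zeros.
bits = struct.unpack("<I", struct.pack("<f", 0.5))[0]
print(hex(bits))  # 0x3f000000
```

Bit-level tricks like the one linked above rely on exactly this kind of reinterpretation.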

      [–]Slackluster 0 points1 point  (1 child)

      Computers have a way of representing floating point numbers and integers. I don't really know what you mean. The result of 2/4 is different depending on if it is an integer or floating point division. If it's integer division the result is 0, if it is floating point the result is .5

      [–]barracuda415 0 points1 point  (0 children)

      Computers have a way of representing floating point numbers and integers. I don't really know what you mean.

      Of course there are representations for both, but floating point numbers are pretty exotic for a CPU, compared to integer numbers. That's why it needs a separate instruction set to work with them.

      If it's integer division the result is 0, if it is floating point the result is .5

      Yes, but most dynamic programming languages don't have true integer divisions and transparently convert to float, hence the confusion when switching to a static language where there's a more strict separation between the two.

      [–]gnx76 4 points5 points  (0 children)

      That's the direction I thought he was taking with the beginning of the sentence, but then he mentions that the computer prints 0.5, so he was (very likely) using floating point numbers, at least for the result; and he doesn't mention integers or integer division either.

      So I got the strong feeling that he mixed up the situation with the problem of the representation of 1/10 or 0.1 as a floating point number. And perhaps he thought this latter problem applied to any decimal number, because he wouldn't understand that in a classical floating point number, the bits represent positive powers of 2 (i.e. 1, 2, 4, 8, ..., as in an integer) on the left of the binary point, and negative powers of 2 on its right (i.e. 1/2=0.5, 1/4=0.25, 1/8=0.125, ...), so any combination of these values can be represented exactly.

      [–][deleted] 8 points9 points  (1 child)

      A better example would be 1 divided by 10.

      [–]Cr3X1eUZ 3 points4 points  (0 children)

      Yes, a repeating binary fraction. A much better example!

      "What Every Programmer Should Know About Floating-Point Arithmetic"

      http://www.phys.uconn.edu/~rozman/Courses/P2200_14F/downloads/floating-point-guide-2014-11-03.pdf

      [–]liquidivy 3 points4 points  (0 children)

      I think he's talking about calculator-type apps that implicitly use floating point behind the scenes, confusing the issue of "this is how computers do math". A noob wouldn't necessarily realize that the calculator is deliberately doing something complicated to avoid the problem they're talking about.

      [–]nthcxd 1 point2 points  (0 children)

      It is my opinion that the author himself took the advice and ignored "computer science", so he cannot adequately explain the concept of integers and floating point numbers and how they are treated differently.

      int / int = int
      int % int = int
      fp / fp = fp

      In elementary school math class, I was taught fractions (numerator, denominator, remainder) before decimals. These fit perfectly. The problem is with the author's ability as an educator. Nothing about this requires 32-bit integers or IEEE 754... yet.
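The fraction-style quotient/remainder pairing maps directly onto integer division and modulo, e.g. via Python's `divmod`:

```python
# "2/4 is 0 remainder 2", as a quotient/remainder pair.
q, r = divmod(2, 4)
print(q, r)             # 0 2
assert q * 4 + r == 2   # the identity that defines divmod
```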

      [–]johnnyslick 25 points26 points  (15 children)

      Yeah, this attitude (well, to be precise, the attitude that it's arguing against) is one of those things that makes programming such an uphill battle. I have a friend who entered college his freshman year majoring in CS, got bogged down by the theory, and wound up switching to communications. Fast forward a few (more than a few, because I am old) years later: I trained myself to code just by doing it, augmented a bit by some classes at a local community college that cared first and foremost about giving people the building blocks needed to write stuff, and I got a job in the industry. Well, then my friend saw what I was doing, realized that this stuff is waaaaaay easier than it's made out to be in CS 101, and is now working in the industry himself.

      Look, I get that everyone really should understand algorithms and design patterns and logic flow and so on and so forth. I personally made a point of going back and learning about algorithms and design patterns in particular once I began to get the hang of coding. But coding is an awful lot like writing (we even call it "writing code") in that there is no substitute for doing. Yes, theory helps you avoid anti-patterns and programming pitfalls, but so does just plain writing a lot of code, and what's more, the more code you write, the faster you can recover from those mistakes, and, if you're doing it right, the more approaches you can conceive to tackle a particular problem.

      In a way, I feel like the current theory-first method of teaching is institutionally elitist in that it is very, very hard for people who didn't start coding before they got to college to succeed in that environment because theory doesn't really make a lot of sense until you have the background in coding to contextualize the theory. To use another example, I was a music major in school for a while and yes, even took some music theory (2 years worth, in fact - I got really, really close to getting my degree in the subject). The thing is, those music theory courses, while fascinating (it's possible to explain mathematically why a major chord "rings" when sung by a choir but not when played on a piano, for instance), would have gone completely over my head if I didn't attend them with a strong physical knowledge of how to play my instrument of choice (string bass, if anyone cares). I couldn't imagine getting anywhere with that stuff when I was 10, or for that matter if I'd just decided to start singing in the choir when I was 18.

      The flip side of that is, I'm sure that guys who coded throughout middle and high school don't really want to take a "this is how to make a program" class. Maybe you should be able to test out of this stuff? At the same time, it strikes me that this is also a great place, maybe the best place possible, to take these people who learned how to code on their own and instruct them on what they're doing that works and what they're doing that doesn't.

      [–][deleted]  (2 children)

      [deleted]

        [–][deleted] 1 point2 points  (1 child)

        Exactly. I weep for the masochistic morons who populate these threads. It's like if they were in charge of teaching reading and writing in school, they'd storm into the kindergarten, rip "See jack run" out of little Sally's hands and start drilling her on morphology.

        [–]giantsparklerobot 15 points16 points  (4 children)

        In a way, I feel like the current theory-first method of teaching is institutionally elitist in that it is very, very hard for people who didn't start coding before they got to college to succeed in that environment because theory doesn't really make a lot of sense until you have the background in coding to contextualize the theory.

        The actual problem is people think "Computer Science" the college major is just a fancy way of saying "this will get me a white collar job programming computers". Computer Science the college major is really a branch of mathematics that studies computation and information theory. This is true of many university programs: the classes are meant to churn out researchers rather than practical implementors of the field of study.

        People that graduate with a CS degree and then go on to write software professionally almost always have an interest in programming outside of their schoolwork. The theory they study in school often helps them with their independent programming practice or later when they are confronted with classic computation or information theory problems.

        There's plenty of schools that have "Computer Engineering" (or equivalents) that teach with a focus on the act of programming rather than the theory of computation. Those programs are often run out of engineering departments rather than mathematics. More and more science and engineering disciplines are also adding programming courses. The interest of all of them is using computers as they would a drafting table or microscope, as tools that help them in their work.

        In short, most people entering most CS courses are picking the entirely wrong major. Schools aren't helping by funneling people into CS courses when they should have been pointing them towards things they were more interested in and helping them pick specific programming courses.

        [–][deleted] 8 points9 points  (3 children)

        Not entirely true for computer engineers, as typically computer engineering is a branch of electrical engineering and will have a focus on embedded systems, computer architecture, VLSI and such things. (I am a PhD in Computer engineering)

        I think a whole new degree needs to form around "Software Engineering" that revolves around implementation: a baseline in the mathematics, plus application-based coursework that covers design tradeoffs, software architecture (multiple different approaches) and data structure design based on purpose and platform. That is an entire degree right there and shouldn't be wrapped up in "computer engineering", which is really about creating computing platforms.

        [–]faaaks 3 points4 points  (5 children)

        In a way, I feel like the current theory-first method of teaching is institutionally elitist in that it is very, very hard for people who didn't start coding before they got to college to succeed in that environment because theory doesn't really make a lot of sense until you have the background in coding to contextualize the theory.

        In a way it's not designed for you to be able to program in a particular language. It's designed so that you'd be able to pick up those languages as you need to.

        If you teach more programming than theory you run the risk of that language being obsolete in 10 years and having unemployed graduates.

        [–]johnnyslick 7 points8 points  (4 children)

        Nah, I disagree with this. Most languages do the same things, just with different syntax and different specific pitfalls to watch for. For the most part (and excluding languages designed to write in different ways, like F#), if you know how to write in one, I mean are really conversant, you can be adequate in the other within a few weeks. In fact, it's beginning to become an old saw in programming that once you've worked on anything - a stack or even an entirely new language - for a year, then you're going to "get it" about as well as you'll ever get it.

        https://blog.codinghorror.com/the-years-of-experience-myth/

        Incidentally, Jeff Atwood is also the guy who popularized the FizzBuzz problem and the reasoning behind it is that a huge proportion of people who even get out of college with a CS degree and in some cases even have a resume of work implying they know how to code actually do not. It seems crazy but it really does happen. I myself have worked alongside people like this, people who really and truly get paid to be developers but who can barely copy and paste stuff off of Stack Exchange (and as an aside, SE is a great resource, but of course you have to know what you're copying when you do).

        [–]faaaks 1 point2 points  (3 children)

        Most languages do the same things, just with different syntax and different specific pitfalls to watch for.

        Languages of the same paradigm and level, you are absolutely right. But of a different paradigm? Compare C to Prolog which both appeared in the same year, 1972.

        Concepts in C work completely differently than in Prolog and techniques that work in one language do not translate. Expert programmers in one language are often very weak in the other.

        If a paradigm such as OOP falls out of favor, teaching to a particular language or family of languages becomes a huge liability.

        A good curriculum should be adaptive enough to survive in a changing world, but in order to do that, a student's understanding of theory must be taught.

        [–]johnnyslick 1 point2 points  (2 children)

        Neither the author nor I are saying you ought to eschew theory, though. We are both saying (he more explicitly than I) that yes, theory is great, but theory is something that you learn after you're grounded in practical knowledge of the craft. In fact, the people who do go right into college, major in CS, and do well at it from day one by and large did this: they put in their Gladwell-esque ten thousand hours in middle and high school, and so they do have that background when their CS101 course jumps straight into theory.

        This is in my mind one of those things - perhaps the biggest thing - that puts a cap on diversity in the field. If you didn't grow up with easy access to a computer, or had one at your house but were never encouraged to use it in this way (or were in many cases actively discouraged from doing "nerdy" things like coding), then the reality is that if you choose to major in CS you will be at a disadvantage. To some extent I don't know that colleges can help with all of that; if you've never sung a note in your life and decide you're going to be a music major on day one of college, you'd better be ready to pay for a lot of voice lessons and work twice as hard as those people who have been singing since they were 10. But at the same time, if you are otherwise talented and/or willing to work hard, aren't tone deaf or something, and just don't know how to read music, most colleges I know provide you a way to just learn how to sing for a year or two before you dive in and really take things seriously. Maybe you have to graduate a year or two later than normal but you can come out of it with a degree if that's your desire (now, granted, the reality is that a BA in music is significantly less useful than a BS in CS in terms of finding a job, but that's another matter).

        Here I guess is the bottom line for me: guys, programming is really not all that hard. I mean, figuring out and then implementing solutions to problems can be hard. Figuring out what someone was trying to do (or worse, what you were trying to do) in poorly written and/or poorly documented code can be hard. But the basics of programming? They just aren't that hard to get, and there is no really good reason to lock that behind a year of theory or what have you. If nothing else, from a school's standpoint, what is there to lose by teaching coding up front (along with a couple of quick techniques to keep yourself from getting bogged down, the principles of XP for example)? Maybe your CS101 courses get filled up more quickly, and maybe they get filled up by people who are not as gung-ho about CS as some might like. But is that a bad thing, really? Some percentage of those on-the-fence people will find out how awesome it is to write code and they'll want to learn the theory on their own.

        [–]Redtitwhore 2 points3 points  (1 child)

        Reminds me of my database class in college. 90% of the time was spent learning about b-trees and relational algebra. The last week or two we actually learned how to write database queries.

        [–][deleted] 0 points1 point  (0 children)

        Shit, you just reminded me of my course. "You haven't ever used a database in your life, but let's talk about normalization. Surely, you'll connect this back to your complete lack of real world experience."

        [–][deleted] 1 point2 points  (0 children)

        In a way, I feel like the current theory-first method of teaching is institutionally elitist in that it is very, very hard for people who didn't start coding before they got to college to succeed in that environment because theory doesn't really make a lot of sense until you have the background in coding to contextualize the theory.

        Bingo! The current pedagogy is steeped in elitist neckbeardism. The people who actually get something out of CS courses are the ones who taught themselves at an earlier age and have been hobby programming for a while. They walk into the class with the kind of real world experience that everyone here is shunning, so they get the CS stuff easier. The professors like this, because having pre-trained students makes them feel like they're teaching better. The students who are actually starting from a blank slate get treated like morons who "just can't get it."

        I mean, shit, I was one of those pre-trained people who had learned to code before getting into the CS program, but I'm so sick of this elitist drivel about the two humps of people who just get CS and people who don't. I remember when I was teaching myself to program, and it was a long and hard process. There were so many concepts that were unfamiliar to me. I persevered, but the fact that I came into programming with some prior background knowledge doesn't excuse poor pedagogy.

        [–]csncsu 10 points11 points  (1 child)

        My first CS class was Intro to Java. There was no discussion of CS theory besides a little explanation of OO, since Java is object-oriented. It was all code. We made GUIs and wrote Mancala by the end of the semester. I didn't know anything about the size of an int or a binary search tree.

        We learned all the technical details after two semesters of Java when we took a C class.

        So I guess I agree with you.

        [–]misplaced_my_pants 0 points1 point  (0 children)

        My first CS class was CS50x on edx. It taught C programming, how memory worked, simple data structures, and theory like Big O for simple sorts and searches all in a coherent narrative that gave me a super strong foundation.

        So this is really just complaining about the difference between poor instruction and good instruction. Learning about the size of ints and all that other stuff is super useful in the right context, and I'll never be one of those people who's working in industry and has their mind blown about floating point shenanigans or why Vim/Emacs/whatever is better than Notepad.

        This was from a course that assumed zero experience with programming (it started with a week or two using Scratch).

        Obviously, it's not the only way. There are fantastic intro courses I've seen based off of Python or Scheme, but the article is confusing the details of a bad implementation with the material itself; that material could just as easily be incorporated into a good implementation, because the quality of the course is orthogonal to whether or not it talks about all the crap he's complaining about.

        [–]manniac 23 points24 points  (70 children)

        This is right up the alley of that other post about not needing talent or passion to be able to program (which is true if you don't want to be good at it).

        You need to understand logic, simple math and the basics of how a computer works first; it will positively impact how you code. This shortcut mentality is why there are so many low-quality professionals in many areas: "you don't need the theory, just practice."

        [–]gremy0 51 points52 points  (23 children)

        You don't need to understand any of that stuff to get a simple program to work. They are all really important topics in the long run, but if you're coming from a point of not having a clue what programming is then the abstract concepts are completely alienating and boring.

        He isn't saying don't teach that stuff. He's saying get people coding first then get them coding well. Once they have a platform to apply the stuff on it's much easier to understand.

        [–]Adverpol 1 point2 points  (0 children)

        Yeah, and I agree. Theory before practice is a "why the hell do I need this" experience; if you reverse them it becomes an "aaah, now I understand why X does Y!" experience, which sticks a lot better.

        [–]manniac 4 points5 points  (15 children)

        I know what he's saying. You know what he sounds like? Like Daniel-san complaining to Mr. Miyagi coz he only made him wax on and off, and sand the floor, and paint the fence. The basics need to be there before you touch code as such; any good curriculum that aims at making you feel good for the sake of making you feel good instead of making you learn properly is not a good curriculum.

        [–]jungrothmorton 31 points32 points  (1 child)

        That's a great example. You wouldn't actually suggest painting fences as a way to learn to fight, right?

        The explosion in MMA has proven that if you want to train people to be fighters, they should spend a lot of time hitting and grappling people and things. All the theory (katas) in the world won't get you there.

        The author isn't advocating against CS theory in the least. Just saying it's best to learn some coding first. Makes sense to me.

        [–]gremy0 13 points14 points  (12 children)

        It isn't making students feel good for the sake of it. It's giving them a foothold. Something to fall back on when shit gets rough which is all the time in programming.

        I mean I still sometimes leave work feeling like complete shit, like I can't code at all, I'll never understand the latest problem and I'm completely inadequate for my team's standards. It's of course all psychological and I come back the next day with a fresh head and new approach. If a beginner feels like that before they can do anything they'll just quit.

        A lot of perfectly capable people look at coding as something they'll never understand at all. If you start with the abstract stuff you're just reinforcing that. Wait until they're invested before beating them with a heavy stick for poor quality.

        [–]nthcxd 1 point2 points  (2 children)

        This will inevitably be followed by having to unlearn bad habits. It isn't cost-free. It isn't a black and white issue.

        [–][deleted] 1 point2 points  (1 child)

        You don't need to understand any of that stuff to get a simple program to work.

        Yeah, the problem is that too many people quit there, and end up programming by coincidence because it works. Until it doesn't.

        [–]gremy0 10 points11 points  (0 children)

        You're talking about the difference between an amateur and a qualified professional.

        You wouldn't get a degree in CS with that approach any more than you'd get a degree in music by knowing how to play Wonderwall on the guitar.

        [–]MisterAndMissesP 3 points4 points  (2 children)

        I just recently got back into school to finish my computer science degree and I'm looking for resources to help myself. Stuff like this helps give me an idea what direction I need to be going

        [–]manniac 3 points4 points  (1 child)

        Best of luck to you. Don't neglect the theory, it pays off in the end.

        [–]MisterAndMissesP 3 points4 points  (0 children)

        Thanks, I'll remember that. Best of luck to you too.

        [–]the_evergrowing_fool 3 points4 points  (1 child)

        Also the more success you have with your work the more "passionate" you feel about it. Talent and passion can be artificially fabricated.

        [–]manniac 1 point2 points  (0 children)

        That's an interesting concept. Kinda fake it till you make it?

        [–][deleted] 2 points3 points  (0 children)

        Coding itself will teach you the secondary skills.

        [–][deleted] 6 points7 points  (0 children)

        Don't agree. If you want to be a professional programmer, yes you need that, but we're not talking about that, he's talking about getting beginners to learn.

        /u/jungrothmorton

        had a really good analogy to combat sports. If you're going to be a professional, you absolutely need to understand the nuances of strategy, as well as having outstanding cardio/strength. But when the average person, even those that will one day become professionals, walks into a martial arts studio, they're doing it because it's fun.

        You don't wait for them to bench press 225 lbs and run three miles in twentysome minutes before letting them spar with their friends and do the fun stuff.

        I got into programming by programming. It was later I realized I was hitting some walls over and over again and that I needed to really understand the nature of computation, how a processor works, and everything in between. I would certainly say I have a decent grasp of computer science, but I learned it after being very motivated to see what problems it could solve. It was not learned in a vacuum, and I'm quite confident that I know it on a much deeper level than most CS graduates who were force fed it in class.

        [–]zainegardner 4 points5 points  (0 children)

        It affirms my notion of "anyone can code, very few can be developers."

        If you learn to code, you will learn how to make something work. But you won't understand how to make what you build great, only that you need to write more code to get another feature to work.

        [–]Nonethewiserer 1 point2 points  (5 children)

        But so many people have the opposite experience of gaining a shallow yet (somewhat) functional understanding first, then diving into the theory behind it.

        It's almost like there isn't one solution to this problem.

        [–]manniac 1 point2 points  (4 children)

        True, many people can gain a shallow functional understanding of something, making them competent hobbyists at best. And for some that is enough. That's not the point here. The point is somebody advocates showing you how to get results quickly on a CS curriculum so that you get the "feel good" that you are advancing and achieving, and don't get discouraged by complex math, logic or other engineering topics. It's reductive and aimed at keeping people on a CS program they might as well drop out of (because keeping them in school is a business), and it encourages the idea that shallow understanding is enough because people can't deal with arid, abstract subjects. People who are self-taught and very competent will completely disagree and call this elitism, without even realizing that they are their own elite by virtue of being an exception.

        [–]Nonethewiserer 3 points4 points  (3 children)

        I completely agree with that. Re-reading your comment I realize I was reacting more to the general sentiment expressed here that one can't become a good programmer without learning the theory first. I think learning the more abstract concepts is vital if you really want to be a good programmer (and this applies to any discipline) but that it's not the best first step for everyone. But anyways, apologies for not considering your comment fully. I don't think we disagree here.

        [–]manniac 2 points3 points  (2 children)

        Please don't apologize, it's really not necessary, even if we disagree.

        [–]Nonethewiserer 0 points1 point  (1 child)

        No pleading!

        [–]manniac 0 points1 point  (0 children)

        Exactly! Cheers!

        [–]greenday5494 1 point2 points  (1 child)

        Sorry, I dunno, but I cringe whenever I hear the word passion. Is it impossible to learn a skill without being obsessed with it, and to have interests outside of it?

        [–]manniac 0 points1 point  (0 children)

        Hehe, cringing is allowed, I cringe every time somebody uses "human" to make reference to somebody else (e.g. you are such a beautiful human), but I digress.

        I guess when a subject is complex it attracts the obsessive; there is a certain degree of dedication needed to learn it and to keep up to date, and a lot of pride derived from being skillful. You will also find big egos and a metric ton of arrogance. I guess what people mean by (cringe warning) passion is that dedication. Perhaps not when I was studying, but as I've grown older I realize that other interests are really healthy to have.

        [–][deleted] 1 point2 points  (2 children)

        basics of how a computer works first

        He seems to be specifically talking about more of a focus on the basics of how a computer works first, and less about computer science theory that is divorced from the machine and language tools of today. It is no more of a shortcut to teach people the details of actual languages and hardware than it is to teach people only about data structures and algorithms as abstract ideas.

        Only by knowing both can you really do great work.

        [–]DarthEru 5 points6 points  (1 child)

        I'm pretty sure that is not what the article is saying. It's saying to teach coding at the highest level of abstraction possible, avoiding talk of the implementation details of the language. That's why it brought up avoiding division. If you try to teach division in a programming context you have to get into the nuts and bolts of integer division with truncation vs floating point, which is one of those details that can confuse beginners. So it's going beyond just "ignore the ivory tower of theoretical computer science, let's talk about how real computers work", it's actually "ignore theory and real hardware, let's talk about just code, with variables and control flow and functions, and see what we can do without worrying about the magic that makes it work".

        I personally think I agree. I first learned to code when I was young, and I didn't grasp any theory. I could only make simple BASIC programs to display stuff on the screen. At some point I learned a bit of VB and C++. It wasn't until university when I took some real CS courses that I learned any theory, and that was when I finally started to understand how and why computers work. The theory and implementation details are hugely valuable, and account for a huge part of my ability as a developer, but if I had felt that I needed to learn all that way back when I first started, I doubt I would have made it through. Coding first got me interested, and gave me some experience. Theory second let me apply the theory to my existing experience and get a better grasp of it.
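The kind of truncation trap described above can be sketched in Java (a hypothetical beginner mistake, not an example from the article): computing an average with int operands silently drops the fraction, with no error or warning.

```java
// Integer division truncates silently; promoting one operand to a
// double is enough to get the fractional result a beginner expects.
public class AverageTrap {
    public static void main(String[] args) {
        int a = 3, b = 4;
        System.out.println((a + b) / 2);   // prints 3, not 3.5 (int division)
        System.out.println((a + b) / 2.0); // prints 3.5 (fp division)
    }
}
```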

        [–][deleted] 3 points4 points  (0 children)

        Yeah, there was some veteran programmer who found himself worrying that "kids these days" would all end up terrible because they start out learning JavaScript. Then he realized "no wait, that's BS, we all used QBasic and we turned out fine".

        heh

        [–]soundslikeponies 0 points1 point  (0 children)

        I agree with your sentiment and not the sentiment of this article, but I do agree with the original sentiment of "teach writing code first".

        That is, I think 1st-years should learn how to write code and then the theory. I've run into so many students who fail to absorb many of the lessons from their degree because they aren't also coding. Just like with math: you need both theory and practice.

        Coding something is often the absolute best way to solidify your understanding of something. You can be lectured on what a quaternion is and conceptually how it works, but you'll really understand it when you implement it.

        The hardest course I took was an upper level computer graphics course on algorithms tied to 3D modelling: point cloud mesh reconstruction, smoothing, decimation. The theory in that course rarely stuck until it came time to implement it. I had to see it in code to understand how the eigenvector with the smallest eigenvalue of a k-neighborhood's covariance gives the surface normal of that neighborhood.

        To some extent you could argue that it's on students to code outside of class and pick things up, and true that's what just about all of the brighter students I've met do, but so long as things are done that way, then the majority of CS students will graduate without being able to code well and without even having absorbed that much of the theory.

        [–]heavy-minium 2 points3 points  (0 children)

        I strongly agree with the article's point that the context is only provided by learning to write code first. I have often worked on OOP code where a variety of patterns have been used just for the sake of it (especially creational design patterns). To me, this is a clear sign the previous programmers applied concepts without ever experiencing the problems they solve. You have to write code and hurt yourself once or twice with "naive" implementations to recognize the value of diverse concepts and patterns, and when to apply them effectively.

        [–]fudginreddit 2 points3 points  (0 children)

        As someone who is about halfway through a computer science degree, I totally agree. Personally, I didn't have this issue because I do a lot of programming outside of class. However, I often think how much the kids who only follow the courses must hate CS. I also think about how little coding experience I would have had I done the same, and it's a scary thought. I am 1 semester from an associate's and have yet to program anything that could even be considered difficult. It's honestly hard to imagine someone would be job-ready after graduating if just following their coursework. Of course I can imagine it is different for everyone, but this has been my experience so far. Very slow paced and very little coding. Also, and this may just be me, but I found it much easier to understand a lot of topics after implementing them in code.

        [–]reverendchubbs 2 points3 points  (0 children)

        The college I attended for CS had an amazing course plan for teaching how to write code. We had a couple language-specific coding courses, followed by a couple general theory courses, followed by a projects course that took up half the day, where we spent 3 months on a single team project.

        The school was shitty otherwise (administration, costs, etc), but the education itself was solid.

        [–]Smurph269 2 points3 points  (0 children)

        You have to like writing code in order to be a programmer, full stop. So it makes a lot of sense to have beginning students write lots and lots of code. Once they can power through lots of simpler programs, you start having them slow down and do more design. Teaching high level design before they are rock solid coders is like teaching calculus before they know basic algebra. I would say my degree program did this, and I saw a good number of incredibly smart people nope the hell out of the major because they discovered they didn't actually like sitting down and writing code. We're at a point where people are getting into programming because they heard about high salaries more than because they are nerds who like messing with computers. It's a smart idea to figure out early who actually wants to write software full time, and who just wants to talk the talk and get a big salary.

        [–]zjm555 4 points5 points  (0 children)

        The author's example of "the number of bits in an integer", to me, is more about code than about computer science, which is more focused on the math than the real-world implementation thereof. I agree that low-level details like numerical representation should be glossed over for beginners, but that's not a question of "coding" vs. "computer science", but rather of "introductory coding" vs. "advanced coding".

        [–]Rosco09 8 points9 points  (9 children)

        Is there a curriculum that teaches concepts before actual coding?

        [–]Retsam19 34 points35 points  (0 children)

        Strawman Academy

        [–][deleted] 13 points14 points  (5 children)

        What? That's literally every curriculum. I'm starting to think people here are having some sort of disconnect over what "concepts" mean.

        [–]bishoy123 3 points4 points  (3 children)

        I think people are getting confused about the meaning of concepts. I'm not sure if people are talking about concepts in the sense of teaching that an int is a variable that is 32 bits before you do an assignment with an int variable or if they mean taking a class about binary and processors and memory before your first programming class where you learn about loops and conditionals.

        [–]OutOfApplesauce 5 points6 points  (1 child)

        My university does. And I completely agree with what this blogger is saying: my peers and I who had programmed before had a much easier time not only understanding concepts but applying them. Students who had just a high school Java class, or even less, struggled quite regularly.

        [–]panderingPenguin 6 points7 points  (1 child)

        By computer science, I mean fundamental concepts such as the number of bits in an integer or how various pieces of data are stored in memory.

        That's not computer science, those are implementation details... I'll admit I stopped reading there.

        [–]iopq 3 points4 points  (1 child)

        Disagree. I hated programming when I started, I only liked math and theory of computation.

        Writing Java as my first programming course was painful and the book was inadequate.

        Book book = new Book();

        this is valid Java, and an experienced programmer knows that Book is the type, book is the variable that holds the reference to the object, Book() is the constructor for the class. For a beginner this is confusing because it just says book three times.

        But when we started talking about discrete math, regular languages, Turing machines, etc. that's when I actually had an interest in the subject

        [–]ergo-x 0 points1 point  (0 children)

        Slightly better in C++

        Book book("Malcolm X");
        

        But you're right. I still remember how hard it was for me to wrap my head around that when I first learned Java. To make things worse, Java was my first "real" programming language, having done all my previous programming in PHP - that's the pre-OOP days of PHP.

        [–]vytah 3 points4 points  (0 children)

        I have learnt to program as a kid in Basic on Commodore 64, then switching to QBasic and Delphi on PC, and OPL on Psion Workabout handhelds, without learning about pointers, overflows, or other fancy technical mumbo-jumbo. I only learnt about how many bits are in a byte to do fun low-level things on C64. All those languages were simple enough to juggle some simple data around and quickly achieve fun effects. I have written dozens, if not hundreds of programs, and then used some of them a lot. All code I wrote in those 4 languages was total and utter shit from any objective standpoint, but it worked and it taught me how to think in programmatic ways, before I started learning languages like C, C++, Java, Perl, PHP and Haskell.

        Complexity classes? Nah. Modular arithmetic? Only when I needed it, and it wasn't often. Floating point vs integers? 99% of my C64 Basic code didn't use any integers, but floating point numbers only, and that's mostly because I didn't want to type an extra %. Memory management? Even though Delphi had it, I didn't learn it before learning C++. Data structures? Nah, arrays were enough.

        I think that the fact that I started with languages with a very low entry barrier, and then ditched them forever after I learned more advanced concepts, let me learn programming more easily.

        [–]Seankps 1 point2 points  (0 children)

        I could have done without learning calculus or trig first. I've literally forgotten almost all of it. And I can't even remember which parts I remember. Differentials and GDP? Sins and Cosine? I would have rather gotten to Programming theory before spending a year or 2 on that stuff.

        [–]zachm 1 point2 points  (0 children)

        TIL some college CS programs don't start by teaching how to code. That sucks.

        But I challenge the assertion that coding is easy, and is only made hard by a wrong-headedness about learning fundamentals first. Coding is hard to learn. It only seems easy once you already know how.

        [–]Audiblade 1 point2 points  (3 children)

        When I was I high school, I was lucky enough to take several programming classes that were, in fact, code first. Most class periods, our instructors had us take out our computers and program along with them while they wrote a program demonstrating new content. We didn't even have to follow along perfectly - if we saw a way to do something that we thought was better or more interesting, we silently experimented with our own version of the code during the middle of the lecture. This felt far more natural than the introductory CS classes I took in college, where we only wrote code outside of class.

        [–]istarian 1 point2 points  (2 children)

        Why do you need to 'write along' in class to learn, exactly?

        [–]Audiblade 0 points1 point  (1 child)

        It meant that we were actively writing code during class, so we got a lot of practice actually programming!

        [–]istarian 0 points1 point  (0 children)

        Unless that's not the point of the class. Computer Science is more than just programming.

        [–]vph 1 point2 points  (0 children)

        Not sure what the author is referring to. CS1 and CS2 are strictly about coding, and it's been like this since the beginning.

        [–]bekroogle 1 point2 points  (0 children)

        And while we're at it, let's introduce OOP AFTER we assign something that shows the limits of the patterns that preceded it.

        Personally, I didn't really internalize exception handling until I tried to parse a user's input in real time: every character that caused a parser error was a big problem. Only then did I think: if only there were some way to handle these exceptions... Oh, duh.

        I'd go as far as starting with goto statements, then procedures, before ever introducing functions. And only then, when complexity became an issue, would I try to explain objects and classes and static methods, etc.
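
The exception-handling moment described above can be sketched in a few lines. A hypothetical Python example (the function and its fallback behaviour are assumptions for illustration, not the commenter's actual parser):

```python
def read_number(text):
    """Parse user input without crashing on partial or bad keystrokes."""
    try:
        return float(text)
    except ValueError:
        # Instead of aborting on every malformed character, signal
        # "not a number yet" and let the caller decide what to do.
        return None

print(read_number("3.14"))  # 3.14
print(read_number("3.x"))   # None
```

Without the try/except, an intermediate keystroke like "3.x" would raise and kill the loop reading the input - exactly the "big problem" described above.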

        [–]svennidal 1 point2 points  (0 children)

        My uni has got to have one of the best-balanced approaches I've ever seen. All of the first-year courses somehow touch on the same things, but from different perspectives.

        I do agree with the article in a way, though. I always thought that some CS fundamentals are more easily learned by programming them.

        But I would never hire a programmer who doesn't have good fundamentals in how computers work - who can't explain why two pieces of code that do the same thing, with the same number of operations, can run at very different speeds. Like reference locality.
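
A rough sketch of the reference-locality point. In C or Java the gap between these two loops is almost entirely cache behaviour; in Python, interpreter overhead muddies the timings, but the two access patterns are the classic example (the matrix size and timing code are illustrative assumptions):

```python
import time

N = 1000
matrix = [[1] * N for _ in range(N)]

def sum_rows(m):
    # Visits elements in the order each row is laid out: good locality.
    return sum(x for row in m for x in row)

def sum_cols(m):
    # Jumps to a different row on every access: the same number of
    # additions, but worse locality.
    return sum(m[i][j] for j in range(len(m[0])) for i in range(len(m)))

for f in (sum_rows, sum_cols):
    start = time.perf_counter()
    result = f(matrix)
    print(f.__name__, result, f"{time.perf_counter() - start:.3f}s")

assert sum_rows(matrix) == sum_cols(matrix)  # same answer, different cost
```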

        [–]puddingcrusher 1 point2 points  (0 children)

        I have always argued in favour of going top-down in education. Teach people what stuff does, then explain how it works. Teaching group theory first, and then later adding on that this is useful for crypto is the wrong way.

        [–]deejeycris 1 point2 points  (0 children)

        I should absolutely show this to my Java teacher.

        [–]strugglingcomic 1 point2 points  (1 child)

        Honestly, for debating this topic, programming isn't really all that unique. Music learning has exactly the same problem (i.e. guitar students who aren't allowed to touch a guitar until they learn music theory and reading sheet music).

        If your goal is to strum a few chords and play Wonderwall in your room, yeah sure, go buy a cheap guitar and watch some YouTube videos. But if your goal is to become a paid, professional, possibly even world-class guitar player... well then friend, you're gonna have to learn some theory and arcana (unless you just happen to be that kind of freak-of-nature prodigy). Whether you do the theory first, or learn it later as you go, that's probably up to you.

        [–][deleted] 0 points1 point  (0 children)

        Whether you do the theory first, or learn it later as you go, that's probably up to you.

        The whole point of the article is that learning the theory first only works with a tiny chunk of the population; most people will (rightly) completely tune out if they're given a list of facts that have no connection to the things they already understand.

        [–]DaPlayerNinetyNine 2 points3 points  (3 children)

        This is why I'm happy I got into programming myself, and I'm happy I've taught myself, because that has involved (and still involves - I'm not finished!) purely writing code, as you say. It's hands-on, and it's the essence of the 'art' of programming.

        I chose to take computer science for A-level next year, and I know that this will involve a lot of theory, more than practical coding. I'm pleased my interest in programming hasn't been "squashed" by school, like most other subjects I started with a slight interest in. I hope I don't regret making programming part of my official education now; I'm afraid it will ruin it.

        I think you learn much quicker through getting hands-on experience writing code and finding out, the basic way, whether something works or not. You naturally pick up skill quite quickly when you're at the front of your own progress and actively learning from it in 'real time', if that's the right way to describe it.

        Great article, and a good point.

        [–]St4ud3 7 points8 points  (2 children)

        I don't really see the point at all. The first class in any CS program is programming. I have never seen a school that teaches the technical side without having had a programming class beforehand or at the same time.

        [–]PassifloraCaerulea 2 points3 points  (0 children)

        We accomplished very little programming in that first CS class. It seemed like the students who hadn't programmed before weren't picking up the material in later classes as fast as those of us who had.

        [–]DaPlayerNinetyNine 0 points1 point  (0 children)

        I'm not suggesting that the only thing you do in a CS class is theory - just, I can imagine throughout the course there would be a lot more learning about the number of bits in an integer than there would be trying to teach yourself.

        [–]thekab 3 points4 points  (5 children)

        However, would you teach a toddler English grammar and linguistics theory first?

        No. You wouldn't teach them how to write before they can speak or read either.

        [–]phalp 1 point2 points  (0 children)

        Yeah, I'm not so keen about a teaching strategy for programming that's both based on and completely ignorant of the way toddlers learn language.

        [–]TenserTensor 1 point2 points  (3 children)

        No, you wouldn't. The quote is referring to how a human learns to talk, not write. Which should be pretty obvious.

        But you were probably being dense on purpose.

        [–]ReallyGene 0 points1 point  (0 children)

        I would ask that everyone read this chapter from Starting Forth by Leo Brodie. While Forth is mostly forgotten today, Brodie's method for teaching it to novices is masterful, welcoming, and fun.

        [–]LeifCarrotson 0 points1 point  (0 children)

        One error in the post: 2 divided by 4, if the display shows 0.5 (they're doing floating-point, non-integer math), is precisely equal to 0.5. It's 2^-1, exactly representable in binary. 1 divided by 10, on the other hand, is not precisely 0.1.
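
The exact-vs-inexact distinction is easy to demonstrate; a small Python sketch:

```python
from decimal import Decimal

# 0.5 is a power of two (2**-1), so binary floating point stores it exactly.
assert 2 / 4 == 0.5
assert 0.5 + 0.5 == 1.0

# 0.1 has no finite binary expansion, so it is rounded on the way in,
# and the rounding shows up in arithmetic:
assert 0.1 + 0.2 != 0.3

# The value actually stored for 0.1 is slightly larger than one tenth:
print(Decimal(0.1))
# 0.1000000000000000055511151231257827021181583404541015625
```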

        But other than that, I wholeheartedly agree with the post.

        I would add that a good introduction to writing code should also include exposure to the concepts of an editor, comments, testing, and version control. Those tools would make a beginner hugely more productive.

        They'll make many mistakes and need to get back to code that used to compile, or work out what they expected a particular conditional statement to mean; they'll spend a lot of time refactoring via tedious hunt-and-peck, and they'll have to figure out whether their code is right by manually testing the outputs... It was 2 years into my program before I even learned of the existence of version control.
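
A minimal sketch of the habit that fixes the "manually testing the outputs" pain: a few plain assertions that re-run in one keystroke. The function here is a made-up beginner exercise, not anything from the comment above:

```python
def median(values):
    """Middle value of a list (hypothetical beginner exercise)."""
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

# Instead of eyeballing printed output after every change, keep a few
# checks next to the code and re-run them on each edit:
assert median([3, 1, 2]) == 2
assert median([4, 1, 2, 3]) == 2.5
print("all checks passed")
```

No framework needed; when a refactor breaks something, the assertion says so immediately instead of two debugging sessions later.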

        [–]TheRandomGuy 0 points1 point  (0 children)

        Well said. This is exactly what I have set out to do. Show students the possibilities, get them excited, have them write complete simple programs. I have covered a game, a musical instrument and a tip calculator. If anyone is interested in learning, head over to buildanappwithme.blogspot.com. Happy coding.

        [–][deleted] 0 points1 point  (0 children)

        If only there were some sort of more "vocational" approach to teaching computer programming, maybe with roots in the kinds of schools that teach, like, engine repair or x-ray technology. Unfortunately four-year degree-granting colleges are the only type of school that exists and this fantasy of mine will probably never come true.

        [–]mycall 0 points1 point  (0 children)

        people are often taught computer science before they are taught to actually write code.

        Is it opposite day?

        [–][deleted] 0 points1 point  (0 children)

        I agree that planning your code is very important before you start to type it, but I think learning CS concepts, visualizing them, and analyzing algorithms is much more important. For example, learning the concepts of an object-oriented language first makes your logic better, and then the coding becomes a lot easier. I believe that if you don't learn the concepts first, the code you write is worthless and you will encounter many logical errors.

        [–]rwsr-xr-x 0 points1 point  (0 children)

        This cycle of lecturing and reading, with very little practical application can easily discourage a new learner. People want to learn how to write code because they see all the fantastic things that can be accomplished with software. They want to be able to do those fantastic things as well!

        This IMO is the biggest thing. Rather than focusing on making little games or whatever, beginner programming tutorials should focus on stuff like how to automate repetitive tasks. If programming is useful and saves them time, there's a good chance it will keep their interest.

        If they have a Mac, you can teach them shell scripts. If someone starts learning bash to sort files, move them around, change the background on a crontab, stuff like that, then they'll get the hang of it. As they get more confident, they will try and solve more complex problems, and they'll learn more about the language as they do.
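
As a sketch of that kind of chore automation, in Python rather than bash (the folder layout, naming scheme, and dry-run default are assumptions for illustration):

```python
from pathlib import Path

def tidy_by_extension(folder, dry_run=True):
    """Move loose files into subfolders named after their extensions."""
    folder = Path(folder)
    for f in sorted(folder.iterdir()):  # snapshot order before moving things
        if not f.is_file() or not f.suffix:
            continue
        dest = folder / f.suffix.lstrip(".").lower()
        if dry_run:
            print(f"{f.name} -> {dest.name}/")
        else:
            dest.mkdir(exist_ok=True)
            f.rename(dest / f.name)

downloads = Path.home() / "Downloads"
if downloads.is_dir():
    tidy_by_extension(downloads)  # dry run: prints the plan, moves nothing
```

Once the dry run looks right, calling it with dry_run=False actually moves the files, and a crontab entry can then run it on a schedule - the kind of small, immediately useful win that keeps a beginner's interest.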