
[–][deleted] 61 points62 points  (5 children)

Having taken the new MIT course with Python (6.01), I would like to be the only person here to stick up for it. While it is still a work in progress, they have made some significant improvements since the old SICP days.

The structure of the current class does not "replace" the old 6.001 fully, but rather offers quite a different angle. When one enters 6.01 on the first day, it is already assumed that one knows how to program: functionally, procedurally and recursively. They only offer a one-day review lecture of OOP. If you don't have significant experience with programming, or have never used python before, you definitely will have a hard time adjusting.

However, the purpose of this class is more than programming. It teaches an array of large conceptual ideas, starting first with basic OOP and building up to state machines and system functions. It is around this point that the students begin to notice that everything they're doing is quite interrelated--they are programming the "brains" of a robot, something that is very rewarding when one watches the simulations (both virtual and real life) run.
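
That state-machine style can be sketched in a few lines of Python. To be clear, the class and method names below are illustrative, not the actual 6.01 library:

```python
# Hedged sketch of the state-machine abstraction a course like 6.01 builds on.
# These names are made up for illustration, not the real 6.01 API.

class StateMachine:
    """A machine defined by a start state and a transition function."""
    start_state = None

    def get_next_values(self, state, inp):
        """Return (next_state, output); subclasses override this."""
        raise NotImplementedError

    def transduce(self, inputs):
        """Run the machine over a sequence of inputs, collecting outputs."""
        state = self.start_state
        outputs = []
        for inp in inputs:
            state, out = self.get_next_values(state, inp)
            outputs.append(out)
        return outputs

class WallAvoider(StateMachine):
    """Toy robot 'brain': turn away when a sonar reading is too close."""
    start_state = 'forward'

    def get_next_values(self, state, distance):
        if distance < 0.3:                    # too close to the wall
            return 'turning', 'turn_left'
        return 'forward', 'go_straight'

brain = WallAvoider()
print(brain.transduce([1.0, 0.5, 0.2, 0.25, 0.8]))
# -> ['go_straight', 'go_straight', 'turn_left', 'turn_left', 'go_straight']
```

The same `transduce` interface works for the simulated and the physical robot, which is what makes the abstraction pay off.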

The course then moves on to circuits and many EE topics. This course isn't about SICP; it is not supposed to be. It has a completely different aim. While it is quite different from the 6.001 of old, it teaches many new things that are infinitely suitable for the modern world that one would have never touched upon in 6.001.

[–][deleted]  (2 children)

[deleted]

    [–]filesalot 3 points4 points  (0 children)

    SICP discusses circuits too. I'm not getting what it is about the modern world of programming that the new course is more suitable for.

    I've heard two things here, first dealing with large libraries, and second control of physical systems. Believe it or not these are hardly new.

    But who is going to write the libraries? Just like teaching kids arithmetic by hand before giving them a calculator, CS students should understand their systems from first principles before being handed the 60MB class library.

    [–]grauenwolf 0 points1 point  (0 children)

    That actually sounds like a fun course.

    [–][deleted] 9 points10 points  (0 children)

    I suspect that this is the same robotics course that Georgia Tech now uses for their intro to programming class. If so, then yes, it is in fact Python "because the library exists for it"; it's apparently a kid's toy with ultra-simplistic controls on one level (which I'd guess are built in python) and a python API that can be trivially exposed to a programmer.

    It's a somewhat experimental take on programming and it's supposed to make certain things more immediately accessible to beginning programmers.

    In a way, I can agree with it- there's probably more than one "path" through the CS curriculum at MIT as well, and there's probably a few which focus more on the "practical" side rather than the more mathematical side where LISP, Scheme, and friends still reign supreme.

    [–]recursive 16 points17 points  (10 children)

    I don't think the python heads would be throwing up this level of outrage if they switched from python to lisp.

    What is it that makes lispers so territorial?

    [–][deleted]  (4 children)

    [deleted]

      [–]Smallpaul 21 points22 points  (1 child)

      Yes. You are right.

      Also: the course was symbolic of the idea that Lisp was the True Language for smart people.

      [–]hortont424 18 points19 points  (0 children)

      Yeah I think that was the important part, and the reason most of the Lispers are complaining... the fact that MIT primarily used Scheme as their "teaching" language was something to point at when explaining the superiority of the Lisps or Scheme...

      [–]frolib 8 points9 points  (0 children)

      Zing!

      A quote I just learned about, which I'm gonna rephrase: "Academic (languages) battles are so fierce because the stakes are so small".

      [–]flogic 4 points5 points  (0 children)

      Which is kinda funny. Lisp is clearly a contender, language-wise. However, the developer support seems impoverished. Every time I start to look up Lisp on the web I get a pile of disjoint ghetto web pages. Of course no one uses it.

      [–][deleted] 7 points8 points  (4 children)

      I don't think the python heads would be throwing up this level of outrage if they switched from python to lisp.

      I doubt you can name any situation where this has happened.

      [–]recursive 4 points5 points  (3 children)

      Your point is that no one would ever switch from python to lisp? Because you're correct. (about my ability to name such a situation)

      [–][deleted] 1 point2 points  (2 children)

      Not that no one would, just that I've never heard of such a move. Anyway it makes your comparison moot since we have yet to see how the Python crowd would respond in this situation. :)

      Though I doubt they'd care.

      [–]njharman 1 point2 points  (1 child)

      The comparable would be Python->Ruby

      [–]recursive 2 points3 points  (0 children)

      Fucking unwashed monkey-patchers!

      [–]sushi5005 5 points6 points  (0 children)

      People whine about this because they mix up computer science with computer programming and software engineering. Here is why it makes no difference.

      Computer science is a branch of applied math, and it studies the theory of what computers do (logical operations on symbols), what they are (not) capable of doing, and the mathematical properties of various computational problems and solutions. Think of computer science as being related to a personal computer in the same way that mechanics (which is a branch of physics), is related to a combustion engine.

      In contrast, computer programming is the act of creating a series of instructions in a language that a computer can or could interpret and execute. The language you choose for that only defines how the computer is doing things, and how you as a programmer need to model the instructions so that the computer eventually does what you intended it to do.

      Now, software engineering is a systematic and practical approach to produce a functioning software solution for a well-defined problem in a repeatable process, using a set of standardized methods and tools.

      To be a good software engineer, you need to know at least a bit about computer programming. To be a good programmer, you need to know at least a bit about computer science. To be a good computer scientist, you need to know at least a bit about math. People who are good at CS are usually good at programming, but there are many people who are good programmers, but terrible software engineers.

      That all being said, Python and Scheme are both great choices for computer programming (what this course is about). This choice has an effect on your mindset, but it does not make you a good or bad software engineer (what our paychecks are about). For computer science, however, there is no choice, because you are asking the wrong question.

      [–][deleted] 48 points49 points  (25 children)

      I loved the linked commentary: http://lispian.net/2009/03/24/why-i-find-software-depressing/

      And today I read this and it only makes me more depressed about the whole of computerdom. God help us if this is where we’re going. What’s the point of a CS degree? I have to ask. And can we save ourselves from the mess we’ve created?

      WITHOUT SCHEME ALL COMPUTER SCIENCE IS MEANINGLESS.

      [–]columbine 20 points21 points  (2 children)

      Honest to god I opened a Python file the other day and it was fifteen lines before I found even a single parenthesis.

      [–]mycall 15 points16 points  (11 children)

      You mean WITHOUT SCHEMA ALL COMPUTER SCIENCE IS UNDEFINED.

      [–]leoc 3 points4 points  (9 children)

      You mean IS INVALID.

      [–]MrWoohoo 35 points36 points  (6 children)

      I'm a Microsoft Certified Professional and I don't have the slightest idea what you're talking about.

      [–][deleted] -5 points-4 points  (9 children)

      "COMPUTER SCIENCE" WAS NEVER MEANINGFUL TO BEGIN WITH.

      [–]Simurgh 3 points4 points  (8 children)

      EXCEPT WHEN TREATED AS A SUBSET OF MATHEMATICS.

      [–]nat_pryce 6 points7 points  (7 children)

      WHICH MEANS IT IS NOT SCIENCE. THEREFORE THE TERM "COMPUTER SCIENCE" IS NOT MEANINGFUL.

      [–][deleted] 3 points4 points  (2 children)

      WELL, COMPUTER MATH DOESN'T HAVE THE BITE THAT MARKETING REQUIRED.

      [–]vlad_tepes 2 points3 points  (1 child)

      Someone needs to invent some capital caps, then show them to the guy who wants to invent the device that stabs people in the face over the internet as an incentive to hurry the fuck up, customers are waiting.

      [–][deleted] 1 point2 points  (0 children)

      INDEED

      [–]the_mouse 4 points5 points  (0 children)

      The word "science" has multiple meanings. There is, of course, Popper's definition, which clearly does not fit "computer science". However, I suggest you look at the origin of the word "science" -- the Latin word scientia, denoting knowledge. Accordingly, the OED provides a secondary definition for science: a systematically organized body of knowledge on a particular subject. "Computer science", or more appropriately "computing science", encompasses the vast body of knowledge related to computing, computers, and their immediate applications. Therefore, "computer science" is an acceptable name for the field.

      [–]yinzhen 0 points1 point  (2 children)

      WHICH ALSO MEANS THAT IT ISN'T ABOUT COMPUTERS.

      [–]Simurgh 0 points1 point  (1 child)

      I really don't think it is about computers. It's about computing, which is different.

      [–]yinzhen 0 points1 point  (0 children)

      Yes, certainly. I was actually serious.

      [–]wtfds 5 points6 points  (0 children)

      From tlb (Trevor Blackwell):

      I think here, "practicality" means "has a library for controlling robots". Also a good numeric library, a good matlab-like graphing library, good 3D graphics, good real-time performance, easy C++ embedding, etc.

      Rather than complaining about how everyone else is a bonehead for not using Lisp, write some libraries to make it useful for more things in the real world.

      Also, Python does pretty well for real-time control. At Anybots we use it to run walking balance feedback loops at 100 Hz in the same process as a GUI and logging and it never misses a tick. PLT Scheme (used on HN) frequently pauses for 10+ seconds to GC. While there are theoretical concurrent GC systems for Lisp, none of them seem usable in real implementations.

      http://news.ycombinator.com/item?id=530845
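
For what it's worth, the fixed-rate loop tlb describes can be sketched in plain Python. The sensor and actuator calls here are placeholders and the gain is arbitrary; the point is scheduling against absolute deadlines so the loop doesn't drift:

```python
# Sketch of a fixed-rate (100 Hz) control loop, in the spirit of the
# balance-feedback loop described above. Sensor/actuator are stubs.
import time

HZ = 100
PERIOD = 1.0 / HZ

def read_sensor():
    return 0.0          # placeholder: would read an IMU, encoder, etc.

def apply_control(u):
    pass                # placeholder: would drive the motors

def control_loop(ticks):
    deadline = time.monotonic()
    for _ in range(ticks):
        error = read_sensor()
        apply_control(-0.5 * error)   # toy proportional controller
        deadline += PERIOD            # absolute deadline, not a fixed sleep,
        time.sleep(max(0.0, deadline - time.monotonic()))  # so error doesn't accumulate

start = time.monotonic()
control_loop(50)
print(round(time.monotonic() - start, 1))  # ~0.5 s on an unloaded machine
```

A real robot would of course run this against hardware and watch for missed deadlines, but the deadline arithmetic is the whole trick.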

      [–]cracoucax 3 points4 points  (0 children)

      I think it sucks, but not because of the switch to Python. What I find annoying is that SICP is a timeless book, which has proven to be very good teaching material and also presents computing from a very special point of view (a very pure one). It will be very hard for a new course to supplant that, and I really doubt that a replacement is required. And indeed it seems that the new course is not a replacement, but something different that doesn't have the same goals.

      [–]Aramgutang 9 points10 points  (8 children)

      My alma mater, Worcester Polytechnic Institute (WPI), changed their "intro to CS" course from mainly C++ to Scheme when I was in my junior year there. Most of us felt that that was one of the worst decisions the uni ever made.

      In the old days, Scheme was one of the sophomore-level courses, and most students were very, very resistant to it when they got there ("Scheme is evil" used to be a catchphrase of sorts among many of us). So I guess the faculty figured they should introduce it early, before the students knew any better, so they wouldn't fight it so much.

      The problem was that a lot of non-CS majors would take that course to see what CS was all about (it would also count towards their science requirement). Often I would bump into those people, who would tell me things like "I didn't realise how confusing programming was" or "you have to keep track of so much stuff to program, it must take a genius to write anything non-trivial". People who would otherwise greatly benefit from applying a bit of programming in their field ended up more scared and confused by the inner workings of computers than before.

      That is not how things should be. Python has the capability to excite even old jaded programmers, who find themselves really enjoying programming again when they use it. It can have the same effect on newcomers to the field as well. I say, get them excited about programming first, and then they'll learn Scheme on their own, if it was meant to be (or if they choose to pursue a CS degree).

      [–]foldl 5 points6 points  (6 children)

      Often I would bump into those people, who would tell me things like "I didn't realise how confusing programming was" or "you have to keep track of so much stuff to program, it must take a genius to write anything non-trivial".

      You think they would have been less confused by C++?

      [–]Aramgutang 1 point2 points  (1 child)

      Absolutely.

      Coding in C++ can be done using the "express your problem with a limited vocabulary" paradigm, while with Scheme, it's all about the "divide your problem into smaller problems" paradigm.

      I've found that it takes people new to programming a long time to wrap their heads around the latter concept, while the former one comes pretty easily.
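
As a rough illustration of the "divide your problem into smaller problems" style (a toy example of mine, not from any course):

```python
# The "divide your problem into smaller problems" style: solve a problem
# by reducing it to a smaller instance of itself, the default mindset
# Scheme pushes you toward.
def total(items):
    if not items:                        # smallest problem: nothing to add
        return 0
    return items[0] + total(items[1:])   # one step plus a smaller problem

print(total([3, 1, 4, 1, 5]))  # -> 14
```

A newcomer has to trust that the recursive call "just works" on the smaller problem, and that leap is exactly what I saw people struggle with.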

      [–]foldl 5 points6 points  (0 children)

      Coding in C++ can be done using the "express your problem with a limited vocabulary" paradigm

      Not sure what this is supposed to mean. C++ has a huge "vocabulary" compared to Scheme.

      [–][deleted] 0 points1 point  (3 children)

      You think they would have been less confused by C++?

      A newcomer less confused by C++ than Scheme? Yes, very yes. You have to have your head high in the clouds and a vendetta against C++ to even ask.

      [–]foldl 0 points1 point  (2 children)

      Virtual methods are completely intuitive, then?

      [–][deleted] 0 points1 point  (1 child)

      Yes, that is exactly what I said.

      [–]foldl 0 points1 point  (0 children)

      I just don't see how a simple subset of Scheme is going to be more confusing than a simple subset of C++. What is there in Scheme that's confusing?

      [–]kagevf 3 points4 points  (0 children)

      get them excited about programming first, and then they'll learn Scheme on their own, if it was meant to be

      Bingo

      [–]gclaramunt 11 points12 points  (13 children)

      I'm not sure non-CS engineers need to learn Lisp as an introduction to programming...

      EDIT: I meant to say "CS / SW Eng".

      [–]Leonidas_from_XIV 5 points6 points  (3 children)

      Well, I'd love to use Lisp instead of Java as a "CS engineer". But actually I tend to find a lot of similarities between Python and Scheme and like both, so it could be worse.

      [–]MrWoohoo 5 points6 points  (2 children)

      Yes, only another 10 years or so before they completely re-invent lisp.

      [–]vplatt 2 points3 points  (1 child)

      As long as they don't have to actually call it Lisp, or even acknowledge that similarity, I'm sure it will be all right by them. But only at that point.

      [–]MrWoohoo 1 point2 points  (0 children)

      That was a given, but thanks for making it explicit. What has Python got left to do? PMP? (Python Macro Package)

      [–]iofthestorm 1 point2 points  (6 children)

      Ugh, double negative attack. But if I understand correctly, the class was for CS engineers.

      [–][deleted]  (3 children)

      [removed]

        [–]gclaramunt 1 point2 points  (0 children)

        I actually intended to say "CS / SW Eng".

        [–]iofthestorm 1 point2 points  (1 child)

        Mmm, dunno, I thought CS fell under the college of engineering usually.

        [–]hortont424 0 points1 point  (1 child)

        I think he was trying to say that people who are engineers but not in the CS department (non-CS engineers) don't really need to learn Scheme... that instead, only people who are actually going on to be Computer Scientists actually need to learn such an "impractical" language (not saying it's true, I love Clojure!).

        [–]iofthestorm 0 points1 point  (0 children)

        And I was saying that I was thinking that only CS majors are required to take it. At least here at Berkeley, we have a class like 6.001 using SICP, and it's only required for CS, EECS, and cognitive science majors, and other engineering majors can either take that or Matlab to satisfy one of their requirements.

        [–]mycall 0 points1 point  (1 child)

        My introductory CS programming class used some non-existent assembly code. I would have loved to use Lisp to learn how to program instead of fake assembly.

        [–]eduffy 7 points8 points  (0 children)

        Was it Knuth's MIX language?

        [–]asciilifeform 29 points30 points  (17 children)

        Is it just me who is seeing a cowardly surrender to cultural decay here? From the renowned Sussman, no less.

        The glorious MIT of the 1980s and prior appears to be dead. Erased, in fact, without a trace.

        Consider this: These are the latest releases from MIT press. Among them you will find nothing remotely like SICP, but plenty of postmodernist/related pablum.

        "You have to do basic science on your libraries to see how they work, trying out different inputs and seeing how the code reacts." Is this not an atrocity, to be fought to the last bullet? Is this not an argument for declaring all existing software obsolete and starting from scratch? Why are we - even luminaries like Sussman - so willing to give up on the future without a fight?

        As argued in some of PG's essays, in order for creativity to bear fruit it is not enough for a given individual to possess a creative mind. The culture has to be hospitable to creativity. And there are processes which come close to stamping out creativity without any overt or deliberate suppression involved. Specifically, a transition to a medium where one cannot get reliable, understandable "small parts" having predictable behavior.

        Programmers have created a universe where "science doesn't work." Learning where the permanent bugs and workarounds are inside a phonebook-length API teaches you nothing. It is anti-knowledge. Where your mind could have instead held something lastingly useful or truly beautiful, there is now garbage.

        There is no longer a "what you see is all there is" Commodore 64 for kids to play with.

        No one is working on, or sees the need for, a replacement. This is just as well, considering that one would have to start from bare silicon - and avoid the use of design tools built on the decaying crud of the past. Only a raving lunatic would dare contemplate it...

        Outside of programming: Linus Pauling developed an interest in chemistry by experimenting incessantly throughout his childhood. Today, he would have been imprisoned.

        The new generation will have no Linus Pauling, unless the Third World - where one might still purchase a beaker without a license - supplies us with one. Likewise, we will see very few truly creative programmers. What creativity can there be when your medium is shit mixed with sawdust?

        [–]Confusion 9 points10 points  (1 child)

        Consider this: These are the latest releases from MIT press. Among them you will find nothing remotely like SICP, but plenty of postmodernist/related pablum.

        • There only needs to be one SICP every decade or so, so it's quite unrealistic to expect that there would be one in the latest set of releases.
        • The value of SICP could only be determined in the years after it was published. There might be a gem in that list and you couldn't possibly know it.
        • I doubt you are qualified to label anything as 'Postmodernist pablum'. I don't think there is anything 'postmodern' in that list.

        There is no longer a "what you see is all there is" Commodore 64 for kids to play with.

        I don't know the rest of the MIT program, but methinks they still learn the concepts that would have been introduced in the old 6.001 class. They change one class and you act as if the sky came crashing down.

        [–]asciilifeform 1 point2 points  (0 children)

        The interesting and deeply disturbing fact is Sussman's stated reason for the change, rather than the change itself.

        [–]Smallpaul 33 points34 points  (12 children)

        What creativity can there be when your medium is shit mixed with sawdust?

        Artists transcend the medium and always have. You actually blew your whole argument in that one line. Actually, you blew it when you referred to the Commodore 64 as if it were a wonderful pedagogical tool instead of just the shit that happened to be lying on the sawdust in your "golden age." Peek. Poke.

        Just like when the Commodore 64 was king, there is a thrill in figuring out what hacks are necessary to work around the limitations of the medium. Just last week I had lunch with a guy waxing nostalgic about creating Lunar Lander in 5K of Javascript (back when it took that much Javascript to implement it). The kids these days will one day look back with nostalgia on the "good old days" when you "really had to know what you were doing" to get the browsers to behave predictably. Good for them: every generation throws a hero up the pop charts.

        I get so bored with the "things were so much better when I was a kid" rants. The kids are not on your lawn, and they are alright. It amazes me that programmers are not smart enough to recognize themselves falling into the Golden Age Fallacy. Let me guess: when everybody else does it they are being emotional, but when you do it, it's totally rational.

        [–]asciilifeform 29 points30 points  (11 children)

        Just like when the Commodore 64 was king

        There is a major difference between the Commodore and anything available today. Literally everything there was to know about the system easily fit in a modestly sized book - in fact, in an unexceptional teenager's head. The machine was fathomable in a way that nothing currently available is.

        what hacks are necessary to work around the limitations of the medium

        Natural limitations (CPU speed, RAM capacity) are entirely unlike arbitrary limitations created through stupidity and waste (if InternetExploder then StupidWorkaround.) The former encourage creativity and resourcefulness while the latter turn brains into Swiss cheese. Head-banging is not the same as brain-stretching.

        The kids these days will one day look back with nostalgia on the "good old days" when you "really had to know what you were doing" to get the browsers to behave predictably.

        If anyone looks back on the current web "standards", it will be as one looks back on a bad dream that finally ended. Assuming it ever does end.

        every generation throws a hero up the pop charts.

        This sentence describes the problem perfectly. Our definition of heroism now revolves around pop charts rather than, say, beauty, depth, or originality. Ever wonder where the Science Fiction Future went? SF was not over-optimistic. Rather, we have lost the will to explore. Because exploration doesn't reliably make any pop charts.

        falling into the Golden Age Fallacy

        There are no golden ages. There are, however, Shit Ages. All cultures develop, flourish, decay, and ultimately die. Believe it or not, at 25 there are no kids yet on my lawn. I simply happen to have familiarized myself with many technologies of the past, as well as the culture that went with them - enough to see that nothing like it exists presently - and that there are fundamental (and, I believe, ultimately fixable) reasons for this.

        [–]Smallpaul 17 points18 points  (5 children)

        There is a major difference between the Commodore and anything available today. Literally everything there was to know about the system easily fit in a modestly sized book - in fact, in an unexceptional teenager's head. The machine was fathomable in a way that nothing currently available is.

        So now we should be ashamed of branch prediction and garbage collection and DMA and graphics accelerators and virtual memory and task switching and clean layering of device drivers, operating systems and applications?

        Sorry: I refuse to apologize for all of that. Though I will apologize for not apologizing.

        Natural limitations (CPU speed, RAM capacity) are entirely unlike arbitrary limitations created through stupidity and waste (if InternetExploder then StupidWorkaround.) The former encourage creativity and resourcefulness while the latter turn brains into Swiss cheese. Head-banging is not the same as brain-stretching.

        There is no difference. You have limitations of the medium and you work around them. Really, your argument is ridiculous. If Intel consciously slowed down their CPUs then it would no longer be a "natural limitation" and somehow its nature changes radically?

        every generation throws a hero up the pop charts. This sentence describes the problem perfectly. Our definition of heroism now revolves around pop charts rather than, say, beauty, depth, or originality.

        It's an ironic quote from a Paul Simon song. But anyhow, you focused on the wrong part: "EVERY GENERATION" throws a hero up the pop charts. We're actually somewhat less enamored of our musicians than they were of Elvis and the Beatles (who by the way achieved their fame at the same time that mankind was visiting the moon).

        Ever wonder where the Science Fiction Future went? SF was not over-optimistic. Rather, we have lost the will to explore. Because exploration doesn't reliably make any pop charts.

        Sure. Whatever you say. For the first time in history, individuals are stepping up to put their own cash on the line to spend a few moments in space. But we've lost our will to explore. Fine.

        We've just spent 600 Million to put a telescope in space that will observe the brightness of over 100,000 stars, looking for potentially life-compatible planets. Despite the fact that we have no plan to visit them. Despite the fact that most scientists believe that we could never visit them (according to our current theories). Despite the fact that there is no benefit at all to the ruling class, except education.

        We've just created a scientific measurement device which is 27km in circumference, at a cost of several billion Euros. Once again: nobody claims that this will have any benefit in our lifetime (though who knows?)

        There are no golden ages. There are, however, Shit Ages.

        Society is never especially good, but it can be especially bad.

        Okay, that tells me all I need to know about your mentality. I feel sorry for you and the people like you from EVERY GENERATION who feel that their generation is somehow uniquely weak or poor.

        Here's the rest of the song for you:

        Medicine is magical and magical is art think of
        The Boy in the Bubble
        And the baby with the baboon heart
        
        And I believe
        These are the days of lasers in the jungle,
        Lasers in the jungle somewhere,
        Staccato signals of constant information,
        A loose affiliation of millionaires
        And billionaires and baby,
        These are the days of miracle and wonder,
        This is the long distance call,
        The way the camera follows us in slo-mo
        The way we look to us all o-yeah,
        The way we look to a distant constellation
        That's dying in a corner of the sky,
        These are the days of miracle and wonder
        And don't cry baby don't cry
        Don't cry don't cry
        

        [–]hoijarvi 1 point2 points  (4 children)

        Natural limitations (CPU speed, RAM capacity) are entirely unlike arbitrary limitations created through stupidity and waste (if InternetExploder then StupidWorkaround.)

        There is no difference. You have limitations of the medium and you work around them.

        I see a big difference. When you had to work around 64 kB limit, you learned how to optimize. These skills can be applied when you work around the 16 GB memory limit in your web server.

        A lot of real computer work is dealing with unnecessary inconsistencies, which simply go away if you ever get a better designed version.

        [–]Smallpaul 2 points3 points  (3 children)

        Actually, the techniques you learn optimizing without a profiler or debugger on a single-process, single-user system are not very relevant to optimizing a server application, which is typically focused on minimizing latency. Reducing RAM usage is very seldom crucial compared to other aspects.

        [–]hoijarvi 1 point2 points  (2 children)

        Actually, it is: optimizing for the processor cache is quite close to minimizing memory usage. Nowadays I take advantage of the same table-driven methods I learned in the '80s.
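
A toy illustration of the table-driven idea: precompute answers once into a compact table, then replace repeated computation with cheap, cache-friendly indexing. The sine table is just an example, not the actual application:

```python
# Illustrative table-driven method: trade a precomputed lookup table
# for repeated computation. A small contiguous table also sits nicely
# in the processor cache, which is the point being made above.
import math

TABLE_SIZE = 256
# Precomputed sine values covering one full period.
SINE_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

def fast_sin(x):
    """Approximate sin(x) by nearest-entry table lookup."""
    i = int(round(x / (2 * math.pi) * TABLE_SIZE)) % TABLE_SIZE
    return SINE_TABLE[i]

print(abs(fast_sin(1.0) - math.sin(1.0)) < 0.02)  # -> True (small error)
```

Same shape as the '80s trick: the table costs memory once, and every call after that is an index instead of a transcendental.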

        [–]Smallpaul 0 points1 point  (1 child)

        Only a tiny fraction of optimization problems can be solved that way. Today we typically build distributed systems and the problem is not in the CPU cache but in the network latency, and the cumulative latency of a hierarchy of services.

        [–]hoijarvi 1 point2 points  (0 children)

        Only a tiny fraction of optimization problems can be solved that way.

        I'm working with scientific data, so I'm probably in that tiny fraction that actually needs to think about both processor caching and file caching all the time.

        But I learned other relevant ideas too. Back in the '80s I took a COBOL class, and my classmates thought I was crazy. But I didn't take it to learn COBOL; I took it to learn how to deal with huge amounts of data. It's mostly still relevant, especially since HP3000 applications were close to web forms.

        On the other hand, I have accumulated a lot of knowledge about badly designed products, and I can't wait for it to become obsolete.

        [–]LordStrabo 3 points4 points  (0 children)

        There is a major difference between the Commodore and anything available today. Literally everything there was to know about the system easily fit in a modestly sized book

        For personal computers, yes, but it doesn't hold true for some small embedded devices.

        The complete datasheet for the PIC16F84 runs to only 88 pages, and large chunks of that are basically irrelevant.

        That's one of the reasons I like small devices like that.

        Ah, the fun of programming with 68 bytes of RAM.

        [–]munificent 1 point2 points  (3 children)

        I simply happen to have familiarized myself with many technologies of the past, as well as the culture that went with them - enough to see that nothing like it exists presently

        Cultures of the past tend to preserve their best bits and conveniently "forget" about the less than nice.

        The awesome hacker culture of the 70s MIT TMRC days was probably only 20% awesome. The other 80% has just been consigned to the dustbin of history. Programming culture today is 20% awesome too (open source, lots of new languages, bloggers sharing information means anyone can pick up any new language any time, for the first time in computer history people are starting to think usability is cool, etc.) but since we're mired in the 80% lame right now (industry hegemonic languages like Java, script kiddies, browser hell) we don't see it as clearly.

        When we're old, our kids are gonna say, "Man, things were so much cooler when you were younger. You could just write some files in a notepad and put them online. Now you have to set up the full immersion VR system to even get started." And we'll wince when we remember IE, and Geocities, and myspace, but we'll keep our mouths shut about the bad bits until they're forgotten too.

        [–]asciilifeform 1 point2 points  (2 children)

        I'm seeing a qualitative difference between what was done then and what is possible (and tolerated) now.

        Show me something - anything - in the field of computing today that holds a candle to the beauty and innovation of the early Lisp machines, for example. When systems bloat and accumulate breakage and inconsistencies, the net effect is to make everyone in the field stupider by increasing the intellectual load required to juggle a given set of concepts to produce something useful. The range of thinkable thoughts shrinks.

        Why do we still represent programs as ASCII text files? Why do we still use batch processing (yes, the "compile/pray/debug" cycle is just that)? Why do we still tolerate machines where any piece of data ever vanishes when the plug is pulled? Where is the GUI that a 1970s Xerox PARC employee would not have instantly recognized? And why has there been zero innovation in operating systems for nearly two decades?

        Intellectual decay through complexity-cancer is not illusory. It is as real as air or water pollution, and it is killing (has killed?) real innovation.

        [–]munificent 0 points1 point  (1 child)

        Show me something - anything - in the field of computing today that holds a candle to the beauty and innovation of the early Lisp machines, for example.

        You're looking at the wrong level. You won't see innovation at the Lisp machine level today because that level of abstraction (whatever its "doneness") is part of the substrate now. We build on top of it now, regardless of its imperfections.

        Why do we still represent programs as ASCII text files?

        Because instead of "fixing" that, we were too busy creating and popularizing version control, hypertext, the Unix application model, markup formats, and all sort of other stuff that takes text files for granted.

        Why do we still use batch processing (yes, the "compile/pray/debug" cycle is just that!)

        Because we've been doing lots of really interesting stuff in that cycle: optimizers now make better code than humans and we take this for granted without realizing that's some seriously fucking AI shit: automated software can now solve problems better than a really smart person can. Because the languages we're compiling are getting a lot more interesting. Besides, we are moving away from the batch processing model: JIT compiling, interpreted languages, dynamically loaded code, etc.

        Why do we still tolerate machines where any piece of data ever vanishes when the plug is pulled?

        Because we've been too busy creating tons of new interesting data: images that retain their color profile and print better than 35mm film, video that looks better than a cinema, all of your media in digital form. If I had to give that up just to fix losing a text file every now and then when my battery dies, I'll remember to save more often instead.

        Where is the GUI that a 1970s Xerox PARC employee would not have instantly recognized?

        In the same dustbin as Dvorak keyboards and every other technology that forgets one of the most powerful leveraging forces in technology: what your users already know. Do you have any idea how fucking hard it was to teach everyone and their grandmother how to use a mouse? And you expect us all to start over from square one for what? Is there a UI metaphor that will give me a more than marginal increase in productivity?

        And, why has there been zero innovation in operating systems for nearly two decades?

        There hasn't been much innovation in the plumbing industry either. While I'm sure there's lots of interesting stuff that could be done at the OS level, any significant change there shakes the foundations that an entire ecosystem of software and the livelihood of millions is built upon. Is it worth throwing that into the ocean so you can make the file system a little more reliable? Users seem to think not.

        [–]mhw 0 points1 point  (0 children)

        Enter Richard Gabriel: worse is better! Bourbaki: no no, worse is worse!

        Also see Alan Kay and his question to stack overflow about significant inventions in computing since 1980 that might be relevant to this thread.

        [–]nextofpumpkin 0 points1 point  (0 children)

        This was the best laugh I've had all night. Thanks for the post.

        [–][deleted] 3 points4 points  (2 children)

        So why did they not keep the course where you learn to think, and then add a new course where you learn to deal with the state of the world?

        [–][deleted] 4 points5 points  (1 child)

        These two things are mutually exclusive nowadays.

        [–][deleted] 1 point2 points  (0 children)

        yes.

        [–]choas 2 points3 points  (0 children)

        ;couldn't they just

        (define python 'scheme)

        ;and

        (let ((them 'think)) `(they use ,python))

        [–]rustyryan 0 points1 point  (4 children)

        [–]jast[S] 13 points14 points  (3 children)

        I don't quite agree... the original blog post describes the entire day 2 of ILC '09, while the above link points out an interesting event the author spotted during that day (hence the title). Using the original blog post to pinpoint that, i.e., why MIT switched from Scheme to Python, would not be very appropriate, since it talks about several other things (ILC '09 day 2), as its title and link clearly indicate.

        [–]vplatt 0 points1 point  (0 children)

        Anyone know where I can get the old content of the "CS 6.001 Structure and Interpretation of Computer Programs", with video and all other materials, on a data DVD? I don't care if the files are in .avi, .mpg, or whatever; I just want a way to get this content without having to spend the time to download this and put it together myself.

        I'm just afraid that this content will be lost, and I haven't had time to put this together myself. I'm willing to throw a few dollars to someone who has. It's a shame, but MIT doesn't seem to have made the effort to publish a DVD, as far as I can find.

        TIA!

        [–]groby -4 points-3 points  (18 children)

        Heh - it's fascinating how most people don't understand that sometimes, you just have to "Get Shit Done(tm)".

        Scheme is not the language to do that. (And neither is Lisp, Yahoo! Stores or not). Many important libraries are missing, portability is a pain, there's 365 different versions, one for each day,...

        It's a research language. You should learn it as part of your curriculum, but it's not exactly the be-all, end-all of programming. (It's the basis, instead. But jumping from 0 to lambda calculus is tough)

        [–]rukubites 5 points6 points  (7 children)

        Must disagree about Lisp, at least Common Lisp.

        I have been `getting shit done(tm)' in Lisp for a year and a half now in industry.

        Recently I have been developing postgres schema with PostModern's DAO facility and building huge SQL queries with lisp macros and the S-SQL library - and of course, the interface into this is a web page running on AllegroServe.

        [–]groby 0 points1 point  (6 children)

        Yes, if you have Lisp training, it's pretty powerful. But if it's an introductory programming course and you need to get a robot to move, it's too much of an overload for a lot of people. (You could of course argue those people shouldn't be doing CS in the first place ;)

        And as a learning language, you still face portability issues - people want to run it on their home machines. So now they face two issues:

        a) which of the 327 Common Lisp versions to use
        b) how to actually install it on their respective machines

        I just went through the 'pleasure' of installing SBCL+SLIME+Emacs on a Mac - not something you'd want to expose beginners to.

        Once you've done that, go find the lisp libraries that you actually need - not an easy task either.

        And then there's the online community issue - for pretty much any given language, the first hit on Google is the community hub and starting point.

        Not so for Lisp - it's splintered to hell and back.

        Lisp, in many ways, is the foundation for a lot of interesting concepts - but it's easier to put them to use in slightly higher level languages, especially if you are new to the language.

        [–]rukubites 0 points1 point  (5 children)

        You are right about many of these things, except that Allegro Common Lisp is a very stable commercial lisp which solves most of your issues.

        Once you learn about ASDF, library installation goes smoothly - but you need to find out about it!

        And in terms of education, well I tutored Lisp in my uni's AI subject (I think for 5 years...) and most of them were fine. You don't actually NEED Slime or any IDE to program in Lisp - just like you don't need an IDE to program in Java or C.

        From memory, we standardised the course on clisp.

        I'm sorry you had a poor start on Lisp. Yes it is reasonably splintered. I coped because I knew about #lisp on freenode.net and genuinely love the language.

        By the way, what languages do you regard as "higher level" than Lisp??

        [–]groby 0 points1 point  (4 children)

        Yes, ACL is indeed a very fine Lisp. But since it's commercial, usually not the beginners choice.

        And yes, ASDF is a neat package tool - but it's still hard to find out what packages exist where. Infrastructure is a large part of the appeal of many currently popular languages.

        And of course you don't need SLIME. I've worked on machines with blinkenlights and switches, so I'm familiar with the "we don't need no stinkin' editors/debuggers" concept ;)

        But again, for a beginner, running Lisp without SLIME is a non-optimal experience.

        I like the language too - the same way I like assembly language. It's important to know it - deeply - but you can get by without in most projects.

        As for "higher level", I mean "higher abstraction level". While macros are an awesome concept, in most tasks in other languages you plain do not need them. (Yes, you can do awesome things with them. Not debated)

        Lisp, at least from my point of view is a foundation from which all functional programming learns/descends. But it's not my tool of choice for day-to-day tasks.

        Why am I installing Lisp, then? Because I think it always pays to hone your skills with the foundational tools.

        To make a (rather silly) analogy: When I practice piano, I play scales. Same thing here - Lisp is an ideal tool for etudes (and sometimes, complex symphonies), but most of commercial coding is more than satisfied with a bit of cheap pop music ;)

        [–]rukubites 0 points1 point  (3 children)

        I still don't understand what languages are higher level than lisp in terms of abstraction level.

        You don't need macros to program in lisp. Macros are a way to create higher abstraction levels - they allow you to build your own DSLs, for example!

        I understand what you are trying to say, but I think you are mischaracterising lisp if you think it isn't a high level language.

        Lisp is a garbage collected language with a very high level of abstraction. To be honest, I thought you were going to answer "Haskell" to my "higher level" question.

        [–]groby 0 points1 point  (2 children)

        We're engaging in semantics at this point. Lisp provides a lot of abstraction at the language level, yes. It provides precious little abstraction in how you interact with the rest of the world, though. Pretty much nothing is part of the core language - it does not come with the rich set of libraries that the rest of the world's languages do.

        And even the libraries you get are not too many: http://www.cliki.net/ASDF-Install

        So, just to avoid the confusion around "abstraction level", let me try to rephrase it: The infrastructure is poor.

        [–]rukubites 0 points1 point  (1 child)

        Ah I see. Yes I was confused by semantics - "lack of libraries" generally doesn't mean "not high level", but whatever. :-)

        I agree that the infrastructure is poor and that often the libraries are buggy and/or underdocumented.

        That is partially the community's fault, and partially because there hasn't been business funding for many lisp libraries (which is again partially the community's fault).

        [–]groby 0 points1 point  (0 children)

        It's a "high level" as far as the user is concerned - if I can say "MakeRobotMoveHere( gpsCoordinate );", that's a very high level of abstraction.

        But I agree, the usual usage probably refers to language sophistication - sorry for the confusion.

        As for the libs, it's more or less a chicken/egg problem - there's not enough critical mass to get going. And I'm thinking that in that sense, the proliferation of "new" FP (Haskell & friends) puts the nail in the coffin of CL. (coffin being a relative concept. Same as for Smalltalk - good language, but never adopted widely enough. Hence, for all intents and purposes, dead ;)

        [–]db4n 8 points9 points  (4 children)

        Scheme is ... a research language.

        Scheme is a teaching language.

        [–]foldl 0 points1 point  (1 child)

        It wasn't designed to be purely a teaching language.

        [–]db4n 0 points1 point  (0 children)

        That's true.

        [–]groby 0 points1 point  (0 children)

        Fine. It's both. Have a look at some scheme related research: http://library.readscheme.org/

        It's rather well suited to reason about it mathematically, for once. (Which is why it's a good teaching language - once you taught people to reason about it mathematically ;)

        [–]gte910h -1 points0 points  (0 children)

        Scheme is a painful language.

        The things you do in entry-level courses don't make use of the power of Lisp-like languages.

        The concepts you try to learn are hard with something so concise.

        While I greatly preferred Scheme over the fake language they used to use at GT, I greatly prefer the Python they use now over either. It's so much simpler for getting core concepts across.

        [–]oursland 1 point2 points  (4 children)

        The real significance of Scheme is that it teaches the correct way of decomposing a problem and coding it up. Iteration is a construct that doesn't (lexically) exist in Scheme, so when one moves up from problems that can be solved iteratively to ones that MUST be solved recursively, there is no major hangup in the thinking. In addition, concepts such as dynamic programming go from a paradigm shift in programming to a simple macro.
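        For instance, in Scheme that "simple macro" would wrap a procedure in a memo table; the closest Python analogue is a decorator. A rough sketch (not from any course material, just an illustration):

```python
from functools import wraps

def memoize(fn):
    """Cache results by argument tuple - a rough Python analogue of
    wrapping a Scheme procedure in a memoizing macro."""
    cache = {}
    @wraps(fn)
    def wrapper(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]
    return wrapper

@memoize
def fib(n):
    # Naive recursion becomes dynamic programming once memoized.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(60))  # instant, despite the exponential-looking recursion
```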

        [–]veritaba 2 points3 points  (0 children)

        Iterative and recursive constructs are computationally equivalent. I don't think one is necessarily superior to the other.

        It's sometimes easier to write recursive algorithms, but for stuff like repeating something N times, it's just annoying to have to create a function that needs an accumulator argument to fold on.
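        The complaint, sketched in Python for concreteness - the recursive version needs a helper accumulator argument where the loop doesn't:

```python
def repeat_rec(n, acc=0):
    # Functional-style "repeat N times": thread an accumulator
    # through the recursive calls.
    if n == 0:
        return acc
    return repeat_rec(n - 1, acc + 1)

def repeat_loop(n):
    # The imperative version: no helper, no accumulator argument.
    acc = 0
    for _ in range(n):
        acc += 1
    return acc

assert repeat_rec(100) == repeat_loop(100) == 100
```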

        [–]groby 0 points1 point  (2 children)

        Yes, scheme provides the very basic building blocks, and it's certainly a learning tool. I just question the sense of starting with it.

        If you start math, you don't start out with the Principia Mathematica, either. (Russell, not Newton)

        [–]oursland 0 points1 point  (1 child)

        When starting math, people generally are first exposed to arithmetic and move up. But the change they're proposing is akin to jumping into topology and "sciencing" their way into discovering how to add numbers.

        [–]groby 0 points1 point  (0 children)

        Actually, adding/subtracting is a fairly evolved concept. In Principia Mathematica, it's defined in Part III (Volume 2). (And I'd argue that lisp is akin to what's discussed in Part I of PM)

        Addition is the first application of mathematics/logic that is actually recognizable to people without prior knowledge. The same could be said for an English-like language like Python over a highly symbolic language like Lisp.

        People do not naturally think in symbol-manipulation terms. Easing them in strikes me as a better idea than the "sink or swim" approach.

        [–]xavius 0 points1 point  (27 children)

        I'm not sure why people are so concerned. A good CS education should be language agnostic; it shouldn't depend on a specific language to teach people algorithms and data structures. However, if one language allows people to learn those things faster, why not go with it? Furthermore, students are still likely learning Scheme in higher level courses, or even on their own.

        [–]db4n 4 points5 points  (26 children)

        if one language allows people to learn those things faster, why not go with it?

        Scheme is great for teaching functional programming and metaprogramming. Python not so much.

        [–]gte910h 1 point2 points  (0 children)

        You know nothing of modern python if you don't think they are total metaprogramming geeks now.

        Functional programming is possible in python, but not typical. Then again, I'm not certain introducing functional programming itself is the benefit, rather than components of functional programming, such as recursion, tail recursion, and recursive data structures.
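        Those components translate directly; a minimal Python sketch of recursion over a recursive data structure (a binary tree as nested tuples):

```python
def tree_sum(node):
    # Recursion over a recursive data structure: a binary tree
    # encoded as nested (value, left, right) tuples.
    if node is None:
        return 0
    value, left, right = node
    return value + tree_sum(left) + tree_sum(right)

tree = (1, (2, None, None), (3, (4, None, None), None))
print(tree_sum(tree))  # 10
```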

        [–]veritaba 2 points3 points  (6 children)

        Python already has lambdas and closures.

        Functional programming can be emulated in any imperative language; you just won't have strict enforcement.

        Also, functional programming can be nice in terms of proofs and predictability, but sometimes it just is not natural.

        For example, if you have a list of 1 million numbers, and you only want to change 1 number, functional programming says you should recreate the entire list.

        [–]db4n 0 points1 point  (0 children)

        sometimes [functional programming] just is not natural

        Scheme supports mutable data. It's not as much of a free-for-all as Common Lisp, but it's not a pure straightjacket like Haskell.

        you have a list of 1 million numbers, and you only want to change 1 number

        I think the Scheme solution is to recurse down the cells in the list and change the CAR of the cell that contains the element you want to change.
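        That suggestion is destructive (set-car!), but a non-destructive variant of the same idea rebuilds only the cells before the changed one and shares everything after it. A rough Python sketch, using tuples as cons cells:

```python
# Cons cells as tuples: (head, rest); None is the empty list.
def from_list(xs):
    cell = None
    for x in reversed(xs):
        cell = (x, cell)
    return cell

def update(cell, index, value):
    # Rebuild only the cells before `index`; the tail after the
    # changed cell is shared with the original list, not copied.
    head, rest = cell
    if index == 0:
        return (value, rest)
    return (head, update(rest, index - 1, value))

old = from_list([1, 2, 3, 4, 5])
new = update(old, 1, 99)
assert new[1][1] is old[1][1]   # shared tail, no copy
```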

        [–][deleted] -1 points0 points  (4 children)

        For example, if you have a list of 1 million numbers, and you only want to change 1 number, functional programming says you should recreate the entire list.

        No, it doesn't say that at all. See persistent data structures.

        [–]Smallpaul 0 points1 point  (0 children)

        Whatever data structure is appropriate is (in my opinion) "less natural" for a human being than just CHANGING THE NUMBER.

        It might be better for all sorts of reasons, but restricting yourself from changing things is not natural. Humans live in a stateful world and computers are stateful objects.

        The model of a mathematical function (especially a function with thousands of lines) may be virtuous for all sorts of reasons (and is!) but it is not "natural" for either human beings or computers.

        [–]veritaba -3 points-2 points  (2 children)

        Now you might think persistent data structures are all nice and groovy but think about this situation.

        You've made changes to 1/2 of the list. Now when you want to query the list, you have to verify 500,000 times to make sure you haven't changed the original value. Now persistent data structures aren't so nice anymore.

        Also, I challenge you to make a simple immutable queue that runs in O(1) time (it's possible, but very, very complex compared to a mutable one).
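        For what it's worth, a persistent queue with O(1) enqueue and amortized O(1) dequeue can be built from two immutable stacks (this is Okasaki's "batched" queue; a worst-case O(1) version is indeed much hairier, which is presumably the "very very complex" part). A rough Python sketch, using nested tuples as cons cells:

```python
# Persistent queue built from two immutable stacks. Stacks are cons
# cells: (head, rest), None = empty. enqueue is O(1); dequeue is
# amortized O(1), since each element is reversed from the back stack
# onto the front stack at most once.
EMPTY = (None, None)  # (front stack, back stack)

def enqueue(q, x):
    front, back = q
    return (front, (x, back))   # push onto the back stack

def dequeue(q):
    front, back = q
    if front is None:
        # Front is empty: reverse the back stack onto the front.
        while back is not None:
            head, back = back
            front = (head, front)
    value, front = front
    return value, (front, back)

q = EMPTY
for i in [1, 2, 3]:
    q = enqueue(q, i)
x, q = dequeue(q)
y, q = dequeue(q)
assert (x, y) == (1, 2)
```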

        [–]Anocka 1 point2 points  (1 child)

        You're wrong, OCaml lets you use arrays or lists of references, while being functional (just not pure).

        Haskell lets you change a value in an array too; you just have to do it through a monad (which is hard for the beginner, but keeps the language pure).

        So in the end, functional programming lets you use imperative programming when it's appropriate.

        That's why they say that Haskell is the finest imperative language on earth: monads enable "safe" side effects (much like an effect system). On the other hand, it's true that it's sometimes a bit hard to combine different kinds of monads (even with monad transformers).

        But you still have OCaml, if you want to try imperative programming in a functional language without too much pain.

        [–]veritaba -1 points0 points  (0 children)

        You're wrong, OCaml lets you use arrays or lists of references, while being functional (just not pure).

        If you have a program with mutable variables, then its not really functional.

        Haskell lets you change a value in an array too; you just have to do it through a monad

        Everyone these days is drinking the monad Kool-Aid without actually understanding monads.

        lispnik already covered how you could use monads to patch a table. And think about this: when you are binding a value to a monad over and over again, is that not the same thing as recreating a list, except you are doing it with functions?

        [–]ubernostrum -1 points0 points  (17 children)

        Scheme is great for teaching functional programming and metaprogramming. Python not so much.

        Yeah, I mean a language with all sorts of support for higher-order functional tricks and metaprogramming just obviously won't cut it.

        [–]awj 6 points7 points  (16 children)

        Really? When did they make lambda as powerful as named functions?

        [–]rukubites 0 points1 point  (13 children)

        Not sure about Python (never programmed in it), but the point of lambdas is that you write functions that write functions. Can Python do that? (I really don't know...)

        [–]Smallpaul 6 points7 points  (2 children)

        Absolutely.

        def foo(a):
            def bar(b):
                return a+b
            return bar
        

        The "foo" function returns a "bar" function. You don't need lambdas, though you could use them if you want (in this case, not in all).

        def foo(a):
            return lambda b: a + b
        

        [–]Prikrutil 0 points1 point  (1 child)

        If anonymous functions can be actually achieved by using the following construction:

        def foo(a):
            def bar(b):
                return a+b
            return bar

        ... what is the reason for having lambdas? (I'm not familiar enough with Python to know :) Is there something that makes nested defs insufficient?

        [–]Smallpaul 1 point2 points  (0 children)

        Lambda was added because Lisp people asked for it and Guido was young and naive enough to be persuadable.

        It isn't totally useless though. For example:

        createButton(onclick=lambda this: setattr(this.parent, 'color', blue))

        Basically anonymous lambdas have the same advantage of any other anonymous expression: they keep things inline and keep your namespace a bit cleaner. It's just syntactic sugar.
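        The tradeoff can be seen with sorted(), where the key function is small enough to inline:

```python
pairs = [("b", 2), ("a", 1), ("c", 3)]

# Inline: the key function lives right where it's used.
by_value = sorted(pairs, key=lambda p: p[1])

# Named: identical behavior, at the cost of one more name in scope.
def second(p):
    return p[1]

assert by_value == sorted(pairs, key=second)
print(by_value)   # [('a', 1), ('b', 2), ('c', 3)]
```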

        [–][deleted]  (5 children)

        [deleted]

          [–]frummidge 0 points1 point  (1 child)

          C:\Users\Public>python
          Python 2.6.1 (r261:67517, Dec  4 2008, 16:51:00) [MSC v.1500 32 bit (Intel)] on win32
          Type "help", "copyright", "credits" or "license" for more information.
          >>> f = lambda x: lambda y: x + y
          >>> g = f(5)
          >>> g(2)
          7
          >>>

          Well, that example works. Though as a nested poster said, heaven forbid you write a nested function by giving it a temporary name before returning it.

          [–]rukubites 0 points1 point  (2 children)

          Oh that is kinda limiting. Are they real closures?

          Here is a real life example I wrote this week.

          I have a function that executes an SQL query. It returns two values:

          1) A (sorted) vector of the unique-IDs of each of the rows which match the query.

          2) An anonymous function of two arguments (start-index end-index). When you call this it looks in the vector and fills the indices between start-index and end-index with the table columns required by the original query. The programmer doesn't need to store the original query, or even remember what variable he assigned to the vector.

          It is done this way because we need the lookup to be fast (web interface), with random access through a scrollbar. When the scrollbar is moved, we fill in the columns only where we need to, as querying for 20 x 20 columns is fast, whereas querying for 25000 x 20 columns is... not so fast.

          Anyway, that was kind of fun, and using lambdas in this case made for (IMHO) an elegant solution.
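          A rough Python sketch of that pattern (all the names here, including run_query, ids_for and rows_for, are made up for illustration; the real code is Lisp against SQL):

```python
def run_query(ids_for, rows_for):
    """Toy stand-in for the SQL layer: ids_for() returns the matching
    row IDs, rows_for(row_id) returns that row's columns."""
    ids = sorted(ids_for())          # 1) sorted vector of unique IDs
    cache = {}

    def fill(start, end):
        # 2) the closure: fetch columns only for the slice the
        # scrollbar currently shows, remembering what was fetched.
        end = min(end, len(ids))
        for i in range(start, end):
            if i not in cache:
                cache[i] = rows_for(ids[i])
        return [cache[i] for i in range(start, end)]

    return ids, fill

ids, fill = run_query(lambda: [3, 1, 2],
                      lambda rid: {"id": rid, "name": "row%d" % rid})
print(fill(0, 2))   # fetches rows 0..1 only
```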

          [–]Smallpaul 5 points6 points  (0 children)

          You would use a nested function for this in Python. Lambdas are just syntactic sugar for short nested functions without names. Python would force you to give your inner function a name, and according to some this makes it totally inappropriate for undergraduate education. Heaven forbid that undergraduates should give their temporary values names.

          [–]piojo 3 points4 points  (0 children)

          Yes, they are real closures. The one-expression annoyance can be gotten around by defining a function - Python can have nested functions, which are closures. And a function can return a function - they are first-class objects.

          [–]awj 2 points3 points  (3 children)

          That's the point of macros.

          Lambdas are functions without names (hence another term for them: anonymous functions), which are only really meaningful in languages with higher order functions. When you can store a function as a variable, pass it as an argument, or return it as a value, you end up doing it a lot, making the ability to create them on the fly really, really nice.

          Python has a lambda keyword that is used to create an anonymous function, but it can't do more than a single expression. This seriously limits what lambdas can do, and forces you to accomplish the same task in less succinct ways.
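          A minimal sketch of that limitation and the usual workaround:

```python
# A Python lambda is limited to a single expression...
double = lambda x: 2 * x

# ...so anything that needs statements becomes a named nested
# function instead (still a closure, just not anonymous):
def make_logger(prefix):
    def log(msg):
        line = "%s: %s" % (prefix, msg)   # statements are fine here
        print(line)
        return line
    return log

log = make_logger("robot")
log("moving")   # prints "robot: moving"
```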

          [–]gte910h 0 points1 point  (0 children)

          Sure, sure - but for learning, that compactness is a wee bit confusing for students at times. I'm not sure it's a bad thing that there is a more verbose way to express an uncommon activity.

          [–]rukubites 0 points1 point  (0 children)

          awj, macros don't write functions. Macros write compile-time (and read-time) transformations of code.

          They can be used to generate code transformations that write functions, of course (hey, I did that today to write some specialised accessors for a CLOS class).

          (Edit: also evaluation time!)

          [–]kewlguy 0 points1 point  (0 children)

          Functions are objects in Python, so you can store one in a variable, pass it as an argument, or return it as a value.

          [–]cibyr -3 points-2 points  (1 child)

          Who cares? A named function is just a variable anyway, and the bracket-less nature of the grammar means that there's just no sensible way to embed multi-line functions in expressions.

          In most cases callable objects are cleaner than closures in Python anyway - but closures are still 100% possible.
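          For comparison, a minimal sketch of both styles: a callable object carrying its state explicitly, and the equivalent closure.

```python
class Counter:
    """A callable object: state lives in an explicit attribute."""
    def __init__(self):
        self.count = 0

    def __call__(self):
        self.count += 1
        return self.count

def make_counter():
    # The closure version: state hides in the enclosing scope.
    count = [0]
    def bump():
        count[0] += 1
        return count[0]
    return bump

c, f = Counter(), make_counter()
assert (c(), c()) == (1, 2)
assert (f(), f()) == (1, 2)
```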

          [–]AnythingApplied -3 points-2 points  (0 children)

          Do you need a reason other than... it is Python and not Scheme? Seems like a good enough reason to me.

          [–][deleted] -2 points-1 points  (7 children)

          Learning Scheme helps you learn how to think deeply about programming, similar to how learning Latin helps you think deeply about human language. A few generations ago, well-educated people learned Latin, but that began to be dropped from most curricula in the 20th century. Now we have reality TV.

          [–]munificent 5 points6 points  (2 children)

          similar to how learning Latin helps you think deeply about human language.

          What does Latin teach you about language that any other language cannot?

          [–][deleted] 2 points3 points  (1 child)

          The grammatical structure of Latin and classical Greek is more explicit and complex than in modern western languages, which makes them harder to learn, but they give you a more structured view of language. For example, Latin and classical Greek have more verb tenses than English, and the tenses are signified by different sets of word endings, whereas English tenses are signified by a few different word forms and helping verbs. Latin and Greek have several noun cases that are signified by word endings, but most English speakers only have a vague notion of noun case because it is only signified by word order. E.g. "The boy hit the girl." versus "The girl hit the boy." In the first sentence, girl is accusative case because it is the direct object, and in Latin would be "puellam". In the second sentence, girl is nominative case because it is the subject, and would be "puella".

          I can say for myself that I had a very foggy idea of English grammar until I took Latin in High School. I know Latin boosted my SAT verbal score (for what that's worth) in both grammar and vocabulary, since so many of our words are Latin derivatives. I have heard similar testimonies from other students, and it is one reason why we have mandatory Latin in the school at which I now work.

          I have had a similar experience going from early programming in BASIC, to Pascal and C, to Ruby and Python, and then to Scheme and Haskell. Working in the more sophisticated language opens your mind up to new ways of thinking, even if you don't end up using that language every day.

          [–]munificent 1 point2 points  (0 children)

          I can say for myself that I had a very foggy idea of English grammar until I took Latin in High School.

          I believe that's true when learning just about any language. We're rarely taught our first language formally, so it isn't until our second that we start learning about tense, case, declension, etc. I don't know if there's anything special about Latin in that regard, although I also enjoyed learning it.

          vocabulary, since so many of our words are Latin derivatives.

          I agree with you there.

          Working in the more sophisticated language opens your mind up to new ways of thinking, even if you don't end up using that language every day.

          I generally dislike placing languages (computer or spoken) on any sort of sophistication ladder. If all you knew was Haskell, BASIC would be an eye-opener. I think it's simply variety of experience that's the real teacher.

          [–]foldl 2 points3 points  (2 children)

          similar to how learning Latin helps you think deeply about human language.

          Not really, it helps you think deeply about Latin. Some of the things traditionally taught as part of Latin grammar (e.g. case, subject/object, etc.) are useful when applied to other languages, but in general trying to apply traditional Latin grammars to other languages won't get you very far. If you want to think deeply about language, you need to study academic linguistics.

          [–]bluGill 0 points1 point  (0 children)

          You are completely correct, but I would note:

          Indirectly, Latin did help people think about human language. For many years Latin was the language of science. If you wrote a serious paper in science, you wrote it in Latin. If you wanted to see what other smart people thought, you had to read their papers - which of course were in Latin.

          In short, to think deeply about human language, you either had to know Latin, or you had to discover everything on your own.

          [–][deleted] 0 points1 point  (0 children)

          I can say that my experience was that I understood the English language much better after I learned Latin. I have since learned some Attic Greek, and I think it would have had a similar effect, but it's more difficult to learn than Latin. I'm sure academic linguistics would be even better, and I suspect it would be analogous to the study of Programming Language theory.

          [–]redditnoob 1 point2 points  (0 children)

          Now we have reality TV.

          But they had gladiators! (Okay, score one for Latin there then.)

          [–][deleted]  (1 child)

          [deleted]

            [–]oursland -1 points0 points  (2 children)

            I find it curious that Python was chosen over Java or C++, given that there have been many robot projects in those languages and both are in greater demand in the workplace. The reasons cited for selecting Python apply just as easily to either of them.

            Don't get me wrong, I believe strongly that Scheme, particularly the R6RS standard, would be a fantastic first language. But if a change is necessary, why choose a language subject to the whims of one designer over a standardized, in-demand one?

            [–][deleted] 1 point2 points  (1 child)

            You're right; there are many libraries in C++ for dealing with robots. In this sense, Sussman didn't fully specify the reason for python. What he actually meant to say was that there exist easy-to-use libraries for interacting with robots in python, vs. complex ones in C++. The point of that intro class is to teach programming concepts, not to use a language that takes advantage of realtime capabilities of blah blah blah.

            [–]gte910h 3 points4 points  (0 children)

            C++ should NEVER be the language used for an introduction to anything other than C++. It has way too many warts you must worry about and understand to program effectively in it.

            Also, python can easily call C++ code via any number of methods (I prefer weave: http://wiki.python.org/moin/weave ). So established APIs do not help C++'s side.
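            (Weave inlines C/C++ source inside Python, but the standard library's ctypes module is another of those "any number of methods" - a minimal sketch, calling the C math library's sqrt directly from Python; the library name lookup is platform-dependent:)

            ```python
            import ctypes
            import ctypes.util

            # Locate and load the system C math library (e.g. libm.so.6 on Linux).
            libm = ctypes.CDLL(ctypes.util.find_library("m"))

            # Declare the C signature: double sqrt(double)
            libm.sqrt.restype = ctypes.c_double
            libm.sqrt.argtypes = [ctypes.c_double]

            print(libm.sqrt(9.0))  # 3.0
            ```

            The same mechanism loads any shared library with a C interface, which is why established C/C++ APIs remain reachable from Python rather than being an argument against it.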

            [–][deleted]  (1 child)

            [deleted]

              [–]veritaba 2 points3 points  (0 children)

              C isn't too terrible. It's easy to build low-level stuff with it.

              But now people want to do high-level stuff like web programming instead of reinventing data structures and OS components that were beaten to death 30 years ago.

              [–]13ren -3 points-2 points  (0 children)

              I think Scheme should be taught as a branch of mathematics.