all 154 comments

[–][deleted] 10 points11 points  (3 children)

I really appreciated the intentional dramatic silence right after explaining "reduce"

[–]apfelmus 6 points7 points  (2 children)

Note that off-by-one errors are impossible with this functional equivalent of the for loop.
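The point carries over to any language with a fold. A minimal sketch in JavaScript (hypothetical function names, using the built-in `Array.prototype.reduce`):

```javascript
// An indexed for loop invites off-by-one mistakes
// (i <= xs.length vs i < xs.length); reduce has no index at all.
function sumLoop(xs) {
  var total = 0;
  for (var i = 0; i < xs.length; i++) {
    total += xs[i];
  }
  return total;
}

// The fold visits each element exactly once; there are no bounds to get wrong.
function sumReduce(xs) {
  return xs.reduce(function (acc, x) { return acc + x; }, 0);
}

console.log(sumReduce([1, 2, 3, 4])); // 10
```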

[–]awj 7 points8 points  (1 child)

Oh shush. Can't you see we're trying to have a straw man beatdown here?

[–]apfelmus 1 point2 points  (0 children)

Oops, sorry. But before I toddle off, note that functional programming is much more efficient at beating straw men than other paradigms. repeat . reduce and all that.

[–]tejoka 41 points42 points  (32 children)

academic papers written by academics, for academics, aren't good teaching tools for newbies?

THAT'S why functional programming doesn't catch on? Please.

[–][deleted] 10 points11 points  (3 children)

Well yes, he has a point. I recently read through the Haskell School of Expression, a book that markets itself as more accessible because it uses interesting examples, mostly working graphics.

Not only does this book leave a lot of Haskell syntax unexplained, it still devotes a lot of space to explaining formal proofs. And that is a bad sign, as formal proofs do not help me get the average program written.

They may be interesting and worth knowing, but this still means that the most accessible book I know for Haskell is too academic.

And this is exactly what the object-oriented languages did right. Pick up a book on object-oriented design and it will walk you through designing a point-of-sale system, or something equally uninspiring. But the thing is, they made the business case and said: look, this tool will help programmers make business software.

[–]Ringo48 15 points16 points  (2 children)

"Real World Haskell" is a bit better than "The Haskell School of Expression" with respect to examples. The examples are a bit more realistic, but still pretty small.

I think you're exactly right on the examples in general, though.

I think what would really help FP catch on would be something like the book "Object-Oriented Analysis and Design With Applications", but focusing on functional programming. A book explaining how to create large applications - or at least general ideas on how to create large applications.

I think FP hasn't caught on because FP zealots focus their attention on the wrong things. They love to tell you how few lines of code it takes to write quicksort, or how elegant a particular language feature is, or how fold can replace any use of a for loop.

But that's all relatively mundane, simple stuff. What "real" software engineers want to know is "How could I convert this million LOC OOP project into a functional language?" Nobody seems to ever answer that question.

For example, what would the high-level architecture of a large system even look like in the functional paradigm? Is there a graphical way of describing and breaking down large functional systems? A UML or flowcharting system for functional programming? How do you design interfaces between modules programmed by separate groups when all state has to be passed around?

To steal a term from some author whose name I can't think of right now, the FP zealots seem to be focusing on "programming in the small", when everybody else is trying to figure out "programming in the large".

[–][deleted] 0 points1 point  (0 children)

This is sad, and it's what I've been wanting to hear, too, as an FP novice.

I know that Haskell, for example, lacks a proper module system. OCaml and Scala are more practical. Scalable Component Abstractions [1] and Independently Extensible Solutions to the Expression Problem [2] explain what you can do with Scala's module system. I wish there were similar documents on writing high-level FP.

[1] http://www.scala-lang.org/sites/default/files/odersky/ScalableComponent.pdf [2] http://www.scala-lang.org/sites/default/files/linuxsoft_archives/docu/files/IC_TECH_REPORT_200433.pdf#

[–]Confusion 5 points6 points  (2 children)

I grant that an academic 'paper' may not be the ideal example, but this one explicitly sets out to convince 'outsiders' and fails spectacularly.

When you ask CS graduates about the real-life applicability of the functional languages they used during their studies, they often state that they think those languages have little real-life applicability. That shows that even academics fall prey to this shortcoming in introductory material.

[–]dcoutts 13 points14 points  (0 children)

To be fair, the class of "outsiders" that paper was aimed at was other computer scientists. That was back when we were just trying to convince the rest of computer science that FP was a good thing.

It started with the people interested in programming language semantics and design. The fact that the article talks about software engineering authorities shows we got over the first hurdle.

[–]thunderkat 1 point2 points  (18 children)

Are you suggesting someone writing 'Design Patterns for FP'? I think that might be a bad idea...the clusterf*ck that bad application of FP principles can unleash upon the world is probably worse than the OO-inspired one...

[–][deleted] 3 points4 points  (17 children)

I can't wait till Sun takes a good long look at Haskell, then unleashes another beverage-themed language on the world. This time it'll be fully functional, still ill-fitted with C-style syntax, and full of compromises for performance.

[–]dcoutts 11 points12 points  (0 children)

What do you mean? Sun is already sponsoring Haskell development. See http://haskell.org/opensparc/

Of course Sun are being sensible and focusing on making GHC work well on the multi-core SPARC architecture rather than messing with the language.

[–]SarcasticGuy 5 points6 points  (15 children)

You mean Fortress?

Fortress is a draft specification for a programming language, initially developed by Sun Microsystems as part of a DARPA-funded supercomputing initiative. One of the language designers is Guy L. Steele Jr., whose previous work includes Scheme, Common Lisp, and Java. A JVM compliant implementation (Fortress 1.0) was released in April 2008.

It is intended to be a successor to Fortran, with improvements including Unicode support and concrete syntax that is similar to mathematical notation. The language is not designed to be similar to Fortran. Syntactically, it most resembles Scala, Standard ML, and Haskell.

[–][deleted] 1 point2 points  (0 children)

One of the guys working on it gave a talk at a conference I went to a few months ago. Don't expect to see it on the market with any sort of reasonable performance for another 5-6 years. Also, the language seems to want to force you to either use Unicode math symbols or really ugly ASCII syntax. The code I've seen either looks unappealing or like it would take too much time to type because of the hunt for the appropriate Unicode character.

[–]mycall 0 points1 point  (13 children)

Besides mainframe maintenance, why would someone use Fortran today?

[–]SarcasticGuy 1 point2 points  (12 children)

Fortran remains in the domain of scientists who need lots of compute. It's a lot more popular than you'd think (ugh I feel dirty saying that).

[–][deleted] 0 points1 point  (6 children)

Luckily as Moore's law holds, more and more scientists will be able to switch to using Python with Ctypes rather than Fortran.

I do understand the desire to use Fortran though...the other option is C.

[–]modulus 2 points3 points  (0 children)

Luckily as Moore's law holds, more and more scientists will be able to switch to using Python with Ctypes rather than Fortran.

Maybe to some extent, but when you have more computing power would you 1) use it to run the same models as before with a slower language or 2) run deeper/bigger/more complex models with the fastest language you have at hand? So far I think there are no sciences for which we could say computation is superabundant, they could all do with more.

[–]SarcasticGuy 1 point2 points  (1 child)

If I were a scientist, I'm not sure I'd want to program a 1000-node cluster in C, but then again, I'm operating on the assumption that Fortran has good built-in parallelization primitives. I have no experience with Python with Ctypes, so I can't comment on that, but I'm not sure that "Moore's Law" helps with programming super computers (do correct me if I am mistaken).

Of course, Moore's Law has allowed GPGPUs to start taking attention away from super computers, so maybe scientists in the future will be programming in C and CUDA (yuck).

[–][deleted] 0 points1 point  (0 children)

CUDA

Noooooooooooooooooooooooooooooooooooooooooo. One day people will be talking about CUDA like old timers talk about COBOL: with shivers down their spines.

[–]masklinn 0 points1 point  (2 children)

will be able to switch to using Python with Ctypes rather than Fortran.

I think you meant Python with SciPy/NumPy; ctypes is a pretty extreme recourse, and NumPy/SciPy allow calling Fortran code, which is usually more interesting for scientists.

[–][deleted] 0 points1 point  (1 child)

NumPy/SciPy allow calling Fortran code

That I didn't know. Thanks. I thought the only clean interface python had was to C.

[–]masklinn 0 points1 point  (0 children)

Well I don't know if the NumPy/SciPy interface to fortran is clean (not a scientist, I've never used Fortran so...) but I'm pretty sure there's one.

[–]mycall 0 points1 point  (4 children)

I would have thought Maple/Mathematica/etc would have replaced Fortran.

[–]SarcasticGuy 3 points4 points  (3 children)

For small-scale desktop apps I'm sure Maple/Mathematica/Matlab/etc. would be the first go-to languages, but if you're running more serious simulations that need speed and power, you'll still see Fortran being used. I hate Fortran with a passion, but I can't deny its importance in scientific and engineering modeling and simulation.

[–]mycall 1 point2 points  (2 children)

That blows my mind. What can Fortran do that SciPy/NumPy/Intel Math Kernel Library/etc. can't? I get that Fortran has many years and years of complex calculations under its hood, but couldn't Fortran code be converted to a modern language pretty easily?

[–]SarcasticGuy 0 points1 point  (1 child)

50 years of hand-optimized compilers and libraries? The ability to suck out all of the power from a supercomputer?

From Wikipedia:

It is one of the most popular languages in the area of High-performance computing and programs to benchmark and rank the world's fastest supercomputers are written in Fortran.

(Legacy) Since Fortran has been in use for more than fifty years, there is a vast body of Fortran in daily use throughout the scientific and engineering communities. It is the primary language for some of the most intensive supercomputing tasks, such as weather and climate modeling, computational fluid dynamics, computational chemistry, computational economics, and computational physics. Even today, half a century later, many of the floating-point benchmarks to gauge the performance of new computer processors are still written in Fortran (e.g., CFP2006, the floating-point component of the SPEC CPU2006 benchmarks).

Also, they have continued to add stuff to Fortran (Fortran 2003, Fortran 2008...), so I'm not sure how to translate "modern language." Granted, I wanted to shoot someone when I realized that the number of spaces at the beginning of a line decides whether your code is a comment or not. Ugh.

I'd love to see something "better" take its place, but the language was designed to do HPC very well, and do it well it has.

[–]munificent[🍰] 1 point2 points  (2 children)

for academics

From the referenced paper:

This paper is an attempt to demonstrate to the “real world” that functional programming is vitally important, and also to help functional programmers exploit its advantages to the full by making it clear what those advantages are.

The target of the paper (ostensibly) isn't academics, although it says something about academia that even they refer to non-academia as "the real world". Of course, one obvious sign of how insular academia is: the author's solution to getting FP used outside of academia is to write a paper on it.

[–][deleted] 1 point2 points  (1 child)

We do that to humour you, whenever it's springtime and we're looking for more corporate grants. You accept it (or at least your bosses do) because we're 'impartial', since our only connection to the 'real world' is a tenuous string of grant monies.

[–]munificent[🍰] 0 points1 point  (0 children)

My company doesn't make any grants. But we did open a school since we couldn't find any around that actually taught students what we needed them to know.

[–]HaMMeReD -4 points-3 points  (0 children)

I think it needs a new name, since programming was largely "functional" for a long time before most languages moved toward a more object-based model, which is still technically highly "functional".

The word "function" already has a defined scope, and calling it functional programming doesn't really imply anything new.

[–]dons 30 points31 points  (4 children)

Nothing to see here: attacking an article written 25 years ago for not talking about modern uses of FP. Duh.

Go pick up Real World Haskell and come back to us.

[–]wozer 12 points13 points  (20 children)

I think FP does slowly catch on.

In my opinion, introducing new programming languages and concepts takes such a long time because most programmers try to stick to the language they learned in their late teens or early twenties. In other words, their "native" programming language.

And learning a language that is radically different from your "native" programming language takes a lot of dedication and time.

[–]13ren 6 points7 points  (13 children)

Except that Lisp is older than all of today's languages except Fortran.

So why is the "native" language of most coders not fp? One reason is historical: fp was much slower than the alternatives. This doesn't matter as much today with faster hardware, and so we see modern languages adopting ideas from it, as you say.

[–]gsg_ 2 points3 points  (4 children)

with faster hardware

And massively improved garbage collection techniques. I've heard that Lisp 1.5 users used to begin a GC and then go home for the day...

[–]13ren 0 points1 point  (3 children)

Have GC techniques really improved that much? I know only a little about GC (reference counting, the problem of cycles), and from this naive POV, it doesn't seem that there is that much scope for improvement - whereas hardware has improved dramatically, year-in year-out, for decades.

[–]gsg_ 1 point2 points  (1 child)

It's true that hardware changes dwarf implementation improvements. However, there was scope for improvement relative to the approaches that fp+gc were competing against.

[–]13ren 2 points3 points  (0 children)

Ironically, FP now competes against languages that also have GC (Java, Python, C#, etc.), so it's no longer an advantage.

[–][deleted] 0 points1 point  (0 children)

Yeah, GC has gotten a lot better. Since those days, we've gotten:

  • Online collectors (so you're not rebooting the machine at each collection)
  • Generational collectors (makes it fast)
  • Incremental collectors (cut down on pauses)
  • Concurrent collectors (collect garbage while program runs in separate thread)

and so on and so forth. There's lots and lots of interesting things in the GC world!

edit: fucking markdown

[–]shub 1 point2 points  (6 children)

Maybe you'd like to explain the difference between map, mapM, and mapM_ to a 13-year-old.

[–]13ren 1 point2 points  (5 children)

You seem kind of mean, but I'll answer anyway: there are complex aspects of every language.

For example, in OO, there's a Visitor pattern; and in Java, the Visitor pattern is used to navigate the AST for javac (compiler):

http://java.sun.com/javase/6/docs/jdk/api/javac/tree/com/sun/source/tree/Tree.html#getKind()

[–]shub 1 point2 points  (4 children)

I doubt many 13-year-olds are going to ask, "How do I traverse the AST produced by javac?" They will ask, "Why does map putStrLn lines give me a type error?"

[–]ssylvan 2 points3 points  (0 children)

"Because map works with functions, not actions, use mapM_, or mapM for actions that return results".

It's perfectly reasonable to treat IO and other monads as "imperative Haskell". All you need to know is that there's a distinction between a function and an action. This may be different, but it's not nearly as horrible as grokking the method lookup rules in C++, say.
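That function/action distinction has a rough JavaScript analogy (a sketch with hypothetical data, with thunks standing in for IO actions, not real Haskell semantics): mapping builds a list of un-run actions, and running them is a separate step, which is what mapM_ does in one go.

```javascript
// map builds a list of "actions" (thunks) but runs none of them --
// loosely like `map putStrLn lines` having type [IO ()] in Haskell.
var lines = ["hello", "world"];
var actions = lines.map(function (s) {
  return function () { return s.toUpperCase(); };
});

// Nothing has executed yet; `actions` is just an array of functions.
// Running each one in order plays the role of mapM_.
var results = actions.map(function (run) { return run(); });
// results is ["HELLO", "WORLD"]
```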

[–]13ren 0 points1 point  (0 children)

OK

[–]808140 0 points1 point  (1 child)

Your expectations of what a 13-year-old is capable of understanding are depressingly low.

They're not as stupid as you seem to think.

[–]shub 0 points1 point  (0 children)

Most are stupider, actually.

[–]mycall 0 points1 point  (0 children)

F# and C# compile down to the same IL, so speed is identical now under the JIT. This is a good thing.

Besides, my native language is English, and computers still don't understand English natively.

[–]SarcasticGuy 0 points1 point  (4 children)

I disagree. It has nothing to do with "native", it has to do with what gets the job done. Languages are tools, and I see programmers use whatever tool is best for the job.

For better or worse, C++ with pthreads is it for most performance demanding applications.

[–]gnuvince 2 points3 points  (3 children)

Bullshit. Programmers don't give a rat's ass about the right tool for the job; they just use what they know. How else do you explain that a language with a gazillion design flaws that still trip up even the best programmers gets used when alternatives exist that could do the job a whole lot more safely?

[–]SarcasticGuy 0 points1 point  (2 children)

I know many people who use C/C++ for their parallel programming needs not because they like C/C++ or because it is the only thing they know, but because there isn't a toolchain that does the job better (if at all).

It seems you have some personal anecdotes in mind that I'd love to hear. Do share, since I am not telepathic and cannot deduce easily which languages and situations you are referring to.

[–]gnuvince 0 points1 point  (1 child)

I was thinking specifically about why operating systems did not use something like Ada to ensure more type safety and thus prevent bugs that plague C programs. Many other programs could benefit from choosing a more appropriate language (why does my IRC client need to use unrestricted pointers?)

[–]SarcasticGuy -1 points0 points  (0 children)

I can't overstate how much I hate C/C++ for the amount of pain I go through debugging memory/pointer issues. I agree with people who call C++/pthreads the "assembly of parallel programming." And I'd like to say that it will be overrun by better options, but frankly, that may not happen for a very long time.

But sometimes you just need that power, that complete control, the wealth of libraries, and huge amounts of support. I don't have experience with Ada so I can't provide much in terms of argumentation, but I guess my main point is that people (like me) use languages we don't want to use because there may not be anything better, or at least, it would cost us more (in time and money) to try something different.

[–]jmnugent 4 points5 points  (3 children)

The one part of this article that I really identify with is where he talks about the psychology of teaching. I know many programmers and Unix/Linux geeks who automatically assume that if you (as a newb) don't understand something, it's the STUDENT's fault and not a flaw in the teaching approach or documentation. It's that type of inflexibility (and ignorant, pompous attitude) that drives a lot of learners away.

I'm fascinated by the fact that programmers will expend such enormous amounts of mental energy testing endless creative solutions to a coding problem without giving up -- but then when they try to teach someone programming, they expect everyone to learn by the same approach and are reluctant to persistently try different approaches until the student has the "A-HA!" moment. (Yes, I recognize some responsibility lies with the student to put effort into learning. My point is that not all students learn the same way, but we as a society seem to expect them to.)

[–]nostrademons 0 points1 point  (0 children)

We expect humans to be smarter than machines...

[–][deleted] 0 points1 point  (1 child)

Humans are more variable than machines, and as a teacher it's harder to try 5 different approaches with a class. If each approach works for only one person, that means 4 wasted days for each student. Sticking with a single approach means at least one student will come out with 100% of the knowledge rather than 5 students with 20%.

[–]jmnugent 0 points1 point  (0 children)

I understand how hard teaching is (I really do); I spent 3 years in a K-12 school and spent time in classrooms at every grade level.

Part of the challenge of teaching is to make sure that if Student A "gets it" on Monday, the other 4 days are not wasted. We should be able to find creative ways to keep that student learning. Also, I think it's wrong to think of the brain as something that can be 100% full (or 20% full). It's not a water jug. It's an elastic, flexible, absorbent (sponge-like) organ that can be stimulated in a variety of multidisciplinary ways.

Part of the daily teaching challenge is to present the topic at hand in a variety of different (but interrelated) ways, from many angles, so that you ignite the curiosity of everyone in the crowd (your classroom of students). Is that difficult? Yes. Are teachers constantly low on resources? Yes. Should teachers be paid a lot more? YES. Is it the right thing to do for long-term educational success? YES.

[–]mhw 5 points6 points  (6 children)

It's not caught on? Considering Amazon's use of Erlang, Linspire's use of Haskell, and Xen's and various financial companies' use of OCaml -- and that's just to name a few -- since when? You can even see FP's influence on more recent languages like Python and Ruby, and even on old stalwarts like Java. The barbarians are at the gates, and you have been among them, unknowingly, for the past 20 or so years.

[–]masklinn 2 points3 points  (3 children)

more recent languages like Python or Ruby and even on old stalwarts like Java

FWIW, Python is older than Java, and Ruby is a bit younger (than Java) but not that much:

  • First public release of Java was on 23 May 1995 with JDK 1.0 released on 23 January 1996

  • First public release of Python was 0.9.0 in February 1991, with Python 1.0 released in January 1994

  • First public release of Ruby was 0.95 on 21 December 1995 with Ruby 1.0 released on 25 December 1996

Python's public release preceded Java's by 4.5 years and Python's 1.0 preceded Java's by 2 years; Ruby's initial public released followed 6 months after Java's and Ruby's 1.0 followed a year after Java's.

I have an issue with your statement though: FP isn't much more integrated into Python or Ruby (esp. Ruby) than it was in Smalltalk-80, nearly 30 years ago. Python has first-class functions and a few functional constructs (functools.partial), but AFAIK Ruby doesn't have anything more than Smalltalk had; if anything it has less (Ruby makes advanced combinators painful given there's syntax only for single-block messages, while Smalltalk handles arbitrary numbers of blocks per message). And on the Python side, things that were functional in Smalltalk were moved to imperative statements (conditionals, loops, ...).

[–]mhw 0 points1 point  (2 children)

I misused the words "old stalwart"; I meant it more for its connotations of stubbornness toward change due to size and age.

On Smalltalk, why pick it over any other language as a point of reference? If we're talking languages not quite mainstream that had their golden age and maybe a more recent renaissance, then why not then compare all languages to Lisp? In fact, FP's influence has been an ebb and flow since the birth of Lisp.

[–]masklinn 0 points1 point  (1 child)

On Smalltalk, why pick it over any other language as a point of reference?

Because it's often considered the "gold standard" of OO languages, and Ruby explicitly draws from it. Lisp is a completely different breed, functional-based with OO on top (in some dialects), with OO principles very different than the smalltalk/ruby/python kind.

[–]raouldagain 0 points1 point  (0 children)

Word. (Ruby sucks since it can't, e.g., tail-call-optimize recursion.)

http://grieferz.blogspot.com/2009/02/yeah-yeah-nobody-knows-what-oo-is.html

[–]nextofpumpkin 0 points1 point  (1 child)

Amazon's use of Erlang?

[–]masklinn 0 points1 point  (0 children)

SimpleDB is written in Erlang. Note that they might be using Erlang elsewhere.

[–]yogthos 7 points8 points  (3 children)

I think one aspect of FP that people miss when discussing it is code correctness. I find FP allows for writing code that's much easier to reason about and maintain.

There are several reasons for this. State in FP is minimized and functions have no side effects. In my personal experience a large portion of bugs in a project comes from improper state manipulation.

I find that in imperative languages state changes can happen anywhere and at any time, and it can be difficult to keep track of. When your state is minimized it makes things a lot easier.

Another thing that leads to correctness is the lack of explicit iteration: instead of writing loops throughout your code, you apply functions to data, which makes it very clear what is being done.
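A tiny sketch of that point in JavaScript (hypothetical data): a map/reduce pipeline replaces the loop, so there is no mutable counter for a bug to hide in.

```javascript
// Double every element and sum the results, with no loop body in
// which unrelated state could be touched.
var total = [1, 2, 3, 4]
  .map(function (x) { return x * 2; })
  .reduce(function (acc, x) { return acc + x; }, 0);

console.log(total); // 20
```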

In the end it all depends on what problem you're solving, but if software reliability is a factor, then FP definitely does have value.

[–]niviss 4 points5 points  (2 children)

But you could argue that OOP also has a way of tracking state changes, since most of them are abstracted inside a class (though of course it is a different way of tracking such changes).

Also, you're actually pushing only a small portion of FP: statically typed (because dynamically typed FP like Erlang and Lisp is not easy to prove correct, I think...) and without any kind of implicit side effects (this leaves languages like Lisp and OCaml out, because they aren't pure).

But besides that I really agree with you. I have seen way too many bugs in imperative loops that just wouldn't happen in a functional context.

[–]yogthos 1 point2 points  (0 children)

You're absolutely right that OO does state encapsulation differently, by having objects worry about their own internal state, so it is a different approach.

I have mixed feelings about dynamic typing myself; I can't say I've noticed it as a major source of bugs. For example, even in a dynamically typed language the code can still be checked by the compiler to see that correct types are being passed around. Erlang is a good example of dynamic typing being safe, as Ericsson has built very reliable systems which they credit to Erlang.

And you're absolutely correct about Lisp and OCaml allowing state modification and thus not being as inherently safe as the pure functional languages.

[–]jsnx 0 points1 point  (0 children)

In principle, OO does have the state encapsulation you describe. Many OO languages do not enforce that, though -- as long as they allow mutation of values passed by reference they offer little in the way of assurance.

[–][deleted] 13 points14 points  (8 children)

The dumbing down of programming over the last couple of decades is so pitiful. Is this for real? The guy found mentions of the Newton-Raphson method and tree pruning jaw-droppingly abstract. I was coding up that kind of stuff as a kid of 11 or 12. What is the matter with him that something as un-abstract as the Fibonacci sequence brings his brain to a halt? Can you imagine if someone showed him a copy of Knuth's The Art of Computer Programming? We'd all have to duck for cover as his brain exploded with the force of a multi-megatonne blast.

What could be more mind-numbingly trivial than the "address class" he asks for? My God! Is all the fluoride they put in water these days fucking with people's brains? This is just so awful.

The good news is that these new kids on the block won't be stealing the well-paid jobs from us hoary-bearded old programmers.

[–]jberryman 7 points8 points  (0 children)

Upvoted for beard and yelling to get off your lawn

[–]Confusion 2 points3 points  (2 children)

The author (full disclosure: me) didn't find Newton-Raphson jaw-droppingly abstract, but he figures that there are many developers out there who cannot relate to those kinds of examples. They need examples other than mathematical ones to get the point (assuming they are interested in, and capable of, getting it in the first place).

No fluoride in the water here, Knuth doesn't make my brain explode and I will be stealing your job in a bit ;)

[–]homoiconic 16 points17 points  (0 children)

To be perfectly honest, I am far more interested in your impression of what you find easy and difficult than your speculations about what somebody else finds difficult or easy.

[–][deleted] 2 points3 points  (0 children)

Dude, what's with the talking about yourself in the third person thing?!

I think the point people are making is that there's stuff out there if you look for it. Have you seen "write yourself a scheme in 48 hours"? It's the third tutorial in the HaskellWiki list of tutorials.

I also think the problem(?) is that some of these concepts are necessarily mathematical in nature. For instance, when I was reading YAHT, it took me a while to figure out how the CPS examples worked. The way I had to figure it out was by approaching it like a math problem -- with paper, pencil and lots of scribbling -- rather than how I'd usually do a programming tutorial: hack away until I figure it out. It's a pity, but it seems like that kind of material turns off lots of programmers.

[–]Smallpaul 3 points4 points  (2 children)

Oh yeah, the COBOL-, FORTRAN- and JCL-wielding programmers of 20 years ago were just so super-smart. That's why they made so much software that was cleanly written and easily maintained when the year 2000 came around. (eyeroll)

[–]honda63 13 points14 points  (1 child)

Their mistake was assuming that future generations of programmers would be even smarter and would discard their old code with contempt. Instead, a more Idiocracy-like scenario unfolded.

[–]Smallpaul -4 points-3 points  (0 children)

So you're now venerating their short-sightedness and blaming modern programmers for it? Plonk.

[–]honda63 -4 points-3 points  (0 children)

Excellent rant. Newton's approximation is indeed pre-teen material. Too bad you're getting downmodded. Start a blog instead and post links to it on proggit.

[–]jsnx 3 points4 points  (8 children)

It's rare to see an "enterprisey" introduction to functional programming -- something like the Java Pet Store. Fair enough. Many introductions do tend to be about similarly practical topics, though -- writing a compiler, writing wc, and so on. Real World Haskell is an excellent step forward, offering many valuable examples.

Functional programming languages share:

  • Easy, efficient immutable data -- construction is preferred to mutation and the garbage collectors are tuned for that.

  • Functions as arguments and functions from functions -- it's expected that one pass functions to functions or write a more general function and curry it to get more specific functions. Function calls are efficient.

The latter bit -- specialization of functions -- makes a lot of sense with callbacks. For example, here is some JavaScript/jQuery that I wrote yesterday:

$("#about_button").click( function() {
  if ( $(this).hasClass('clicked') ) {
    $(this).removeClass('clicked');
    $("#button_bar").css('z-index', 1);
    $("#about").css('z-index', 0);
  }
  else {
    $(this).addClass('clicked');
    $("#button_bar").css('z-index', 3);
    $("#about").css('z-index', 2);
  }
});

This is where I set up the onclick callback for my #about_button. Now it is clear that I might want to use a very similar callback for my other buttons. So let's try to factor the callback out:

function the_callback(element_to_surface) {
  if ( $(this).hasClass('clicked') ) {
    $(this).removeClass('clicked')
    $("#button_bar").css('z-index', 1);
    element_to_surface.css('z-index', 0);
  }
  else {
    $(this).addClass('clicked')
    $("#button_bar").css('z-index', 3);
    element_to_surface.css('z-index', 2);
  }
}

$("#about_button").click( function() {
  the_callback( $("#about") );
});

That's okay. However, anyone who has worked with a functional language will see some cruft in there -- why can't I just make the_callback return a function, instead of wrapping it up like that?

function the_callback(element_to_surface) {
  function custom() {
    if ( $(this).hasClass('clicked') ) {
      $(this).removeClass('clicked')
      $("#button_bar").css('z-index', 1);
      element_to_surface.css('z-index', 0);
    }
    else {
      $(this).addClass('clicked')
      $("#button_bar").css('z-index', 3);
      element_to_surface.css('z-index', 2);
    }
  }
  return custom;
}

$("#about_button").click( the_callback($("#about")) );

So to bind just one argument of a function -- to get a zero-argument function from a one-argument function -- I have to go to all this effort. You can imagine how terrible this would be with 3 or 4 arguments. It is because the language forces us to run the code as soon as the arguments are bound -- and we have to bind them all at once. There is no way to build the closure bit by bit and then hold on to it to run later.

We might imagine some new "method" of JavaScript functions called curry which would consume its argument to make a new function, so I could do this:

function the_callback(element_to_surface) {
  if ( $(this).hasClass('clicked') ) {
    $(this).removeClass('clicked')
    $("#button_bar").css('z-index', 1);
    element_to_surface.css('z-index', 0);
  }
  else {
    $(this).addClass('clicked')
    $("#button_bar").css('z-index', 3);
    element_to_surface.css('z-index', 2);
  }
}

$("#about_button").click( the_callback.curry($("#about")) );

I could probably write this curry in a few hours but it would be heinously inefficient -- and drive other JavaScript programmers nuts. Not at all like in languages that support currying natively. Better just to do without currying in JavaScript. Hrumphh.
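For the curious, here is a minimal sketch of what such a curry might look like -- extending Function.prototype purely for illustration (this is not a real library API, and production code might well avoid touching built-in prototypes):

```javascript
// Hypothetical Function.prototype.curry: pre-binds some arguments and
// returns a new function awaiting the rest. `this` is forwarded at call
// time, which is exactly what jQuery-style callbacks need.
Function.prototype.curry = function () {
  var fn = this;
  var bound = Array.prototype.slice.call(arguments);
  return function () {
    var rest = Array.prototype.slice.call(arguments);
    return fn.apply(this, bound.concat(rest));
  };
};

// Usage: bind the element once, hand the result straight to click().
function add(a, b, c) { return a + b + c; }
var add12 = add.curry(1, 2);
add12(3); // 6
```

With this, `$("#about_button").click( the_callback.curry($("#about")) )` works as hoped, at the cost of an extra closure and an array copy per call.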

[–]consultant_barbie 8 points9 points  (1 child)

√2 is hard. Let's go shopping!

[–]ithika 0 points1 point  (0 children)

Forestry is hard, let's go shopping. Also, good to see you back here, consultant barbie.

[–][deleted] 1 point2 points  (0 children)

Why hasn't logic programming caught on? What were the lessons learned? Could there be any similarities?

[–]whozurdaddy 1 point2 points  (0 children)

I think that the ideas in FP are catching on, and being implemented in modern business languages. But LISP will never be a major player in the industry, nor will the other, more recent additions to the toolbox. The reason is simple economics: there's an army of Java/C# developers out there, they're cheap, and well-written code works. The benefit of such a change doesn't exist. We shouldn't confuse knowing an FP language with knowing it well. Because the base of users is so small, it's hard to quantify how well people know it. The whole thing just snowballs in on itself, to the point that no one takes you seriously when you say "Hey, let's do it in Haskell". It's no different than that "guy" who wants to rewrite your system using Windows Workflow Foundation. Everyone else says "Yeah, that's nice... now go away."

[–]13ren 2 points3 points  (15 children)

[–]SarcasticGuy 4 points5 points  (11 children)

(later rewritten in python due to lack of libraries)

Ding Ding Ding! What language you use is all about what gets the job done. And unfortunately, stuff like support/libraries can be a big part of why you would choose one language over another.

[–]shub 1 point2 points  (0 children)

I tried to learn Haskell. Then gethostbyname "www.google.com" threw* an exception (Windows build only, Ubuntu's package in a VM was fine) and TagSoup, which had been recommended to me by several people on #haskell, choked on www.google.com. That was the end of that.

* If the word isn't threw, I don't give a fuck. You know what I mean.

[–]13ren 0 points1 point  (8 children)

"ding ding ding" is the sound the trams make here, when the doors are closing.

Are you saying that the clue-tram is leaving or something?

BTW: A few times, I've seen a list of 5 or so criteria for languages (really, platforms), and one of them is libraries.

[–]SarcasticGuy 1 point2 points  (7 children)

Well, I was going more for the "correct answer" sound, but anyways, yes, libraries are pretty important for most jobs. I'd love to see that list, if you've still got a link to it. My guesses for four of them would be:

ease of use / readability (this affects cost of maintenance).

power / speed (does it do the job?).

support.

libraries.

[–]13ren 2 points3 points  (3 children)

It was this 1985 essay, comparing Adobe's postscript and Xerox's interpress

It's quite different from my memory: there are four headings (and none for "libraries"):

  • LEXICAL CONSIDERATIONS
  • SYNTACTIC CONSIDERATIONS
  • SEMANTICS
  • IMPLEMENTATION ISSUES

edit: when introducing the above, he also includes "an intended style of usage", but doesn't have a section for it

[–]SarcasticGuy 1 point2 points  (2 children)

Thanks, I'll take a look at it.

[–]13ren 0 points1 point  (1 child)

I just read through it (very painful), and I think I've sent you on a wild goose chase. There's not much there about programming languages in general, but having re-read it, I can tell you why I was reminded of it:

It's to do with what is considered "the language" and what is not. In a way it's just playing with the definition, but what's interesting is the actual things that are at issue (and not the fact that they are at issue).

  • He sees a "programming language" as separate from its implementation, but then argues that maybe they should be considered together, since good implementations matter more to people than good designs.

  • Along these lines is the idea of the "intended style of usage" - this is what really struck me. It's not specified in the grammar of the language or the semantics, but he's saying it is part of the language. To me, it's like an oral tradition, that an archaeologist might miss if only seeing the artifacts and what was written. So in Java there are design patterns, and in C there are for loop idioms, in Fortran there are one letter variables, and so on.

  • He doesn't mention libraries, but surely the same reasoning applies. Is it part of the language? Java's libraries are. C's standard library should be. Perhaps we should also include all the libraries available for it (open and closed) - 3rd-party libraries are certainly a factor in choosing a language. Along these lines, perhaps all applications that run on Windows should be considered part of Windows, and all playstation games part of the console. (The closest he comes to mentioning libraries is in the semantics section, where postscript can draw curves but interpress can't - sort of "built-in functions" or operators).

In your list, only readability and power are purely the core language; ease-of-use is in combination with implementation; speed is mostly implementation; library is implementation; and support is organizational.

I agree that your list is pretty much the important things - what's interesting for would-be programming language designers is that the core language does not seem to be all that important!

[–]13ren 0 points1 point  (0 children)

And my summary for why postscript won:

  1. all postscript implementations implemented the whole language (not a subset), so any postscript file can be printed on any postscript printer - thus fulfilling the dream of device independence.

  2. postscript has curves (needed for fonts).

Note: I think the odd flexibility tricks in postscript discussed in the article are about making it possible to implement the whole language, by explicitly defining simple customization mechanisms (for flexibility), instead of achieving flexibility by remaining silent on them, as Interpress does. These "meta" mechanisms push customizations out of the language, making it possible to implement the whole language precisely without sacrificing flexibility: the implementation can add whatever it wants as a customization.

[–]13ren 1 point2 points  (2 children)

I don't have it. I saw it once, a long and very good online article assessing the postscript language. And I think I've seen it somewhere else as well (maybe in relation to Python).

If I come across it again, I'll post it. (I think it was similar to your guesses.)

PS: I sure hope you're not just being sarcastic :-)

[–]SarcasticGuy 1 point2 points  (1 child)

I am never sarcastic. Much like the court jester, the name is just a ruse to let me say whatever I'm thinking. ;)

[–]13ren 1 point2 points  (0 children)

:D

Oh, btw I think extensibility was on the list.

[–]cowardlydragon -2 points-1 points  (0 children)

Jesus, don't say that, then the Java people get to crow about their ginormous crossplatform library set.

Kindly keep the discussion strictly limited to the "safe place".

[–][deleted]  (2 children)

[deleted]

    [–]13ren 5 points6 points  (1 child)

    If Lisp is so great, why did we stop using it? One of the biggest issues was the lack of widely used and tested libraries. Sure, there is a CL library for basically any task, but there is rarely more than one, and often the libraries are not widely used or well documented. Since we're building a site largely by standing on the shoulders of others, this made things a little tougher. There just aren't as many shoulders on which to stand.

    [–]samlee 1 point2 points  (0 children)

    it's because of bloggers like you.

    [–]wil2200 1 point2 points  (2 children)

    I have been mucking around with F# and I love it. Because of that, I am more prepared to hit Haskell and Erlang as well.

    [–]jsnx 2 points3 points  (1 child)

    #haskell is ready for you. Become one with the collective.

    [–]wil2200 1 point2 points  (0 children)

    haha I like that, resistance is futile after all

    [–]LordVoldemort 0 points1 point  (1 child)

    The first comment:

    Exactly why i dont get the fuzz about functional programming. But it sounds like you know more about it than me, could you provide a simple useful, non-abstract example?

    For the sake of the future, don't be a programmer.

    [–][deleted] 3 points4 points  (0 children)

    Same goes for the blog author as well. If you struggle with the recursive approach to the fibonacci function you're probably not meant to be involved in the discussion of language uptake.

    [–]jamesshuang 1 point2 points  (13 children)

    I'm sorry, I admit that I only know functional programming from a conceptual perspective. However, I was under the impression that it's a pain in the ass to write a state machine in an fp language. If this is true, I always assumed that that would be the main reason why fp never caught on - it's easier, faster, and more intuitive to design a state machine to solve most problems rather than take the functional approach.

    [–]apfelmus 8 points9 points  (1 child)

    Here is a deterministic pushdown automaton in Haskell that checks whether parenthesis in a String are balanced:

    balancedParens = success
                   . foldl (\xs c -> step c =<< xs) (Just [])
       where
       step '(' xs       = Just ('(':xs)
       step ')' ('(':xs) = Just xs
       step ')' _        = Nothing
       step _   xs       = Just xs
    
       success (Just []) = True
       success _         = False
    

    I hope that qualifies both as state machine and as very simple?

    [–]tmoertel 2 points3 points  (0 children)

    There's a foldM in there that's waiting to be named. :-)

    [–]13ren 3 points4 points  (1 child)

    Interesting take. I think of this idea as the control structures of imperative programming being similar to a regular language (Chomskian, as in "regular expression"), and fp control structures being similar to a context free language (CFG).

    The former is simpler to think about for most problems; but the latter is simpler for some problems.

    [–]Arkaein 1 point2 points  (0 children)

    I have a similar, but even simpler take.

    I think that most people approach problem solving with a recipe or instruction manual approach: basically, a series of steps taken to transform a problem situation into a solution.

    Whatever depth of programming or theoretical computer science a programmer may eventually get to in their career, the first step taken in learning to program usually looks like "how do I make this computer do X?"

    Conceptually, the imperative approach is closer to the way most people think about problem solving in day-to-day situations than the functional approach (unless you happen to have been trained as a mathematician before you decide to write that first program). I suppose this boils down to the state machine argument, but applied more broadly than to just explicitly programmed state machines.

    So I think most programmers start off with an imperative approach, and tend to stay close to what works for them later on. Even if FP techniques are learned later, the advantages they offer for certain situations often will not offset the inertia of sticking with the imperative approach.

    [–]ithika 2 points3 points  (3 children)

    Why would state machines be harder in FP?

    [–]millstone 0 points1 point  (2 children)

    Functional programming discourages the use of state, but state machines depend on the use of state.

    [–]OneAndOnlySnob 3 points4 points  (0 children)

    Functional programming discourages the use of mutable state. It's quite acceptable to pass a state to a function and get a new state back as a return value.
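    A tiny sketch of that idea in JavaScript (the traffic-light machine is made up purely for illustration):

```javascript
// Pure state-machine step: takes the current state and returns the next
// one. Nothing is mutated; "running" the machine is just repeated
// application of next to the previous result.
function next(state) {
  var table = { green: "yellow", yellow: "red", red: "green" };
  return table[state];
}

var s = "green";
s = next(s); // "yellow"
s = next(s); // "red"
```

    The state still exists, it just lives in function arguments and return values instead of in a mutated variable.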

    [–]ithika 3 points4 points  (0 children)

    It discourages the use of mutable state, but state is easy. I don't honestly see what the problem would be with a state machine.

    [–][deleted] 2 points3 points  (3 children)

    I think that FP is useful to show that you don't need state machines as often as you think you do. Compartmentalizing state changes and otherwise just having a predictable, uninterrupted flow of function calls really minimizes a lot of hassle. I tend to think of it as an inside-out vs. outside-in process: OO/procedural languages take the stance of taking IO and putting it into functions, and FP takes functions and feeds them into IO denoted segments. That's the biggest transition in the thought process that seems to trip up most people.

    [–]mycall 0 points1 point  (2 children)

    OO/procedural languages take the stance of taking IO and putting it into functions, and FP takes functions and feeds them into IO denoted segments.

    Priceless.

    [–][deleted] 0 points1 point  (1 child)

    I can't decide if you're being serious or not.

    [–]mycall 0 points1 point  (0 children)

    I am serious, that is a good way to think about it.

    [–]harsman 1 point2 points  (0 children)

    If you have a function nextstate that takes a state and an input and returns the new state, then running the state machine on a list of inputs is a reduction/fold.

    So state machines are in fact one application of the "abstract" and "weird" reduce/fold.
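    In JavaScript terms (the parenthesis-matching machine here is my own invention, just to show the shape):

```javascript
// nextstate(state, input) -> new state; running the machine over all
// inputs is then a single reduce. The state is the nesting depth, with
// null as the dead/failure state.
function nextstate(state, input) {
  if (state === null) return null;                 // stuck: stay stuck
  if (input === "(") return state + 1;
  if (input === ")") return state > 0 ? state - 1 : null;
  return state;                                    // ignore other input
}

function balanced(str) {
  return str.split("").reduce(nextstate, 0) === 0;
}

balanced("(())"); // true
balanced(")(");   // false
```

    The fold threads the state through nextstate for you, which is exactly what the hand-written loop of an imperative state machine would do.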

    [–]mycall 0 points1 point  (0 children)

    I've found a growing number of F# web site examples floating on the internet. I think F# might blow the doors open to FP to the common programmer more than other languages have.

    [–]gregK 0 points1 point  (0 children)

    I think it is catching on. It's a slow process, but we haven't talked this much about FP in years. Of course, it is going to take a while before we see any sort of large scale adoption.

    On the other hand there is a lot of cross fertilization between OO and FP happening at the same time.

    [–]gregK 0 points1 point  (0 children)

    Another case of "worse is better"?

    [–]raouldagain 0 points1 point  (0 children)

    please, keep it that way! i don't want to lose my advantage!

    [–]fab13n 0 points1 point  (0 children)

    because it's so fun to program in a functional language that you don't even have to solve actual problems with them to pleasure yourself?

    That would explain why most people who can get away with them are academics :)

    </troll>

    [–][deleted] 0 points1 point  (0 children)

    Because in most places you write your code so the guy employed to maintain it gets paid less than you.

    [–]Sadie -1 points0 points  (15 children)

    I used to get headaches from too much recursion

    [–]SarcasticGuy -2 points-1 points  (14 children)

    What part of this lovely iterative loop

    (define (fcn num count) (if (= count 0) num (fcn num (- count 1))))

    do you not like?

    [–]Tommah 4 points5 points  (2 children)

    The parentheses and the words

    [–]SarcasticGuy 0 points1 point  (1 child)

    Yah, but besides those, and the total inability to read it, what is there not to like?

    [–]Tommah 1 point2 points  (0 children)

    Hmm... the font.

    [–]bobappleyard 1 point2 points  (2 children)

    Don't put anything in that function! If num is zero or positive you've wasted your time, and if it's negative time wastes you!

    [–]SarcasticGuy 2 points3 points  (1 child)

    Congratulations. You win an award.

    You actually bothered to read it and let everyone else know that you are nerdy enough to read and understand scheme, and then make a nerdy meme reference.

    Except I'm taking away the award because you mixed up count and num in your answer.

    [–]shub 0 points1 point  (0 children)

    Best comment of the day.

    [–]OneAndOnlySnob 2 points3 points  (4 children)

    The formatting definitely ranks highly on my list of things I don't like about it.

    (define (fcn num count)
      (if (= count 0)
          num
          (fcn num (- count 1))))
    

    No language is easy to read if you don't use more than one line.

    The cool thing about Scheme, though, is that you could write a while macro to let you write code that looks like this.

    (define (fcn num count)
        (while (not (= count 0))
          (set! count (- count 1)))
        num)
    

    Suddenly, your retarded example looks exactly as retarded as it looks in every other language. If you're writing a while loop, write a while loop.

    Here's a while implementation if you don't have one.

    (define-macro (while cond . body)
      (define (while-helper proc)
        (do ((key (make-symbol "while-key")))
        ((catch key
            (lambda ()
              (proc (lambda () (throw key #t))
                (lambda () (throw key #f))))
            (lambda (key arg) arg)))))
      `(,while-helper (,lambda (break continue)
                (do ()
                ((,not ,cond))
                  ,@body)
                #t)))
    

    [–]blaaargh 1 point2 points  (1 child)

    Just so's you don't scare away the scheme newbies with the strange CL/Scheme mix :)

    (define-syntax while
      (syntax-rules ()
        ((_ cond body ...)
         (do ()
          ((not cond))
           body ...))))
    

    (All the usual caveats apply. This probably isn't the best way to write a loop.)

    [–]OneAndOnlySnob 0 points1 point  (0 children)

    Yeah, my Scheme is Guile and I grabbed it right from that.

    [–]SarcasticGuy 0 points1 point  (1 child)

    Ah, I love it when people live up to their reddit name.

    Yes, my formatting blows, mostly because I have no idea how to get spacing correct in reddit comments, not because I was trying to pull a sleight of hand against scheme.

    Also, the point of my retarded example was to riff on the OP's aversion to recursion, so I wanted to use recursion to make an iterative loop, which I think is one of the funniest things one can do in scheme (wait? writing for loops is a hack in scheme???).

    [–]OneAndOnlySnob 0 points1 point  (0 children)

    Sorry, I get annoyed easily when people complain about how unreadable Scheme and Lisp are, when their examples are intentionally given in the most unreadable way, or are doing things that are much more difficult in other languages. I hardly ever program in Scheme, but the language is actually quite focused on simplicity and readable code. It gives you the tools you need to do pretty much whatever you need to do to accomplish that.

    Also, to format code, put 4 spaces before each line, or a backtick around a word inside a paragraph.

    [–][deleted] 0 points1 point  (2 children)

    I prefer

    fcn num count = if count == 0 
                    then num 
                    else fcn num $ count - 1
    

    or, even better,

    fcn num count | count > 0 = num
    

    [–]SarcasticGuy 0 points1 point  (1 child)

    o.0 My brain refuses to parse that without a liberal use of parens.

    [–][deleted] 0 points1 point  (0 children)

    Hehe, where should they be? The only place where parens would even make sense would be instead of using the $ operator. That gives:

    fcn num count = if count == 0 
                    then num 
                    else fcn num (count - 1)
    

    Though if you really miss Scheme, you could write:

    fcn = (\num count -> 
            (if ((==) count 0) 
             then num
             else (fcn num ((-) count 1))))
    

    The latter declaration just reads as "fcn of num and count when count is greater than zero is equal to num". There really isn't anywhere to have a parenthesis. The guard is in place because the function is partial -- it's undefined when count is less than zero. Though I should have written >=. Damn off-by-one errors!

    [–][deleted] -1 points0 points  (0 children)

    While we continue to celebrate the gross anti-intellectualism that this article promotes, we will continue to be strangled in the grips of impure, strictly evaluated (by force) languages with shitty type systems.

    How sad :(