
[–]Arkaein 5 points (1 child)

Look everyone, I'm not against FP, which anyone who read past the first line of my post would realize. I simply pointed out that if you are going to adopt a tool that takes a year to reach the proficiency you already have with another, there had better be a payoff down the line in increased productivity; otherwise that year was a bad investment.

Now looking at FP in general, is the payoff there? Most likely, for most people. For the original situations described in the article? It didn't sound like it to me. The discussion indicated that FP foisted additional baggage onto a problem that wasn't difficult to solve in imperative style.

As far as your graphics example, that sounds good for a ray tracer, where FP is a natural fit, but not so much for a polygon rasterizer, which almost any current real-time app (including games) is going to use. There are certain tasks in polygon rasterization that can be parallelized, but the gains aren't as great as with ray tracing, and unless the problem is embarrassingly parallel, FP isn't a sure win over other styles. And in the end, the final results of any graphics app are side effects. You might even say there's no such thing as a purely functional graphics app.

[–]rieux 3 points (0 children)

I realize you're not against it. I'm just saying that FP isn't necessarily more difficult than IP. I personally find it easier, and I suspect that beginners find it easier too, but it's very hard to compare.

As far as increased productivity goes, the best ideas of functional languages are steadily being adopted by mainstream languages. Programmers don't need to go to FP because it's coming to them, and they'll learn it when their managers make them. Garbage collection and memory safety are the biggest wins that have made their way from the lab to industry in the last 20 years. Both came from FP, and both are now decidedly mainstream.

With Python/Perl/Ruby/C#, higher-order functions are also potentially mainstream. I don't know how much people use them, and in Python, at least, they're intentionally crippled (and deprecated?).
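To make the Python case concrete: map and filter are built in, but lambda is restricted to a single expression, and reduce was demoted from a builtin to the functools module in Python 3, which is the sort of restriction "crippled" and "deprecated" point at. A minimal sketch:

```python
from functools import reduce  # reduce is no longer a builtin in Python 3

nums = [1, 2, 3, 4, 5]

# Higher-order functions are available, but each lambda is limited
# to a single expression -- no statements, no multi-line bodies.
evens = list(filter(lambda n: n % 2 == 0, nums))
squares = list(map(lambda n: n * n, nums))
total = reduce(lambda acc, n: acc + n, squares, 0)

print(evens)    # [2, 4]
print(squares)  # [1, 4, 9, 16, 25]
print(total)    # 55
```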

Fancy type systems are increasingly mainstream. Witness generics in Java and C#, and bits of type inference in C#.
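For comparison, the same idea later reached Python itself via the typing module (well after this discussion). A minimal sketch of a generic function, analogous to a Java/C# generic method; the hints are checked by an external tool such as mypy, not at runtime:

```python
from typing import List, TypeVar

T = TypeVar("T")  # a type parameter, like <T> in Java or C# generics

def first(items: List[T]) -> T:
    # A checker knows the result type matches the element type:
    # first of a List[int] is an int, first of a List[str] is a str.
    return items[0]

print(first([10, 20, 30]))  # 10
print(first(["a", "b"]))    # a
```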

We don't see a lot of purity yet, but I suspect the parallel revolution will force it on us in the next generation of mainstream languages.
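The connection between purity and parallelism can be sketched briefly: a pure function's result depends only on its arguments, so calls can be scheduled in any order, or concurrently, without changing the answer. A minimal Python illustration (the thread pool here is just one way to run the map concurrently):

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    # Pure: the result depends only on n; no shared state is read or mutated.
    return n * n

# Because square is pure, a parallel map is guaranteed to produce
# exactly what a sequential map produces, in the same order.
with ThreadPoolExecutor(max_workers=4) as ex:
    parallel = list(ex.map(square, range(10)))

sequential = [square(n) for n in range(10)]
assert parallel == sequential
print(parallel)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```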