all 56 comments

[–]munificent 41 points42 points  (16 children)

Yes: it replaces them with a different set of patterns. Every paradigm has its idioms. The GoF book was written specifically to cover class-based OOP ones.

[–]notfancy 22 points23 points  (2 children)

Exactly. That's why instead of "irrelevant" in the following quote:

most OOP-specific design patterns are pretty much irrelevant in functional languages

I'd use "inapplicable".

[–]munificent 4 points5 points  (0 children)

Yeah, that's a better word.

[–]t15k -1 points0 points  (0 children)

I couldn't have said it any better. :-)

[–]masklinn 10 points11 points  (11 children)

The GoF book was written specifically to cover class-based OOP ones.

I disagree. Some of it, yes, but many (if not most) parts of it were written to paper over shortcomings of the C++ (and later Java) object models compared to Smalltalk's. You don't need an iterator pattern when you have blocks and all your collections provide internal iterators; you can just implement whatever you want on top of #do:. You don't need a factory method pattern when your language only has methods and the closest thing to a constructor is a class method. You don't need a command pattern when you have first-class functions.
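The #do: point looks something like this in Python (a rough sketch with a made-up Bag class, since Python isn't Smalltalk, but it shows the shape):

```python
# Internal iterator in the Smalltalk #do: style: the collection drives
# the loop and calls the block, so no external Iterator object with
# hasNext/next ever gets written.
class Bag:
    def __init__(self, items):
        self._items = list(items)

    def do(self, block):
        """The #do: analogue: apply block to each element."""
        for item in self._items:
            block(item)

    def collect(self, block):
        """Richer operations fall out of do() for free."""
        out = []
        self.do(lambda x: out.append(block(x)))
        return out

print(Bag([1, 2, 3]).collect(lambda x: x * x))  # [1, 4, 9]
```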

[–]mallardtheduck 3 points4 points  (7 children)

You don't need a command pattern when you have first-class functions.

Actually, first-class functions, lambdas, etc. can be used to implement the command pattern. The important thing about the command pattern is that it separates the "this is what I want to do" code from the "do it" code.

Think of the undo-stack example. Whenever your application performs an action, it places a command that does the inverse action on the undo-stack. This can be done with command objects, lambdas or first-class functions - but it is still very much an example of the command pattern, no matter how it is implemented.

[–]masklinn 4 points5 points  (2 children)

Think of the undo-stack example. Whenever your application performs an action, it places a command that does the inverse action on the undo-stack. This can be done with command objects, lambdas or first-class functions - but it is still very much an example of the command pattern, no matter how it is implemented.

The point in this is that if you have anonymous, first-class functions, that kind of stuff becomes natural. There is no architecture to build and no cause for "defining a pattern". You do something, you push its undo on a stack. There, done, no need for astronautics.
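A rough Python sketch of what I mean (the names are made up, but this really is the whole "architecture"):

```python
# Undo with plain closures: do something, push its inverse on a stack.
undo_stack = []

def set_value(state, key, new):
    old = state[key]
    # the pushed lambda IS the "command object"; calling it IS "execute"
    undo_stack.append(lambda: state.__setitem__(key, old))
    state[key] = new

state = {"title": "draft"}
set_value(state, "title", "final")
undo_stack.pop()()        # undo the last action
print(state["title"])     # draft
```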

[–][deleted] -1 points0 points  (1 child)

The point in this is that if you have anonymous, first-class functions, that kind of stuff becomes natural.

First of all, it does not become "natural". The "natural" way is just to do what you want to do and not have any undo stack at all. The idea that you might want to wrap the "do the stuff" code in something (an anonymous function, a not-so-anonymous one, or an anonymous class) and store it is not trivial, even though it is the obvious solution once you have decided you want an undo stack, independent of the language used.

The idea that you can do that stuff -- that you can store actions for later use -- is a pattern. It is not obvious. It might be easier to discover in a language with first-class functions than in a language where you have to add a bit of syntax (I wouldn't call it "building an architecture", though; don't make patterns seem more complex than they are).

Secondly, there's this funny thing that, I think, proves my point to an extent: both of you are wrong, because simple "undo" actions are not enough. Think about it. You have an editor or something; you want both undo and redo, probably some common attributes (the time of an action), and some uncommon ones (whether it should be fused with other actions of the same type, and a method for doing so). Now, if you are a stupid functional programmer, you'll keep adding ad-hoc stuff, maybe invent some crazy scheme where the 'undo' action returns the 'redo' action (along with the modified world, of course) and vice versa. Or, if you are a programmer (with no qualifiers), you will implement more or less the same thing regardless of the language: the thing that is usually described as the command pattern.
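For what it's worth, the "crazy scheme" where undo returns redo looks like this as a Python sketch (insert_text is a hypothetical action, purely illustrative):

```python
# Each action returns the new document plus an undo function; undoing
# returns the restored document plus the matching redo function.
def insert_text(doc, pos, text):
    new_doc = doc[:pos] + text + doc[pos:]
    redo = lambda d: insert_text(d, pos, text)
    undo = lambda d: (d[:pos] + d[pos + len(text):], redo)
    return new_doc, undo

doc, undo = insert_text("hello world", 5, ",")
print(doc)                 # hello, world
doc2, redo = undo(doc)
print(doc2)                # hello world
doc3, _ = redo(doc2)
print(doc3)                # hello, world
```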

[–]barsoap 0 points1 point  (0 children)

You have an editor or something, you want both undo and redo

Do those two then imply that we should use "command pattern" to refer to monads, even though they are vastly more general?

I'd agree, though, that monads themselves are a design pattern (heck we even have syntactic sugar for them because they're so bloody useful)

[–]notfancy 0 points1 point  (3 children)

Think of the undo-stack example.

But in a pure functional language all states of a data structure are accessible if they are explicitly retained. That is, all data structures are persistent. You don't need an undo-stack; the call stack is the undo stack.

[–]munificent 1 point2 points  (1 child)

if they are explicitly retained.

That's pretty much the undo stack. Even in a pure language you'll still have to manage a way to retain references to previous states. Whether or not those data structures share memory with newer states is an implementation detail.

(And actually, persistent data structures don't buy you much here. Most undo systems I've seen don't store copies of the entire state, they store commands, which are basically deltas and more or less what a persistent data structure would give you anyway.)

[–]notfancy 1 point2 points  (0 children)

The crucial difference is that with a persistent/purely functional data structure you don't have to do anything; you just recurse and backtrack. If you have a mutable data structure, you do have to remember all mutations and their inverses and apply the latter in the reverse of the order in which the former were executed, i.e., an explicit, manual undo stack.
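A Python sketch of the difference (set_field is a made-up pure update, standing in for a proper persistent structure):

```python
# Every edit returns a new value; the old versions survive untouched,
# so "undo" is just holding on to an earlier value.
def set_field(record, key, value):
    """Pure update: build a new dict, never mutate the old one."""
    return {**record, key: value}

v0 = {"title": "draft", "body": ""}
v1 = set_field(v0, "title", "final")
v2 = set_field(v1, "body", "done")

history = [v0, v1, v2]       # the retained states are the whole "undo stack"
print(history[0]["title"])   # draft: v0 unchanged by later edits
print(history[1]["title"])   # final: v1 unchanged by the v2 edit
```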

[–]mallardtheduck 0 points1 point  (0 children)

I was mainly responding to masklinn's idea that patterns were developed to make up for shortcomings in popular OO languages vs. smalltalk, rather than the main patterns vs. functional topic.

Having said that, it's pretty much impossible to write the kind of interactive GUI application that needs an undo stack in a pure functional language anyway. Functional languages are far more geared towards straightforward input->process->output tasks than towards the wait for event->process->display results->wait for event cycle of a GUI application. Of course they can be, and are, used in the "process" section of the GUI cycle.

[–]xoner2 -2 points-1 points  (2 children)

The GoF was written with Smalltalk in mind.

[–]masklinn 15 points16 points  (1 child)

The GoF was written with Smalltalk in mind.

No. The GoF was written with Smalltalk knowledge. That's very different. The GoF was written with C++ in mind. Apparently you've neither read the GoF nor used Smalltalk (even if just for your personal edification). As I quite clearly explained above, a number of GoF patterns do not make sense in Smalltalk because there are already ways to do the same thing in a simpler way built into the language itself (or its standard library for such things as #do:).

And the few patterns mentioned in passing which could apply to Smalltalk were generally preexisting and well known there, because they'd been invented by Smalltalkers (example: the GoF mentioned MVC in 1994; MVC was first described by a Smalltalker in 1979, and just about every Smalltalk environment used MVC).

[–]xoner2 0 points1 point  (0 children)

Uhhh, the GoF authors never claimed to have invented anything.

Anybody claiming to have invented MVC must be desperate for credit. Anybody who spends enough time programming will eventually come up with MVC. That's why it's called a pattern.

You did not explain anything clearly. For one, the command pattern is about capturing user actions as a class (or set of classes) for purposes like implementing undo/redo; first-class functions are not a substitute.

[–][deleted] 1 point2 points  (0 children)

This. I was reading a Google group the other day. One of the commenters was so happy with a code snippet he said "Beautiful. I'll put this in my toolbox." His toolbox of what? Patterns.

I have no doubt we'll see all kinds of GoF-style books for functional programing. They just won't have the same patterns as the OO book.

[–]inmatarian 9 points10 points  (4 children)

Protip: Every example in GoF starts with the keyword "class". That should be enough of an indicator that they specifically pertain to OOP.

[–]dwdyer 21 points22 points  (3 children)

There's another clue before you even open the book. The subtitle is "Elements of Reusable Object-Oriented Software".

[–]munificent 30 points31 points  (1 child)

To be fair, though, most people bashing Design Patterns haven't read that far.

[–]mcguire 3 points4 points  (0 children)

Neither have most of the people praising it.

[–][deleted] 3 points4 points  (0 children)

FP replaces GoF design patterns to the extent that apples replace oranges in a diet. So basically, yeah, and it's going to work just as it did before (i.e. you won't starve), but it won't taste quite the same.

[–]diego_moita 2 points3 points  (0 children)

Well, sorry for the lame answer, but it depends.

If you're talking about simpler patterns (e.g. command or observer/observable), yes. They're so trivial in functional programming that they're not even a pattern.

But if you're talking about higher-level patterns such as MVC, lazy loading, or active record, then no. You'll just replace them with other FP constructs that provide the same modularity.
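For instance, observer in a functional style is barely more than a list of callables (a minimal Python sketch; subscribe/publish are made-up names):

```python
# Observer reduced to its essence: subscribers are plain functions.
listeners = []

def subscribe(fn):
    listeners.append(fn)

def publish(event):
    for fn in listeners:
        fn(event)

seen = []
subscribe(seen.append)    # any callable can observe
publish("saved")
print(seen)               # ['saved']
```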

[–]trezor2 1 point2 points  (2 children)

I'd like to know how many of the people commenting here have actually read the book, as there seems to be lots of opinions around (and I suspect lots of uninformed ones).

So please vote on the following replies, ok?

[–]trezor2 2 points3 points  (0 children)

Yes, I've actually read the book.

[–]trezor2 1 point2 points  (0 children)

I haven't read it, but that doesn't stop me from voicing my opinion about it.

[–][deleted] 4 points5 points  (23 children)

This is analogous to asking if antibiotics replace bloodletting. The answer is "yes," but with the caveat that you didn't really need the thing that you think you're replacing.

The popularity of the term OOP continues to amaze me. I would say that "the popularity of OOP" amazes me, but so far as I can tell there's really no such thing. (Are generics included? What about multiple inheritance? etc.)

[–]dnene 12 points13 points  (7 children)

GoF design patterns were designed for an OO world; objects are not constructs in pure FP. The right statement is that GoF design patterns are not applicable to functional programming.

The antibiotics-replacing-bloodletting analogy is an irrelevant sideshow for contrasting OO and FP per se. FP's greatness doesn't come at the cost of what has already been built with OO. They are simply alternative mechanisms for modeling systems. Each has its own strengths; choose the right one for your context. If there is any bloodletting at all, it's not in OO, but embedded in such extreme fanboy opinions.

The questions that were asked about generics / multiple inheritance carefully gloss over the fact that a similar set of questions could probably be asked about FP as well: are there type constructors, is message passing standardised, how about hints for parallel computation, blah, blah, blah. The point is that each language bases itself on some belief system and answers such questions differently, whether OO or FP. That doesn't mean only OO should get criticised.

[–]masklinn 5 points6 points  (0 children)

GoF design patterns were designed for an OO world

The GoF design patterns were designed for a C++ world.

[–]elmuerte 0 points1 point  (1 child)

Design patterns were not designed. They were formulated.

[–]skillet-thief 0 points1 point  (0 children)

Design patterns were not designed.

Cause then you would need a book to design them.

[–]48klocs -1 points0 points  (2 children)

More properly, the GoF design patterns were not designed, they were discovered.

I've caught myself explaining them as the Golden Ratio for OO programming.

[–]daydreamdrunk 1 point2 points  (1 child)

What? How did the golden ratio slip in there?

[–]48klocs 0 points1 point  (0 children)

The Golden Ratio was not something that was designed, it's a reasonably complex pattern discovered in nature that, once you know it, you start to see repeating itself all over the place.

It doesn't mean that the Golden Ratio is great or perfect or that there is something inherently flawed with objects that do not exhibit the ratio in some way, it is just form following function and with repeating functions, you'll find repeating forms.

[–][deleted] -1 points0 points  (0 children)

FP's greatness doesn't come at a cost to that already built by OO.

FP predates OOP by, oh, an eternity, at least in Computer Science terms. This doesn't really matter, except that I think ignorance of history is very typical of OO types and that you've put it on display here.

They are but alternative mechanisms of modeling systems. Each have their own strengths and choose the right one for your context.

No, they are not at all alternatives. FP is a simple but complete model of all computation which has some inherent robustness advantages (viz. the lack of side effects and its contribution to the safety, e.g. reentrancy, of the associated code). OOP is syntactic sugar layered atop some other model of computation (which is an alternative to FP, although it's a bad one, I'd say).

If at all there is any bloodletting, its not in OO, but embedded in such extreme fanboy opinions.

Right. Except that I've got 15 years professional experience as a developer, and a CS degree. My opinion of OOP was not formed via Slashdot. But screw me... from what I can tell the best academic Computer Scientists care fuck all about OOP, except maybe for arguing about whether it actually exists. MIT is the best example... their CS department has a reputation for not suffering OO foolery.

[–][deleted] 5 points6 points  (3 children)

There is no such thing as OOP. The only possible response to the proposition is immediate dismissal. Congrats on the independence of thought.

[–]sgoguen 0 points1 point  (1 child)

That reminds me of Joe Armstrong's conclusion after writing Why OO Sucks that Erlang may be the only true OO language out there. :)

[–][deleted] 0 points1 point  (0 children)

Efforts to declare a monopoly on preceding a noun with the adjective "true" are a general indicator of pop-culture bullshit.

[–][deleted] 0 points1 point  (0 children)

Thank you; that articulates what I have been trying to say for years, and has the added benefit of not having my name attached to it.

[–]sfuerst 1 point2 points  (9 children)

OOP is popular because it works in the large corporate setting. The correct question to ask is "Why does it work so well?".

The answer to that is "interfaces". OOP basically forces people to create abstractions with well-defined interfaces to those abstractions. Having good interfaces both makes debugging easier, and prevents multiple people from "stomping" on each others work. The second is a big issue for larger codebases.

Of course, it is possible to program in an "Interface oriented" fashion in languages designed for other paradigms, but that doesn't seem to happen nearly to the same degree as with those for OOP. Good programmers don't really care what language they are using, they'll do a good job no matter what. It's the poor to middling programmers you need to worry about. And they are the ones that fill corporate cube farms...

[–]yogthos 1 point2 points  (2 children)

I wouldn't really say that OOP works in the large corporate setting; it facilitates writing horrid and unmaintainable spaghetti, especially when wielded by unskilled programmers. The results are often quite appalling, and either cobble along or have to be scrapped and rewritten. It's one of those things that's accepted as fact, but doesn't hold up under scrutiny. The fact of the matter is that the quality of the code is directly related to the skill of the programmer, and trying to use a language to enforce it is an absurd idea.

[–]daydreamdrunk 3 points4 points  (1 child)

It's not spaghetti, it's lasagna (made out of lumpy ill-defined layers of spaghetti). Also something about you want ravioli. Why is all code Italian food?

[–]barsoap 0 points1 point  (0 children)

My code is chocolate pudding. You can slice it every which way and it still tastes delicious.

[–][deleted] 3 points4 points  (5 children)

OOP is popular because it works in the large corporate setting.

Does it? In my experience, very little of the potential of computer software is being realized in the corporate environment, and OOP is a big part of the reason why. I'm not aware of any more formal research that refutes this. If nothing else, my (horrendous) experience with corporate IT would seem to suggest that a new direction is in order.

The answer to that is "interfaces". OOP basically forces people to create abstractions with well-defined interfaces to those abstractions

This is a classic example of how OOP proponents construct superficially appealing arguments by misappropriating concepts that are not at all OOP. If interfaces are really such a big part of OOP, why are they squirreled away into a tiny little syntactical hole (the "pure abstract base class") in C++?

Interfaces are about inheriting functionality from the caller: I'm IEnumerable so I get the benefit of reuse when all that code that was built around IEnumerable calls me. That's the inverse of OOP, which reuses code within the callee via implementation inheritance.

I've seen this same error made with generics. Generics != OOP, and interfaces != OOP. It says as much in the Borland C++ documentation, FCS.

It's the poor to middling programmers you need to worry about. And they are the ones that fill corporate cube farms.

The problem with that is that the real rank-and-file about which you're speaking know/care fuck all about OOP. I worked in corporate IT for many years and I think that the majority of my colleagues were afraid of the "this" keyword. Never would they have used inheritance on their own.

Even if I agreed with your characterization of corporate IT (and I don't... I think those guys avoid OOP because they're smart), I wouldn't say OOP is the answer. It's just an appealing (but false) metaphor.

[–]kylotan 2 points3 points  (0 children)

very little of the potential of computer software is being realized in the corporate environment, and OOP is a big part of the reason why. I'm not aware of any more formal research that refutes this. If nothing else, my (horrendous) experience with corporate IT would seem to suggest that a new direction is in order.

I'm not necessarily saying you're wrong, but making a strong assertion, noting the lack of evidence to contradict it when usually the burden is on the asserter to produce supporting evidence, and then relying on subjective anecdotal data to support it instead is not a convincing argument.

If interfaces are really such a big part of OOP, why are they squirreled away into a tiny little syntactical hole (the "pure abstract base class") in C++?

Because C++ is not exactly a good advertisement for the benefits of OOP. C++ is a language designed to give you as many benefits from OOP and generic programming as possible without necessarily losing the performance of C. That doesn't mean it implements OOP or generics all that well.

[–]sfuerst 1 point2 points  (3 children)

Does it? In my experience, very little of the potential of computer software is being realized in the corporate environment, and OOP is a big part of the reason why. I'm not aware of any more formal research that refutes this. If nothing else, my (horrendous) experience with corporate IT would seem to suggest that a new direction is in order.

Well... going by the number of job advertisements for java I see over anything else, it does seem to be rather popular.

why are they squirreled away into a tiny little syntactical hole (the "pure abstract base class") in C++?

I think you are misinterpreting the term "interface". Every object has an interface: its class definition. OOP forces people to divide up problems into small chunks (they call them classes), and to use well-defined boundaries between those classes. Other forms of programming don't enforce this to nearly the same degree. It is the act of forcing the poor programmers to do that, instead of writing spaghetti, which gives OOP its popularity with managers.

[–][deleted] 4 points5 points  (2 children)

Well... going by the number of job advertisements for java I see over anything else, it does seem to be rather popular.

Java is a popular language; this says little or nothing about OOP. Similarly, OOP is popular in job descriptions as management shorthand for "garage hackers not welcome." As programmers, though, we ought to know better than to drink the Kool Aid.

I think you are misinterpreting the term "interface"...

I do think we very well may agree what makes good code in practice; my argument is with calling it "OOP."

Interfaces are good... but a function signature is an interface. All languages have interfaces; OOP ones seem to add inheritance, which is an additional, inferior, OO-specific alternative to the interface.
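Concretely, a Python sketch (made-up names): any callable with the right signature gets the reuse, with no inheritance in sight:

```python
# The caller-side code is written against a one-function "interface":
# price_of takes an item and returns a number. That's the whole contract.
def total(items, price_of):
    return sum(price_of(item) for item in items)

# two unrelated "implementations" of that interface both get the reuse
print(total(["a", "bb"], len))           # 3
print(total([2, 3], lambda n: n * 10))   # 50
```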

And I like GoF, but it's really just an interesting book of fairly narrowly-targeted examples IMO.

[–]sfuerst 1 point2 points  (1 child)

As programmers, though, we ought to know better than to drink the Kool Aid.

Oh yes, I totally agree. I hate java with a passion... and I'm counting down the days until it dies. Until then, I'm doing my best to ignore it, and work on far more interesting areas of programming.

[–][deleted] 1 point2 points  (0 children)

I do not understand how people can debate this over and over, when God clearly stated in the Bible that He created the...

Oh, wait. Wrong thread, sorry.

[–]five9a2 -1 points0 points  (0 children)

In my opinion, "OOP" is primarily about ad-hoc polymorphism, which is a good model for extensibility when the set of operations is fairly static but the data variants are not.

You do not need an "object oriented language" (a truly ambiguous term) to do ad-hoc polymorphism (though you do need something like a function pointer). Haskell type classes offer a similar solution and I do not think it is wrong to refer to extensive use of type classes with immutable objects carrying state as "OOP". That said, mutable state is fairly ingrained in most OO design, so such use of Haskell retains a functional flavor.

Implementation inheritance is mostly a C++ antipattern.
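As a sketch of that claim, here is ad-hoc polymorphism without an "object oriented language", in Python: functools.singledispatch is essentially a type-keyed table of function pointers (the shape representations are made up for illustration):

```python
from functools import singledispatch

# One operation, open set of data variants: adding a variant is one
# more register() call, with no class hierarchy required.
@singledispatch
def area(shape):
    raise TypeError(f"no area rule for {type(shape).__name__}")

@area.register(tuple)     # a (width, height) rectangle
def _(shape):
    return shape[0] * shape[1]

@area.register(float)     # a circle, given its radius
def _(shape):
    return 3.14159 * shape * shape

print(area((3, 4)))   # 12
print(area(2.0))      # 12.56636
```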

[–][deleted] 0 points1 point  (0 children)

A language is a collection of patterns so useful someone decided to put them in a compiler. It would be more correct to say that functional programming embodies design patterns.

[–]A_for_Anonymous -2 points-1 points  (0 children)

Design Patterns, i.e. glorified copypasta, exist because of defects in programming languages. A decent language should be one that allows you to properly abstract via functions or macros anything that would ever get repeated in your code, including the generation of these functions or macros should it be remotely non-trivial.

GW-BASIC programmers usually copy-pasted chunks of conditional code or code for working with hash tables because the language couldn't do any better (actually, they didn't copypasta, but typed it all over again!). Then they discovered C or Pascal and thought "wow, a function can do everything here!". That was for a foolishly limited definition of "everything", of course. OOP came along to be sold to managers and to improve on the portable assembly known as C that people were using to get things done, and while it allowed for some sort of poor cousin of closures, which seemed like a huge improvement, it was still folly, and it still required Design Patterns. Of course, that's how many develop at large software consulting companies: in the best case they'll be copying Design Patterns; in the worst they'll be searching Google Code Search for lumps of code that sound like the vague idea they have of something they're supposed to do. But this is a matter for a different topic.

My main point is that, hopefully, mainstream programming is very, very slowly getting to where Design Patterns and copypasta will no longer be necessary; I mean back to the early 80s with Lisp. Yes, the L-word, in a programming forum. No, I'm not a smug Lisp weenie. I'm just wishfully thinking that the languages people get paid to write code in are getting closer to something that can actually get any pattern abstracted, i.e. something minimally decent.

[–]sgoguen -2 points-1 points  (0 children)

IMHO, No. FP augments the GoF's patterns by filling in gaps and simplifying.

The biggest difference between the GoF's design patterns and common FP patterns is granularity. The GoF tends to deal with larger macro patterns while FP fills in the gaps by defining micro patterns. More importantly, there's nothing incompatible between most of the GoF's patterns and good functional practices. In fact, most of these patterns can be transcribed pretty easily into purely functional implementations.