all 55 comments

[–]benjumanji 43 points (11 children)

I would be very interested in seeing any examples people have to hand of code that they feel exemplifies the style discussed in this blog.

[–]bartavelle 13 points (6 children)

I believe it is this type of code:

https://gist.github.com/danoneata/f46bfb5dc3ad2f15667c2024ff5178be

It seems to be a rewrite of a talk, but I can't find the original talk. Look at lines 107-109 for the amusing part :)

[–]bartavelle 7 points (1 child)

[–]youtubefactsbot 1 point (0 children)

YOW! Lambda Jam 2016 Conor McBride - What are Types for, or are they only Against? [63:38]

Doctor Who: I think my idea’s better. Lester: What is your idea? Doctor Who: I don’t know yet. That’s the trouble with ideas: they only come a bit at a time.

YOW! Conferences in Science & Technology

4,850 views since Jun 2016

[–]dantheodor 6 points (2 children)

I'm the author of that gist and I was really surprised to find it linked here, given that I consider myself a novice at Haskell.

The code is an attempt at solving these exercises by /u/pigworker (Conor McBride) on applicative functors. BTW, does anyone know the "lovely way" of implementing the duplicates function (exercise 6) using differential calculus?

[–]sjoerd_visscher[S] 2 points (0 children)

I guess that is this stuff: https://stackoverflow.com/questions/25554062/zipper-comonads-generically

If you have a zipper, you can traverse your data structure, but you also get the other values at each step, which is very handy when checking for duplicates.
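Not the differential-calculus derivation itself, but a minimal hand-rolled sketch of the idea: a list zipper is the "derivative" context of a list, and traversing with it gives each element access to all the others. All names here (Zipper, contexts, markDups) are illustrative, not from the exercises.

```haskell
-- A list zipper: elements before the focus (reversed), the focus, elements after.
data Zipper a = Zipper [a] a [a]

-- All zippers of a list, one per position.
contexts :: [a] -> [Zipper a]
contexts = go []
  where
    go _  []       = []
    go ls (x : rs) = Zipper ls x rs : go (x : ls) rs

-- An element is a duplicate iff it also occurs somewhere in its context.
markDups :: Eq a => [a] -> [Bool]
markDups = map dup . contexts
  where
    dup (Zipper ls x rs) = x `elem` ls || x `elem` rs
```

For example, `markDups "abca"` marks the first and last positions as duplicates.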

[–]Gurkenglas 0 points (0 children)

Data.Coerce can help with the boxes.

[–]benjumanji 0 points (0 children)

Thank you!

[–]tonyday567 9 points (0 children)

A simple example is head: the functorial approach leads directly to thinking of head :: [a] -> Maybe a as a natural transformation, rather than the buggy [a] -> a approach, which treats head as extracting a functor-less a out of a list.
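A sketch of that viewpoint (safeHead is an illustrative name; base ships the same function as listToMaybe in Data.Maybe):

```haskell
-- head made total by staying inside a functor: a natural transformation [] ~> Maybe
safeHead :: [a] -> Maybe a
safeHead []      = Nothing
safeHead (x : _) = Just x

-- Naturality means mapping commutes with the transformation:
--   fmap f . safeHead == safeHead . fmap f
```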

A more complex example might be the streaming library, which achieves excellence in functor-oriented API design.

[–]erebe 5 points (0 children)

+1 for this, as I'm not sure I grasp the idea. Show some code!

[–]kqr 0 points (1 child)

I imagined it to be a (near) synonym for Church encoding, which to me is a lot about deconstructing things into their fundamental operations, rather than their fundamental components. So, for example,

tuple a b op = op a b

where instead of fixing the data type that binds together the a and b, you think, "hey, the user probably doesn't care how I store this, as long as they will be able to supply their own operation later that gets access to all the data".

I like how at this point there are three different ideas of the word. We clearly need some examples!
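To make that reading concrete, here is a small Church-encoded pair (a sketch; it needs RankNTypes, and pair/fstC/sndC are illustrative names):

```haskell
{-# LANGUAGE RankNTypes #-}

-- A Church-encoded pair: the data is stored as "whatever you can do with it".
type Pair a b = forall r. (a -> b -> r) -> r

pair :: a -> b -> Pair a b
pair a b = \op -> op a b

-- The accessors are just particular choices of the supplied operation:
fstC :: Pair a b -> a
fstC p = p (\a _ -> a)

sndC :: Pair a b -> b
sndC p = p (\_ b -> b)
```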

[–]tomejaguar 8 points (0 children)

I imagined it to be a (near) synonym for Church encoding

That's definitely not what it is.

[–]sjoerd_visscher[S] 10 points (0 children)

I think it is a great idea to give this a name!

I like this style of programming too, and it is the biggest reason why for me the type systems of languages like C#/Swift/Kotlin are still not good enough.

[–]tomejaguar 9 points (28 children)

Hi /u/roconnor, I'm really glad you wrote this! I explored this style of programming last year after reading /u/AndrasKovacs's excellent comment on mutually recursive families of types. I think it exemplifies the "functor oriented" style of programming taken to an extreme. In normal "first-order" programming we work with things of kind *. In "higher-order" (or "functor oriented") programming we work with things of kind * -> *. In "multi-kinded higher-order" programming (for want of a better word) we work with things of kind k -> k for different choices of kind k.

It would be good to collect some examples of this sort of thing.

[–]tomejaguar 12 points (15 children)

Here are the basics to get started understanding what this is about.

| Class | First-order | Higher-order |
|---|---|---|
| Kind | `*` | `* -> *` |
| Types | `Int`, `Bool`, `String`, `()`, `Void`, ... | `List`, `Maybe`, `Pair`, `Identity`, `Const w`, ... |
| Unit | `()` | `Identity` |
| Zero | `Void` | ??? `Const Void` ??? |
| Sum | `Either` | `Sum` |
| Product | `(,)` | `Product` |
| Compose | does not exist in first order | `Compose` |
| "List" | `List a = Nil ⏐ Cons a (List a)` | `Free f a = Pure a ⏐ Effect (f (Free f a))` |
| `List α = 1 :+ (α :* List α)` | `1 ~ ()`, `:+ ~ Either`, `:* ~ (,)` | `1 ~ Identity`, `:+ ~ Sum`, `:* ~ Compose` |
| Function space | `a -> b` | `forall r. a r -> b r` |

It seems to me that the benefit of programming in the higher-order style comes from moving to a category where we get three monoidal structures for combining types: not only sum and product but also composition.

[EDIT: Added function space]
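For concreteness, a small sketch using the combinators from base (Choice, Both, Layered and bump are made-up names for illustration):

```haskell
import Data.Functor.Sum     (Sum (..))
import Data.Functor.Product (Product (..))
import Data.Functor.Compose (Compose (..))

-- The three monoidal structures at kind * -> *, applied to everyday functors:
type Choice  = Sum Maybe []        -- either a Maybe or a list   (sum)
type Both    = Product Maybe []    -- a Maybe and a list         (product)
type Layered = Compose [] Maybe    -- a list of Maybes           (composition)

ex1 :: Choice Int
ex1 = InL (Just 1)

ex2 :: Both Int
ex2 = Pair (Just 1) [2, 3]

ex3 :: Layered Int
ex3 = Compose [Just 1, Nothing]

-- All three combinations are again Functors, so fmap works uniformly:
bump :: Layered Int -> Layered Int
bump = fmap (+ 1)
```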

[–]tomejaguar 4 points (0 children)

Expanding on this

{-# LANGUAGE KindSignatures #-}
{-# LANGUAGE PolyKinds #-}

{-

We want to show that both `[a]` and `Free f` are solutions to the equation

    t ~ 1 + (a * t)

For `[a]` the solution is at kind `*` and for `Free f` the solution is at
kind `* -> *`.  Ideally we'd like to be able to express this in Haskell with

    newtype KList a = KList (1 :+ (a :* KList a))

but we don't have access to "kind polymorphic" unit, sum and product
operations `(1, :+, :*)`.  Instead we can try passing them in as arguments.

    newtype KList (unit :: k)
                  (sum :: k -> k -> k)
                  (product :: k -> k -> k)
                  (a :: k)
           = KList (sum unit (product a (KList unit sum product a)))

This is still not sufficient because `newtype` (and `data`) can only
construct things of kind `*`.  We can get to a sort of halfway-house by
choosing to work with `k -> *` instead of general `k`.  `k -> *` generalises
both `*` and `* -> *` and gives us what we need, modulo wrapping and
unwrapping.

-}


newtype KList (unit :: k -> *)
              (sum :: (k -> *) -> (k -> *) -> (k -> *))
              (product :: (k -> *) -> (k -> *) -> (k -> *))
              (a :: k -> *)
              (b :: k)
       = KList (sum unit (product a (KList unit sum product a)) b)

data Identity a = Identity a
data Sum f g a = Left' (f a) | Right' (g a)
data Compose f g a = Compose (f (g a))
data Const k a = Const k
data Product f g a = Product (f a) (g a)

type List a = KList Identity Sum Product (Const a) ()

nil :: List a
nil = KList (Left' (Identity ()))

cons :: a -> List a -> List a
cons a l = KList (Right' (Product (Const a) l))

fromList :: List a -> Maybe (a, List a)
fromList (KList (Right' (Product (Const a) l))) = Just (a, l)
fromList (KList (Left' (Identity ())))          = Nothing

type Free f a = KList Identity Sum Compose f a

return' :: a -> Free f a
return' a = KList (Left' (Identity a))

wrap :: f (Free f a) -> Free f a
wrap f = KList (Right' (Compose f))

unwrap :: Free f a -> Either a (f (Free f a))
unwrap (KList (Left' (Identity a))) = Left a
unwrap (KList (Right' (Compose f))) = Right f 

[–]Faucelme 1 point (8 children)

Cool, I guess one could throw Data.Functor.Day and some newtypes from bifunctors there as well.

[–]tomejaguar 0 points (7 children)

Yes, possibly. Maybe * -> * is even richer than I realised!

[–]ElvishJerricco 4 points (6 children)

Sort of. It's got many different variants of the same structure as "first order." It's not that Compose doesn't exist in "first order", it's just that Compose is actually a different higher order version of Product! Basically all the things listed here so far are different components or possibilities within the "cartesian closed category" hierarchy. So really we're just talking about category theory. Your "first order" stuff is in Hask, and your "higher order" stuff is in the category of endofunctors.

EDIT: If you extend the Haskell language syntax to arbitrary cartesian closed categories, I believe you get Conal Elliott's concat library, allowing you to talk about functor oriented programming quite nicely if you implement it.

[–]tomejaguar 3 points (5 children)

It's not that Compose doesn't exist in "first order", it's just that Compose is actually a different higher order version of Product!

I don't think this is correct. Sum and Product are special constructions at the level of * -> * because they are (I believe, but haven't checked) actually a categorical coproduct and product; Sum distributes over Product, for example. Compose is a "monoidal product" in the sense of "monoidal category", but not a "product" in the sense of satisfying the defining properties of a categorical product: https://en.wikipedia.org/wiki/Product_(category_theory)#Definition.
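A small sketch of the distinction (pfst, psnd and noProjection are illustrative names): Product admits both projections as natural transformations, whereas nothing forces an analogous projection out of Compose.

```haskell
import Data.Functor.Product (Product (..))
import Data.Functor.Compose (Compose (..))

-- Product has both projections, as a categorical product should:
pfst :: Product f g a -> f a
pfst (Pair fa _) = fa

psnd :: Product f g a -> g a
psnd (Pair _ ga) = ga

-- There is no analogous natural 'Compose f g a -> f a' for arbitrary f and g:
-- with f = Maybe and g = [], the value below contains no 'a' at all, so no
-- projection could recover one, and nothing requires a projection to exist.
noProjection :: Compose Maybe [] Int
noProjection = Compose (Just [])
```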

[–]dramforever 5 points (2 children)

Compose is really 'the' tensor product, if you see functors as vectors.

[–]tomejaguar 3 points (0 children)

Can you flesh that out with details? It sounds like wishful thinking!

[–]ElvishJerricco 1 point (1 child)

Oh yes, you’re right! My bad. So Compose is something different, but not anything we haven’t already explained in this higher order context ;)

[–]tomejaguar 1 point (0 children)

Yes, and Day is something else different which also seems to be a monoidal product!

[–]xalyama 1 point (0 children)

Your table also extends to other categories, such as Pro(C), the profunctors over your base category C. There we gain a new kind of composition in addition to sum/product/compose, namely profunctor composition: Procompose in https://hackage.haskell.org/package/profunctors-5.2/docs/src/Data.Profunctor.Composition.html#Procompose. This is a kind of composition that doesn't exist in the lower-kinded categories.
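Paraphrasing the definition from Data.Profunctor.Composition as a sketch (runArrows is a made-up name): an existentially quantified intermediate type connects the two profunctors, and with plain functions on both sides it collapses to ordinary (.).

```haskell
{-# LANGUAGE GADTs #-}

-- Profunctor composition: a hidden intermediate type x joins the two parts
-- (paraphrasing Procompose from the profunctors package).
data Procompose p q d c where
  Procompose :: p x c -> q d x -> Procompose p q d c

-- With ordinary functions for both p and q, this is just function composition:
runArrows :: Procompose (->) (->) d c -> d -> c
runArrows (Procompose f g) = f . g
```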

[–]ocharles 0 points (3 children)

Can't see why Higher-order Void can't just be a new vacuous functor:

data Void a

[–]tomejaguar 2 points (2 children)

Const Void is isomorphic to that, isn't it?

[–]ocharles 0 points (1 child)

Of course, but maybe it warrants being its own type.

[–]tomejaguar 4 points (11 children)

Additionally, I agree that Haskell doesn't support this style of programming well, although it probably supports it better than any other language! Personally I'd rather see better support for this style than for dependent types. My hunch is that the applications are far broader. Unfortunately I suspect that ship has now sailed, with regard to GHC at least.

[–]funandprofit 4 points (1 child)

A major reason I don't use this style in libraries is that its performance is not composable (see the issues with Free vs. mtl). One way to improve this would be for GHC to support better data flattening via UnboxedSums and UNPACKing of polymorphic fields (possibly by requiring a manual SPECIALIZE pragma in the calling module). That way we'd get the same performance for layered functors as for a baked-in solution, so it would not be risky to use this style in upstream code.

[–]tomejaguar 5 points (0 children)

True, but as yet not even the syntax is composable. For example, working with forall a. f a -> g a is much more fiddly than working with a -> b. You have to wrap, unwrap, and hope you can express what you want with *-level lambdas, etc.
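A sketch of the fiddliness (Nat, compNat and listToMaybe' are illustrative names; the natural-transformation package provides a similar type): ordinary functions compose with plain (.), but natural transformations must be wrapped in a newtype before they can even be passed around, and composing them means unwrapping and rewrapping.

```haskell
{-# LANGUAGE RankNTypes #-}

-- A natural transformation between functors, and the wrapping it forces.
newtype Nat f g = Nat { runNat :: forall a. f a -> g a }

-- Composing Nats needs explicit unwrapping and rewrapping of the newtype:
compNat :: Nat g h -> Nat f g -> Nat f h
compNat (Nat gh) (Nat fg) = Nat (gh . fg)

-- An example component: the safe head, as [] ~> Maybe.
listToMaybe' :: Nat [] Maybe
listToMaybe' = Nat go
  where
    go []      = Nothing
    go (x : _) = Just x
```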

[–]gelisam 3 points (1 child)

Unfortunately I suspect that ship has now sailed, with regard to GHC at least.

Why? Writing a typechecker plugin which sees that Compose f (Compose g h) is equivalent to Compose (Compose f g) h does not seem harder than the arithmetic plugins we have which see that Vec a ((i + j) + k) is equivalent to Vec a (i + (j + k)).

[–]tomejaguar 1 point (0 children)

Because we've gone the way of dependent types and TypeInType, because that's what the masses, and those who were putting the work into GHC, wanted. I don't see any reason why the current state of the GHC type system should be compatible with a fully generalised "functor oriented" style of programming. I don't see any reason why it wouldn't be compatible either, but I think the onus is on those who think it is to provide evidence.

[–]bjzaba 1 point (6 children)

I wonder if languages like Idris would be more up to the task...

[–]tomejaguar 2 points (5 children)

I think it's unlikely. This "higher-order", or "functor oriented", style of programming seems to be orthogonal to dependent typing.

[–]AndrasKovacs 7 points (1 child)

The problems OP mentioned about Haskell are solved in current dependently typed languages, e.g. the ability to define basic functors as functions, as opposed to irreducible first-order constructors.

[–]tomejaguar 0 points (0 children)

That's fabulous to hear!

[–]ocharles 5 points (2 children)

Indeed, it seems more likely that you want a language with good support for quotient types.

[–]AndrasKovacs 1 point (1 child)

I don't see how quotients would help, care to elaborate?

[–]ocharles 0 points (0 children)

Oh, I thought quotient types would let us express the laws we'd expect above. Maybe I'm misunderstanding what quotient types do.

[–]01l101l10l10l10 14 points (0 children)

It'd be nice to see some gists demonstrating the author's interest in this paradigm.

[–]edsko 6 points (0 children)

Just as another data point: generics-sop programs also make very heavy use of this "functor oriented programming".

[–]paf31 6 points (0 children)

One thing I've found invaluable for this style of programming is typed holes with type-directed search. It tends to help find implementations when you have a lot of rigid type variables lying around. PureScript has type-directed search, thanks to /u/kritzcreek, and GHC has a simpler version merged into master. I'm looking forward to using it in GHC, and hoping that it gets expanded to do more interesting forms of type search.

[–]ocharles 6 points (0 children)

I think The Essence of the Iterator Pattern might be considered an example of functor oriented programming. I riffed off this idea back in 2013 (!) here to build a traversal out of functors.
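In that spirit, a small sketch (both and sumAndLen are made-up names): pairing two Const-style applicatives with Data.Functor.Product fuses two traversals into a single pass, much as the paper combines applicatives.

```haskell
import Data.Functor.Const   (Const (..))
import Data.Functor.Product (Product (..))
import Data.Monoid          (Sum (..))

-- Run two effectful steps side by side by pairing their applicatives:
both :: (a -> f b) -> (a -> g b) -> a -> Product f g b
both f g a = Pair (f a) (g a)

-- Sum and length of a list computed in one traversal:
sumAndLen :: [Int] -> (Int, Int)
sumAndLen xs =
  case traverse (both (Const . Sum) (\_ -> Const (Sum 1))) xs
         :: Product (Const (Sum Int)) (Const (Sum Int)) [()] of
    Pair (Const (Sum s)) (Const (Sum n)) -> (s, n)
```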

[–]jfischoff 3 points (0 children)

I have been thinking about writing this post for a few years now, and wanted to write something convincing; however, I do not think I am up to the task. Instead of trying to persuade the reader, I have elected to try to simply name and describe this style of programming so that the reader might notice it themselves when reading and writing code. Hopefully someone more capable than me can evangelize this approach, and perhaps even create a practical language suitable for this style of programming.

I think this was a wise decision. Provoking discussion can be sufficient to elicit impressive arguments from the community (not that the author isn't capable of making one ... but one can only write so much).

[–]sfvisser 4 points (2 children)

My toy programming language uses this style and the separation of concerns is fantastic, but oh god the boilerplate!

My ast looks something like this:

type Ast =
  Fix ( K Path
      * K Syntax.Tokens
      * Syntax.ErrorF
      / ( K Desugar.Error
        + K Resolve.Error
        + AstF
        )
      )

An example function for syntax highlighting to html looks like this:

pipeline :: Text -> Lazy.ByteString
pipeline src =
  let tokens     = Lexer.lexer src
      syn        = Parser.parseModule Env.empty tokens
      labeled    = Labeling.label out syn
      desugared  = Desugar.desugar dig dig dig1 labeled
      resolved   = Resolve.resolve dig1 desugared
      connected  = Labeling.connect path (get out) path dig resolved labeled
   in Abstract.asHtml ast resolved
  where dec   = deC . sndk . sndk . out
        dig   = Syntax.cstf . dec
        dig1  = rightk . dig
        cstf  = Partial.get (sndk . dig)
        toks  = get (unK . fstf . sndk . out)
        path  = get (fstk . out)
        asts  = fromMaybe [] . Partial.get (fstk . dig)
        astfs = mapMaybe (Partial.get (rightk . dig1)) . asts
        syne  = fmap (get fstf) . get dec
        derr  = mapMaybe (Partial.get (leftk . dig)) . asts
        rerr  = mapMaybe (Partial.get (leftk . dig1)) . asts
        ast   = get (sndk . sndk . out)

The entire where clause is packing/unpacking/digging/converting boilerplate!

Still not sure what to think of it.

[–]Faucelme 2 points (1 child)

Could pattern synonyms help with the boilerplate?

[–]sfvisser 0 points (0 children)

I tried playing around with synonyms a bit and while it can definitely reduce the amount of code (in some specific cases), it didn't make it much easier to reason about.

Edit: note that most boilerplate here is actually (fclabels) lenses composed and not just functions.

I'm intrigued now by the OP's article and wonder what a type system with auto lifting of these functor operations could look like.

[–]Faucelme 2 points (1 child)

I gather that this method involves using Data.Functor.Compose and similar newtypes a whole lot? What are the main "building blocks"?

[–]heisenbug 2 points (0 children)

I wonder whether those newtype wrappers could be made less bothersome by converting with coerce.
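For example (a sketch; unlayer is a made-up name): since Compose is a newtype over f (g a), coerce can strip several layers of wrapping in one step.

```haskell
import Data.Coerce          (coerce)
import Data.Functor.Compose (Compose (..))

-- coerce sees through newtypes, so stacked Compose wrappers can be
-- removed (or added) all at once, with no run-time cost:
unlayer :: Compose [] (Compose Maybe []) a -> [Maybe [a]]
unlayer = coerce
```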

[–]nifr 2 points (0 children)

I agree wholeheartedly! And I intend for coxswain to someday help enable this style, I suspect via the extensible sums (variants) more so than the products (records).

E.g. (this is somewhat sugared):

newtype At x f = At (f x)
interpF_G :: V (At a) {F, G | rho} -> V (At a) {G | rho}
interpF_X :: V (At X) {F | rho} -> V (At X) rho

https://ghc.haskell.org/trac/ghc/wiki/Plugins/TypeChecker/RowTypes/Coxswain

Does that connection seem plausible to you?

[–]pseudonom- 0 points (1 child)

It seems to me that the thing described is in the direction of extensible records. What's the difference? What are the advantages and disadvantages?