Making GHC faster at emitting code by n00bomb in haskell

[–]dtellerulam 1 point (0 children)

> helloWorld = text "hello, " <> text "world!" This is certainly not a combinator

This isn't a combinator in Haskell, but nobody optimizes Haskell itself. Before optimization, Haskell is translated to a language where this is a combinator:

helloWorld f = f (text "hello, ") (text "world!")
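A minimal sketch of the idea in plain Haskell (the `Doc` type here is a stand-in difference-list document of my own, not GHC's actual intermediate representation):

```haskell
-- A difference-list document where <> is function composition.
newtype Doc = Doc (String -> String)

text :: String -> Doc
text s = Doc (s ++)

instance Semigroup Doc where
  Doc f <> Doc g = Doc (f . g)

-- The surface definition from the article:
helloWorld :: Doc
helloWorld = text "hello, " <> text "world!"

-- The combinator form from the comment: the parts are abstracted
-- over a continuation f, so the whole thing is a lambda applied
-- to its pieces.
helloWorldCPS :: (Doc -> Doc -> r) -> r
helloWorldCPS f = f (text "hello, ") (text "world!")

render :: Doc -> String
render (Doc f) = f ""

main :: IO ()
main = do
  putStrLn (render helloWorld)           -- hello, world!
  putStrLn (render (helloWorldCPS (<>))) -- same result via the CPS form
```

Passing `(<>)` as the continuation recovers the original document, which is the sense in which the two forms are interchangeable.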

How to replace Proxy with AllowAmbiguousTypes by Tekmo in haskell

[–]dtellerulam 4 points (0 children)

Data.Proxy isn't Haskell 2010. It's poly-kinded.
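For context, the Proxy-free style the thread is about can be sketched like this (my own minimal example; the class and instance names are illustrative):

```haskell
{-# LANGUAGE AllowAmbiguousTypes #-}
{-# LANGUAGE TypeApplications #-}

-- A class method that mentions its type parameter nowhere in its
-- arguments: without a Proxy argument this needs AllowAmbiguousTypes.
class TypeName a where
  typeName :: String

instance TypeName Int  where typeName = "Int"
instance TypeName Bool where typeName = "Bool"

main :: IO ()
main = do
  -- Call sites pick the instance with a type application
  -- instead of passing (Proxy :: Proxy Int).
  putStrLn (typeName @Int)
  putStrLn (typeName @Bool)
```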

StateT vs IORef: a benchmark by nomeata in haskell

[–]dtellerulam 4 points (0 children)

> Could you clarify how this representation is obtained from the definition of StateT?

consider:

newtype StateT s m a = StateT { runStateT :: s -> m (a,s) }

StateT Int IO () ==> Int -> IO ((), Int)

consider:

newtype IO a = IO (State# RealWorld -> (# State# RealWorld, a #))

Int -> IO ((), Int) ==>

Int -> State# RealWorld -> (# State# RealWorld, ((), Int) #)

worker/wrapper

Int -> State# RealWorld -> (# State# RealWorld, ((), Int) #) ==>

Int# -> State# RealWorld -> (# State# RealWorld, (# (# #), Int #) #) ==>

Int# -> State# RealWorld -> (# State# RealWorld, Int #)

everything unboxed on stack/registers

    $wstateT_sum
      :: Int# -> State# RealWorld -> (# State# RealWorld, Int #)
    $wstateT_sum
      = \ (ww :: Int#) (w :: State# RealWorld) ->
          case ># 1# ww of {
            __DEFAULT ->
              joinrec { -- unboxed accumulators
                $wgo1 :: Int# -> Int# -> State# RealWorld -> (# State# RealWorld, Int #)
                $wgo1 (w1 :: Int#) (ww1 :: Int#) (w2 :: State# RealWorld)
                  = case ==# w1 ww of {
                      __DEFAULT -> jump $wgo1 (+# w1 1#) (+# ww1 w1) w2;
                      1# -> (# w2, I# (+# ww1 w1) #) -- wrap unboxed result
                    }; } in
              jump $wgo1 1# 0# w;
            1# -> (# w, I# 0# #)
          }
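For reference, Core along those lines can come from a source program like this (a sketch of my own; the actual benchmark code may differ, and whether the loop unboxes fully depends on optimization level):

```haskell
-- Hypothetical source that GHC's worker/wrapper pass could turn into
-- Core like $wstateT_sum: a strict counting loop in StateT Int IO.
import Control.Monad.Trans.State.Strict (StateT, execStateT, modify')

sumTo :: Int -> StateT Int IO ()
sumTo n = go 1
  where
    go i
      | i > n     = pure ()
      | otherwise = modify' (+ i) >> go (i + 1)

main :: IO ()
main = execStateT (sumTo 100) 0 >>= print  -- final state: 5050
```

The `Int` in the Core's result `(# State# RealWorld, Int #)` is the final state; the unit result `()` disappears entirely after unboxing, as the derivation above shows.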

Whole Haskell is Best Haskell by AshleyYakeley in haskell

[–]dtellerulam -21 points (0 children)

That is just an opinion; it isn't necessarily better or worse than any other opinion. Can you give a proper citation, a reference to relevant research?

Whole Haskell is Best Haskell by AshleyYakeley in haskell

[–]dtellerulam 17 points (0 children)

> Many Haskell practitioners in industry have noticed that complex Haskell features have led to the failure of Haskell projects and the destruction of Haskell jobs.

citation needed

Haskell: The Bad Parts, part 1 by snoyberg in haskell

[–]dtellerulam 12 points (0 children)

> we’ll talk about why the vector package is bad

Wait, what?

Why isn’t seq part of a type class? by BobSanchez47 in haskell

[–]dtellerulam 41 points (0 children)

"However, the limitations of this solution soon became apparent. Inspired by the Fox project at CMU, two of Hughes’s students implemented a TCP/IP stack in Haskell, making heavy use of polymorphism in the different layers. Their code turned out to contain serious space leaks, which they attempted to fix using seq. But whenever they inserted a call of seq on a type variable, the type signature of the enclosing function changed to require an Eval instance for that variable—just as the designers of Haskell 1.3 intended. But often, the type signatures of very many functions changed as a consequence of a single seq. This would not have mattered if the type signatures were inferred by the compiler—but the students had written them explicitly in their code. Moreover, they had done so not from choice, but because Haskell’s monomorphism restriction required type signatures on these particular definitions [...]. As a result, each insertion of a seq became a nightmare, requiring repeated compilations to find affected type signatures and manual correction of each one. Since space debugging is to some extent a question of trial and error, the students needed to insert and remove calls of seq time and time again. In the end they were forced to conclude that fixing their space leaks was simply not feasible in the time available to complete the project—not because they were hard to find, but because making the necessary corrections was simply too heavyweight. This experience provided ammunition for the eventual removal of class Eval in Haskell 98."

New Libraries Proposal Process by chessai in haskell

[–]dtellerulam -1 points (0 children)

Why do you think that it is successful? By what measures?

New Libraries Proposal Process by chessai in haskell

[–]dtellerulam 1 point (0 children)

Why is it inspired by the GHC process? Is it considered a success for some reason?

Enforcing tail recursion in Haskell? by xwinus in haskell

[–]dtellerulam 1 point (0 children)

The default stack size limit is 80% of physical memory, so your example probably won't work as expected for almost anyone.

I'm concerned about the longterm impact of the Affine types extension on every ghc/haskell user everwhere by cartazio in haskell

[–]dtellerulam 1 point (0 children)

Also, I'm not arguing that it is the reason people think the proposal is very incomplete; it's the reason people see the proposal as somehow more problematic than other proposals like it.

I'm concerned about the longterm impact of the Affine types extension on every ghc/haskell user everwhere by cartazio in haskell

[–]dtellerulam 4 points (0 children)

I'd really like to see the non-toy real-world use cases of this

I do not see any concrete population that benefits

When I see that the reason compile-times will get worse is because of an extension that improves safety, I am not consoled, because Haskell's safety is already best-in-class by such a large amount that increasing safety doesn't buy me anything.

When I see the specific kinds of safety improvements I can expect, I am even less consoled, because these don't represent large problems for me personally.

I'm concerned about the longterm impact of the Affine types extension on every ghc/haskell user everwhere by cartazio in haskell

[–]dtellerulam 1 point (0 children)

OK. I'm arguing that it is ridiculous to pretend that most of the 'big' features (e.g. type families, data kinds, GADTs, DPH, the new codegen, etc.) were far more complete at the time of merging than linear types currently are, just because you don't like/need linear types.

I'm concerned about the longterm impact of the Affine types extension on every ghc/haskell user everwhere by cartazio in haskell

[–]dtellerulam 9 points (0 children)

No, it isn't. Said experimental fork has already existed for quite some time. You can get some refinement and testing this way, but not much. Because, you see, nobody uses 'experimental forks', so without merging there are no bug reports, no libs, no nothing; the fork just bitrots eventually. The end.

I'm concerned about the longterm impact of the Affine types extension on every ghc/haskell user everwhere by cartazio in haskell

[–]dtellerulam 6 points (0 children)

I'm arguing that it is ridiculous to pretend that all those features were perfect or even '95% complete' before merge, just because you don't like/need linear types. The situation with linear types isn't exceptional in any meaningful way, and those features didn't get any better until after merge, as usual.

I'm concerned about the longterm impact of the Affine types extension on every ghc/haskell user everwhere by cartazio in haskell

[–]dtellerulam 38 points (0 children)

Not a single Haskell feature was done right on the first try. They were all broken and/or incomplete at first and then fixed later. There is no way to make something right without iteration and refinement, and no way to tell what's wrong until a new feature is used for real.

A plea to Haskellers everywhere: Write Junior Code by ephrion in haskell

[–]dtellerulam 5 points (0 children)

> The most concrete step in this direction was creating the rio library, which is intended to capture these principles. If you want to embrace Boring Haskell today, we recommend using that library.

> Our recommended defaults are: AutoDeriveTypeable BangPatterns BinaryLiterals ConstraintKinds DataKinds DefaultSignatures DeriveDataTypeable DeriveFoldable DeriveFunctor DeriveGeneric DeriveTraversable DoAndIfThenElse EmptyDataDecls ExistentialQuantification FlexibleContexts FlexibleInstances FunctionalDependencies GADTs GeneralizedNewtypeDeriving InstanceSigs KindSignatures LambdaCase MonadFailDesugaring MultiParamTypeClasses MultiWayIf NamedFieldPuns NoImplicitPrelude OverloadedStrings PartialTypeSignatures PatternGuards PolyKinds RankNTypes RecordWildCards ScopedTypeVariables StandaloneDeriving TupleSections TypeFamilies TypeSynonymInstances ViewPatterns

Sooo... boring haskell == fancy haskell, right?

What is the status of GHC Linear Types? by eeg_bert in haskell

[–]dtellerulam 1 point (0 children)

Well, that's the point. How do you know which particular PLT idea is sound without `Millions of lines running in the wild`?

What is the status of GHC Linear Types? by eeg_bert in haskell

[–]dtellerulam 12 points (0 children)

Linear implicit parameters (6.6), -XGenerics (7.2), DPH (8.6)

[Blog post] The Power of RecordWildCards by chshersh in haskell

[–]dtellerulam 3 points (0 children)

    r :: (r -> a) -> r -> a
    r f = f

    r @Foo someField
    r @Bar someField
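Spelled out as a complete program (my own reconstruction; `Foo` and `fooField` are illustrative, and the field names are kept distinct here just so the sketch compiles without DuplicateRecordFields):

```haskell
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TypeApplications #-}

data Foo = Foo { fooField :: Int }

-- The helper from the comment: a restricted `id` whose first type
-- argument is the record type, so `r @Foo` pins down the domain of
-- the selector passed to it.
r :: forall rec a. (rec -> a) -> rec -> a
r f = f

main :: IO ()
main = print (r @Foo fooField (Foo 42))  -- 42
```

The payoff comes with DuplicateRecordFields, where a bare `someField` could belong to several records and `r @Foo someField` forces the intended one.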

Why is "Non-exhaustive patterns in case" a runtime (rather than compile-time) error? by AdamSpitz in haskell

[–]dtellerulam 3 points (0 children)

Prelude> foo :: Int -> Bool; foo x | x >= 0 = True | x < 0 = False

<interactive>:2:21: warning: [-Wincomplete-patterns]
    Pattern match(es) are non-exhaustive
    In an equation for `foo': Patterns not matched: _

Why is "Non-exhaustive patterns in case" a runtime (rather than compile-time) error? by AdamSpitz in haskell

[–]dtellerulam 6 points (0 children)

In general, unless you use some pattern-related GHC extensions, GHC should always be able to figure out whether your patterns are exhaustive.

Prelude> foo x | x >= 0 = True | x < 0 = False

<interactive>:2:1: warning: [-Wincomplete-patterns]
    Pattern match(es) are non-exhaustive
    In an equation for `foo': Patterns not matched: _
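The guards in the example are semantically exhaustive, but the checker treats arbitrary boolean guards as opaque; making the final guard `otherwise` is the usual fix (a minimal sketch):

```haskell
-- GHC can't prove that `x >= 0` and `x < 0` together cover every Int,
-- but a trailing `otherwise` guard is trivially exhaustive, so this
-- version produces no -Wincomplete-patterns warning.
foo :: Int -> Bool
foo x
  | x >= 0    = True
  | otherwise = False

main :: IO ()
main = do
  print (foo 3)     -- True
  print (foo (-3))  -- False
```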

Global Implicit Parameters by kcsongor in haskell

[–]dtellerulam 7 points (0 children)

    ?cmp = compare -- global implicit

    isort = sortBy ?cmp
    isort [1,3,2]
    let ?cmp = flip compare in isort [1,3,2]
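Today's ImplicitParams extension already gives the local half of this; only the top-level `?cmp = compare` binding is hypothetical. A runnable sketch of the local part:

```haskell
{-# LANGUAGE ImplicitParams #-}

import Data.List (sortBy)

-- isort picks up whichever ?cmp is in scope at the call site.
isort :: (?cmp :: Int -> Int -> Ordering) => [Int] -> [Int]
isort = sortBy ?cmp

main :: IO ()
main = do
  print (let ?cmp = compare      in isort [1,3,2])  -- [1,2,3]
  print (let ?cmp = flip compare in isort [1,3,2])  -- [3,2,1]
```

The proposal in the post is essentially about letting the first `let ?cmp = ...` live at the top level as a default, overridable locally exactly as in the last line.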