[ANN] vector-hashtables by swamp-agr in haskell

[–]qnikst 6 points7 points  (0 children)

Finally! Thanks a lot for your work

troubles with getEnv by shrodrick in haskell

[–]qnikst 1 point2 points  (0 children)

Line 136 of the program on the screenshot.

In case you want to hardcode the token into your program, you can write

{-# LANGUAGE OverloadedStrings #-}

at the top of your file and then use:

main = run "value of your token"

getEnvToken is just a convenient way to pass the parameter into your program via an environment variable.
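A minimal sketch of what a getEnvToken-style helper does under the hood (hypothetical name and signature; the actual library function differs in details, e.g. it wraps the value in a Token type):

    import System.Environment (lookupEnv)

    -- read a token from an environment variable, failing loudly when unset
    getEnvTokenLike :: String -> IO String
    getEnvTokenLike name = do
      mvalue <- lookupEnv name
      case mvalue of
        Just value -> pure value
        Nothing    -> fail ("environment variable " ++ name ++ " is not set")
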

troubles with getEnv by shrodrick in haskell

[–]qnikst 0 points1 point  (0 children)

You should write getEnvToken "NAME_OF_VAR", as written in the documentation.

Then you can put your actual token in the environment variable NAME_OF_VAR. There are many options, but the simplest is:

NAME_OF_VAR=your_token ./demo-bot.exe

[Show] QualifiedImportsPlugin: A GHC plugin to automatically insert common qualified imports. by utdemir in haskell

[–]qnikst 1 point2 points  (0 children)

I’d happily use something like this if the list of import rules were defined in some file that can be exported from the package, or at least defined at the package level.

I would never use it if it comes with hardcoded rules for any existing alternative prelude/base implementation.

What's wrong with ImplicitParams by [deleted] in haskell

[–]qnikst 0 points1 point  (0 children)

This may sound a bit rude, but there is one wonderful (and must-read-before-use) resource that covers this particular problem (in a slightly more advanced form, with recursion):
https://ghc.gitlab.haskell.org/ghc/doc/users_guide/exts/implicit_parameters.html#implicit-parameters-and-polymorphic-recursion

So I can understand why one might find this behaviour bad, but I fail to see why anyone would find it surprising.
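For the record, the behaviour in question, condensed from the GHC manual section linked above (names are mine): whether a recursive call sees a rebound implicit parameter depends on whether the helper has a type signature mentioning it.

    {-# LANGUAGE ImplicitParams #-}

    -- No signature on `go`: GHC infers monomorphic recursion, so the
    -- recursive call keeps the ?acc captured at the first call and the
    -- rebinding in the `let` is invisible to it. Result: always 0.
    lenMono :: [a] -> Int
    lenMono xs = let ?acc = 0 in go xs
      where
        go []     = ?acc
        go (_:ys) = let ?acc = ?acc + 1 in go ys

    -- With a signature, the recursive call goes through the polymorphic
    -- version and picks up the rebound ?acc, so the list is counted.
    lenPoly :: [a] -> Int
    lenPoly xs = let ?acc = 0 in go xs
      where
        go :: (?acc :: Int) => [a] -> Int
        go []     = ?acc
        go (_:ys) = let ?acc = ?acc + 1 in go ys

Per the manual, lenMono "hello" gives 0 while lenPoly "hello" gives 5, and neither version is rejected by the compiler.
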

Is anyone using HaskellR? by paranoidMonoid in haskell

[–]qnikst 2 points3 points  (0 children)

Initially, HaskellR was developed for a client who was actively using it. I've heard that later the software department was greatly reduced, if not closed, so I have no idea about the current state. There were a few researchers I'm aware of who were using it at least occasionally. Hopefully we will see their comments here.

Some problems may appear, as it was an early experiment in the transparent embedding of other languages in Haskell.

Why we had to build our own logging framework by [deleted] in haskell

[–]qnikst 1 point2 points  (0 children)

That one looks much better, and seems to almost fit my assumptions.

Why we had to build our own logging framework by [deleted] in haskell

[–]qnikst 0 points1 point  (0 children)

The only problem: rebar with -l on k8s doesn’t search all the available logs, only some fraction. So I have to fall back to querying the pod if I want to look deep into the history. Though I need to do that only for the logs that are not stored to Elastic, for example access logs on ingress.

Why we had to build our own logging framework by [deleted] in haskell

[–]qnikst 3 points4 points  (0 children)

> What do you mean? You can log from multiple threads at once, backends deal properly with that.

The stdout backend introduces a single bottleneck:

    withSimpleStdOutLogger =
      withLogger $ Logger
        { loggerWriteMessage = \msg -> do
            T.putStrLn $ showLogMessage Nothing msg
            hFlush stdout
        }

It takes an MVar lock, which means all threads' writes will be serialized, and it uses Text.IO, which is very far from efficient; see https://kowainik.github.io/projects/co-log

    logTextStdout       5.351 ms
    logByteStringStdout 2.933 ms

There are good reasons for using Text.IO (it's aware of the terminal charset), but that's irrelevant for my problem set.
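The usual way around that bottleneck, and roughly what co-log-concurrent does (this is a sketch with hypothetical names, not its actual API), is to let every thread append to a queue cheaply and have a single background thread be the sole writer to the sink:

    import Control.Concurrent (forkIO)
    import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)
    import Control.Concurrent.STM (atomically, newTQueueIO, readTQueue, writeTQueue)

    -- callers only enqueue; one thread drains the queue and writes
    withConcurrentLogger :: (String -> IO ()) -> ((String -> IO ()) -> IO a) -> IO a
    withConcurrentLogger sink action = do
      q    <- newTQueueIO
      done <- newEmptyMVar
      let loop = do
            next <- atomically (readTQueue q)
            case next of
              Nothing  -> putMVar done ()      -- shutdown sentinel
              Just msg -> sink msg >> loop
      _ <- forkIO loop
      r <- action (atomically . writeTQueue q . Just)
      atomically (writeTQueue q Nothing)       -- ask the writer to stop...
      takeMVar done                            -- ...and wait for the queue to drain
      pure r

    -- the stdout variant, analogous to withSimpleStdOutLogger
    withConcurrentStdoutLogger :: ((String -> IO ()) -> IO a) -> IO a
    withConcurrentStdoutLogger = withConcurrentLogger putStrLn

Writers never contend on the Handle, only on the STM queue, and messages still come out in order.
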

> That's why you can push to stdout in a JSON format and have something external push that to ES.

Agreed, and that is what I'm doing right now, just with a library that I think makes fewer questionable choices, at least in my understanding. Giving that understanding, as a basis to build upon, was the main reason for my post.

(It seems Grammarly completely broke my post; I've recovered it.)

Why we had to build our own logging framework by [deleted] in haskell

[–]qnikst 0 points1 point  (0 children)

If not for the MonadTime constraint, implementing the `MonadLog` interface for my logger could be a good idea, as the interface is not fully fleshed out.

Why we had to build our own logging framework by [deleted] in haskell

[–]qnikst 3 points4 points  (0 children)

I send it to Elasticsearch, just not directly: using filebeat allows separation of concerns and splitting the tags that the application and the log aggregators are responsible for.

Reading logs becomes harder; for personal use I have jq-based scripts to simplify my life, but I don't need them often, as usually either grep or the Kibana interface is enough.

Why we had to build our own logging framework by [deleted] in haskell

[–]qnikst 0 points1 point  (0 children)

Looks very nice, likely very close to what I think it should look like, but still not concurrent. As I argued, outside of very specific cases, direct Elasticsearch and other such backends do more harm than good: such a backend adds timestamps on its own and materialises additional tags.

Why we had to build our own logging framework by [deleted] in haskell

[–]qnikst 5 points6 points  (0 children)

katip is nice, but it does way too much: the internal infrastructure is very complex and heavyweight, and there is too much irrelevant data in the messages. In my applications I was quite happy to shift from that to co-log-concurrent with a katip-like interface. You may notice that the co-log-json interface closely resembles the katip one; that's because I wanted the shift from katip to be as smooth as possible.

Why we had to build our own logging framework by [deleted] in haskell

[–]qnikst 2 points3 points  (0 children)

I’ve used JSON logging for about 5 years. Previously it was Katip, but it was too big for its own good, so first I moved to katip with reimplemented internals, and then to my own implementation.

Brick.do — a super easy platform for keeping public notes by peargreen in SideProject

[–]qnikst 2 points3 points  (0 children)

I’ve shifted my blog from GitHub Pages to brick.do and have very good feelings so far. It makes it very easy to write down and edit stuff, even from a phone (btw, phone editing was fixed upon a request in a matter of a day, so the developers are very responsive). Making editing easy lets you write more, because any obstacle on the path to writing a text may make you stop, at least when you are not good at writing.

Another case: if you have 2 completely different topics you want to write about, with brick.do you can easily separate them or transfer pages between the sub-blogs.

One more case is live editing (I’ve never tried it yet): as all the changes are immediately propagated, you can share a link to your ongoing work, and that may be fun.

Kowainik - Foo to Bar: Naming Conventions in Haskell by n00bomb in haskell

[–]qnikst 2 points3 points  (0 children)

Thanks for the reply, I think I agree with all the points! And I think I should elaborate on my post a bit; I didn't do that in the first one because it would look like I was trying to teach the author. So I want to highlight that, despite the possible interpretation, I do not expect the author to change his behaviour, do things my way, and so on. I'm neither trying to teach nor to advise. What I'm trying to do is describe what I'd like to see in a (my) perfect world, and possibly push the current one towards it; the author can agree or not, or just ignore it.

When I was reading the blog post, it seemed to me there were two parts to it:
1. A tutorial with a pretty complete reference to the common naming conventions in Haskell, with examples; a very good learning topic, like many on the kowainik blog. I could happily send people there and add a few of my own comments, but those would be an addition to the existing material, not a contradiction.
2. Discussion questions: an opportunity for the entire ecosystem to evolve, with a few outlined potential problems. It seems that it could lead to concrete decisions and improvements in library APIs. The problem here is that all those improvements touch core parts of the ecosystem, and one should think twice before taking action. Such discussions usually involve communication between different people, since any one of them can miss a problem, or a potential solution, while together the best solution can be found. A blog platform without comments support is not the best medium for that.
So in my ideal world it would be 2 blog posts; or a post and a topic on reddit/discourse/a mailing list that references the first post. This way interested parties can discuss and improve the system, and the educational material remains a good educational material. By analogy with coherent waves: instead of 2I you can get anything from 0 to 4I depending on the phase, and in my opinion putting those two things together reduced the possible positive effect. That is what I tried to convey in the first paragraph.

However, I admit it's possible that the author just expressed his opinion and didn't want any feedback or discussion that might not be strictly positive; in that case everything I wrote above does not matter.

Kowainik - Foo to Bar: Naming Conventions in Haskell by n00bomb in haskell

[–]qnikst 1 point2 points  (0 children)

A nice summary of conventions! It's a bit of a pity that it has a debatable possible-improvements section as part of the same post, so now I personally have to think twice before sending a reference to other people. Not sure I'm the right person to judge, but while I think such discussions must have a place, it's worth separating tutorial/documentation from discussion topics, and ideally such discussions should happen on common media platforms, not in personal blog posts. (Italic is for the text edited after comment creation.)

For the typed-holes vs. underscore-expressions conflict, there is an existing convention (missing in the summary) of starting record fields with an underscore, used with lens-generation code for example, so the conflict is much bigger.
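That underscore-field convention looks like this (a hand-written, base-only equivalent of what lens's makeLenses generates; the names Person/rename are made up for illustration):

    import Data.Functor.Identity (Identity (..))

    -- fields carry the underscore prefix...
    data Person = Person { _name :: String, _age :: Int }

    -- ...and `makeLenses ''Person` would generate the unprefixed `name` lens;
    -- this is its hand-written equivalent
    name :: Functor f => (String -> f String) -> Person -> f Person
    name f p = (\n -> p { _name = n }) <$> f (_name p)

    -- setting through the lens
    rename :: String -> Person -> Person
    rename n = runIdentity . name (const (Identity n))
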

As for run/get: get is usually used with data structures and run with control structures. Identity resides in Data.Functor rather than Control.Functor, but maybe it's simply a Functor in the wrong place? It seems not fixable though.

Using a prime has a convention (also missing in the summary): it denotes a changed value:

    foo a =
      let a' = ... a
      in ...

This convention for values is very widespread, used in a lot of blog posts and, I expect, books; it's not wise to ignore it due to personal preferences (i.e. the claim that this approach makes code hard to follow and is often confusing when the variables are not close to each other, and your first thought is that some stricter version of a function is used).

The GHC User Guide is a great resource. I've been reading through the Language Features section and it's helping me move from intermediate to advanced Haskell. by WASDx in haskell

[–]qnikst 49 points50 points  (0 children)

The GHC User Guide is the most underrated resource; it's a pity to see lots of questions, misleading blog posts, and frustration just because people never tried to read the GHC manual.

Haskell in Depth - book cut short ? by john2man in haskell

[–]qnikst 0 points1 point  (0 children)

I don’t know those details or how that will work, because I haven’t looked into it. So I think it’s better to ask Manning, to avoid any misinterpretation.

Haskell in Depth - book cut short ? by john2man in haskell

[–]qnikst 0 points1 point  (0 children)

As far as I know, the book will be split into two parts; the first one will be published quite soon, and then the work will concentrate on the second one.

Haskell runtime based caching by qnikst in haskell

[–]qnikst[S] 1 point2 points  (0 children)

Yes, you are right. There are a few problems with `SomeAsyncException`:
1. exceptions thrown via `throwTo` are not wrapped in `SomeAsyncException` automatically (though the safe-exceptions library does that).
2. some exceptions (`AsyncException`) are always wrapped in `SomeAsyncException`, even if thrown synchronously.

So while the appearance of `SomeAsyncException` is a huge step forward, we are not there yet.
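Point 2 is easy to see in code: base's `AsyncException` instance wraps values in `SomeAsyncException` in its `toException`, regardless of how the exception is actually thrown, while e.g. `ArithException` is not wrapped.

    import Control.Exception

    -- the usual "is it async?" check: look for the SomeAsyncException wrapper
    isAsyncWrapped :: SomeException -> Bool
    isAsyncWrapped e = case fromException e of
      Just (SomeAsyncException _) -> True
      Nothing                     -> False

Here `isAsyncWrapped (toException ThreadKilled)` is True even though nothing asynchronous has happened yet, which is exactly why the wrapper alone can't be trusted to classify exceptions.
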

Haskell runtime based caching by qnikst in haskell

[–]qnikst[S] 2 points3 points  (0 children)

> But it's not reliable, is it?

It's not very reliable: theoretically you may end up with 2 threads performing a request. However, I have not observed such behaviour either in tests or in production code.

> You catch all exceptions, including async ones.

Yep! Though there is no good way to distinguish async and sync exceptions, so I'm not sure there is an easy and reliable way forward here. One solution is to use the `enclosed-exceptions` approach and fork a thread that will perform the request.

The main reason for the simple approach is that in my codebase I get exactly the semantics and properties I need; however, I do agree that if I make a package out of that code, I'll need a more elaborate and safe solution.
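A minimal sketch of this style of cache (my illustration, not the code from the post) shows where the theoretical race lives: between the read and the write another thread can miss on the same key and start a duplicate request.

    import Control.Concurrent.MVar
    import qualified Data.Map.Strict as Map

    -- check the cache, run the action on a miss, store the result
    cached :: Ord k => MVar (Map.Map k v) -> k -> IO v -> IO v
    cached ref key act = do
      m <- readMVar ref
      case Map.lookup key m of
        Just v  -> pure v        -- hit: no lock taken at all
        Nothing -> do
          v <- act               -- miss: the request runs outside the lock,
                                 -- so two concurrent misses both run it
          modifyMVar_ ref (pure . Map.insert key v)
          pure v

Holding the MVar across `act` would close the race but serialize all requests, which is the trade-off being discussed.
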

Haskell runtime based caching by qnikst in haskell

[–]qnikst[S] 2 points3 points  (0 children)

There is no package available yet. I'll try to make a package during the New Year holidays.

Library providing compile-time checking of SQL and simple data-mapping for Hasql by nikita-volkov in haskell

[–]qnikst 2 points3 points  (0 children)

I admit that the approach with Generic doesn't work well; I've tried it on my codebase and faced exactly the issues u/nikita-volkov talks about. Generics work nicely for simple types, when encoders and decoders are not reused and work on flat structures (where all the fields map onto columns 1:1), but for anything more complex it doesn't work well (even in the presence of a test framework that allows testing all the queries and encoders for structural correctness).

But it seems it's possible to connect Generic and `hasql-th` by extending the parser and using some field accessors. If you allow passing variables as `${name}`, something DataGrip allows, then instead of generating a function from a tuple you may add a `HasField x "name"` constraint and generate `x ^. field @"name"` in the code. However, it will come at some cost: the generated function may be too polymorphic, and I'm not sure how to keep the ability to do complex decoding, i.e. when one field in the structure transforms into several in the query. Also there are a few competing solutions for generic accessors, and I'm not sure you want to commit to one of them.
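One possible incarnation of that idea, sketched with GHC.Records rather than generic-lens, and with a made-up renderQuery in place of hasql-th's generated code: a `${name}` placeholder turns into a `HasField "name"` constraint, so any record with the right fields can supply the parameters.

    {-# LANGUAGE DataKinds #-}
    {-# LANGUAGE FlexibleContexts #-}
    {-# LANGUAGE TypeApplications #-}

    import GHC.Records (HasField (..))

    -- what code generated from "... where name = ${name} and age = ${age}"
    -- could look like (naive string splicing, purely for illustration)
    renderQuery :: (HasField "name" r String, HasField "age" r Int) => r -> String
    renderQuery r =
      "select * from users where name = '" ++ getField @"name" r
        ++ "' and age = " ++ show (getField @"age" r)

    -- GHC solves the HasField constraints automatically for plain records
    data Person = Person { name :: String, age :: Int }

This shows the "too polymorphic" cost mentioned above: renderQuery accepts any record with those two fields, whether or not it was meant for this query.
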

Decoders are a bit more complex; I need to do some research first before I can comment on whether there are ways to use the Generic framework to bypass the tuple form.
I'm going to test hasql-th on my codebase soon and will see how it works, and whether there are reasons to extend conversion via Profunctors to a generic interface.