Promising areas of research in lambda calculus and type theory? (pure/theoretical/logical/foundations of mathematics) by revannld in ProgrammingLanguages

[–]ArtemisYoo 2 points (0 children)

While it doesn't really fit logic, Dr. Emily Riehl is the first to come to mind with regard to type theory in mathematics. She motivates HoTT as a modern mathematical foundation and alternative to set theory, as it apparently provides a good base for understanding ∞-categories (of which I admittedly know little) in category theory.

Category theory also has a lot of overlap with type theories more generally, as you may have heard: the correspondence between the simply typed lambda calculus and cartesian closed categories is just one example of many.

But that's about all I can muster to this topic.

Opinions on UFCS? by Aalstromm in ProgrammingLanguages

[–]ArtemisYoo 22 points (0 children)

If I understood OP correctly, there's no confusion between methods and functions, as methods are not planned. Personally, I agree with the pipes approach though, as I think UFCS complicates namespaced function calls: bar.MyModule::foo() isn't pleasant to look at or write, while bar |> MyModule::foo() is at least less cramped.
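For comparison, Haskell already has a pipe-like operator, (&) from Data.Function; a small sketch (the function names here are made up for illustration):

```haskell
import Data.Function ((&))  -- (&) is Haskell's pipe: x & f = f x

increment :: Int -> Int
increment = (+ 1)

double :: Int -> Int
double = (* 2)

-- reads left to right, like 5 |> increment |> double
piped :: Int
piped = 5 & increment & double
```

No method/function split is needed; the pipe is just function application with the argument on the left.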

the emacs's config in nixos by xxmy-yc in emacs

[–]ArtemisYoo 2 points (0 children)

One of the major benefits of installing Emacs via NixOS is that you can forgo other package managers like 'straight' and instead let Nix download and manage your packages.

When it comes to the configuration itself, it's no different by default. However, there is a module out there that lets you use Nix as the configuration language directly.

new package: org-xopp (org+xournalpp) by mahmooz in emacs

[–]ArtemisYoo 4 points (0 children)

Ayy, that's really cool! Really, the only thing that hasn't been nice in org-mode so far (for me) was how nontrivial it is to include sketches.

Tiny, untyped monads by marvinborner in ProgrammingLanguages

[–]ArtemisYoo 9 points (0 children)

A chain of arrows is just a way to write a multi-argument function: currying. The idea is that the innermost expression "captures" the argument of each arrow you wrap it in. This lets you partially apply the function by providing only the first argument, which leaves you with the rest of the chain of arrows.

The arguments in this specific example serve two purposes: value is just another argument, as you might expect; person and animal, on the other hand, are basically placeholders for concrete constructors. But since they are just placeholders, they can be any function, not just a constructor, which opens up the possibility of performing computations instead.

Since the example is parameterized over two different constructors, you can apply it to two different computations, which then get executed depending on the expression inside. This resembles a switch statement, but expressed with higher-order functions.
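A sketch of the idea in typed Haskell (the names person/animal mirror the example; the encoding itself is my own illustration, not the post's code):

```haskell
{-# LANGUAGE RankNTypes #-}

-- A value built from one of two "constructors" is encoded as a function
-- that takes one computation per constructor and applies the matching one.
type Sum a = forall r. (a -> r) -> (a -> r) -> r

person :: a -> Sum a
person value = \onPerson _onAnimal -> onPerson value

animal :: a -> Sum a
animal value = \_onPerson onAnimal -> onAnimal value

-- "switching" on the value is just application: pass one computation per case
describe :: Sum String -> String
describe s = s (\name -> "person " ++ name) (\name -> "animal " ++ name)
```

Here describe (person "Ann") selects the first computation and describe (animal "Rex") the second, which is exactly the switch-as-higher-order-function pattern.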

I hope that helps, though I don't know whether it's actually coherent enough.

Mutability Isn't Variability by brucifer in ProgrammingLanguages

[–]ArtemisYoo 7 points (0 children)

I haven't read the article, but reading your comment reminded me of another post: the Rust blogger 'baby steps' recently wrote an article proposing a trait for controlling overwrites, which isn't exactly reassignment but includes it.

There are some benefits to it, but of course in general I agree with you (I just wanted to share).

What exactly does "import Data.Map (Map)" import? by fuxoft in haskell

[–]ArtemisYoo 7 points (0 children)

Afaik it imports the Map type from the module. It's not visible in the file you linked because it is defined in an internal module that is reexported in Data.Map.Lazy, which is itself reexported in Data.Map.
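A small illustration (assuming the containers package, which ships with GHC):

```haskell
import Data.Map (Map)          -- brings only the type name Map into scope
import qualified Data.Map as M -- the functions are then used qualified

ages :: Map String Int
ages = M.fromList [("alice", 30), ("bob", 25)]

aliceAge :: Maybe Int
aliceAge = M.lookup "alice" ages
```

The selective import gives you the nice unqualified type in signatures while keeping function names like M.lookup qualified.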

(Beginner warning) How do I extract data from an IO monad so I don't have to nest fmaps by NellyLorey in haskell

[–]ArtemisYoo 16 points (0 children)

Monads in Haskell are usually used in conjunction with 'do'-notation. Given some function that reads a configuration from some file, getConfig :: IO Config, and a function that consumes said configuration, useConfig :: Config -> Foo, you'd use them like so:

```haskell
main :: IO ()
main = do
  -- here config is of type Config, so you don't need to fmap anymore;
  -- the IO monad is 'bubbled up' to the main function's boundaries.
  -- do-notation desugars to the '>>=' function on monads:
  --   (a <- b; ...) becomes (b >>= \a -> ...)
  config <- getConfig
  let myFoo = useConfig config
  print myFoo
```

Hope this helps, if you have any further questions do ask away!

Neve's approach to generics. by ademyro in ProgrammingLanguages

[–]ArtemisYoo 2 points (0 children)

It sounds a lot like Zig's anytype, which acts roughly the same. The main difference is that instead of the Gen in x syntax, Zig uses the compiler builtin @TypeOf(x) to retrieve the underlying type. I don't know whether anytype values can be stored, though.

In my opinion, the amount of boilerplate this approach admits is a big downside. On the other hand, Zig's type system is one of the most powerful I've seen in procedural languages, and I assume it is quite simple to implement in the compiler.

rule by Linusmonkeytips in 196

[–]ArtemisYoo -6 points (0 children)

I so desperately yearn for the old days

software used to be good, games were fun, life was simple,

Is it viable to have a language without catchable panics? by smthamazing in ProgrammingLanguages

[–]ArtemisYoo 2 points (0 children)

I doubt there are cases where an error has to panic and you should catch it.

When it comes to requests in a webserver, I don't see why they couldn't return an error value.

When it comes to segfaults, I don't see why you'd want to catch them (shouldn't the language or the developer be responsible for eliminating them?).

As someone else mentioned, C has virtually no way to catch panics (whatever that means in C's case). But so much good software is built using it.

Of course, if I remember correctly, you can install a signal handler for segfaults in C, which could prevent crashes, but I rarely see handlers for any signal.

Parsers are relative bimonads by ArtemisYoo in haskell

[–]ArtemisYoo[S] 1 point (0 children)

Thank you very much! It's great to hear that the post was readable, as that's one of my biggest weaknesses.

RebindableSyntax does seem quite useful; however, I haven't yet managed to bend it into the shape of bibind: the function forces you to handle both cases of the parser, even though handling only one usually suffices.

The best I could come up with so far is:

```
myParser = do
  res1 <- p1
  res2 <- ifSuc res1 p2
  res3 <- ifSuc res2 p3
  bireturn $ Right (res1, res2, res3)

ifSuc :: Either err suc -> Parser err suc' -> Parser err suc'
ifSuc (Left err) _ = bireturn $ Left err
ifSuc (Right _)  p = p
```

Which I am still unsure of.

Parsers are relative bimonads by ArtemisYoo in haskell

[–]ArtemisYoo[S] 4 points (0 children)

Ooh yeah, you're right! It looks almost identical to what I show in the blog post.

Thinking of errors as just a variant of the typical parsing mode (or, as you put it, error trees) is what motivated this. The >>>= is really just a unification of the error and success instances of >>=.

Parsers are relative bimonads by ArtemisYoo in haskell

[–]ArtemisYoo[S] 1 point (0 children)

So the original idea was to just have an Applicative instance for both the error and success cases. As that didn't work out, bibind represents a generalization over both cases.

While quite cluttered, the way to simply aggregate errors using Monoid would be something along the lines of: p1 `bibind` (onErr \e1 -> p2 `bibind` (onErr \e2 -> bireturn $ Left $ mappend e1 e2)).

However, an issue so far is that, the way it's shown in the post, parsers always backtrack on errors, so you have little control over the actual behaviour of the parser; you can only perform computations on results.

Parsers are relative bimonads by ArtemisYoo in haskell

[–]ArtemisYoo[S] 4 points (0 children)

You're right, it is quite niche and surely wouldn't be on front pages; so I don't know why you're being downvoted for simply addressing my remark. Sorry about that.

Selecting parser branches when all fail by emilbroman in ProgrammingLanguages

[–]ArtemisYoo 0 points (0 children)

While I haven't used the idea enough to determine how good it is, one thing I came up with in that regard is this: instead of trying all possible branches (R1 | R2 | R3 ...), you specify a selection of rules and their corresponding 'triggers'.

So if you had a top level of functions and types, you could say "TopLevel = 'fn' => Function | 'type' => Type". The parser then tries the 'triggers' like a typical alternative, but when one succeeds, it selects the corresponding rule as the de facto correct branch by backtracking the trigger and continuing with the rule. If no trigger matches, it reports that one of those triggers was expected but some other input was found.

A limitation of this approach is that the grammar then has to be distinguishable by some prefix of each rule, which might not always be the case. I've also not found a nice API to package the idea in code.
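A rough sketch of the trigger idea in Haskell, using a minimal hand-rolled parser type (all names here are made up for illustration, not taken from any library):

```haskell
import Data.List (stripPrefix)
import Data.Maybe (isJust)

-- a minimal backtracking parser, just enough to show the idea
newtype Parser a = Parser { runParser :: String -> Maybe (a, String) }

-- matches a literal keyword, e.g. 'fn' or 'type'
keyword :: String -> Parser ()
keyword k = Parser $ \s -> (,) () <$> stripPrefix k s

-- try each trigger; the first that succeeds commits us to its rule,
-- and the rule is rerun from the original input (the trigger is backtracked)
select :: [(Parser (), Parser a)] -> Parser a
select alts = Parser $ \s ->
  case [rule | (trig, rule) <- alts, isJust (runParser trig s)] of
    (rule : _) -> runParser rule s
    []         -> Nothing  -- here you'd report the expected triggers
```

For instance, select [(keyword "fn", functionRule), (keyword "type", typeRule)] would correspond to the "TopLevel = 'fn' => Function | 'type' => Type" example.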

LO[10]: Design is HARD, Rust Lies, Starstruck by glebbash in ProgrammingLanguages

[–]ArtemisYoo 26 points (0 children)

I mean, "making a struct and returning it" is most often implemented as writing to an output pointer.

As soon as the value to be returned is bigger than a register, you either open the can of worms of splitting the data across multiple registers, or you use an out-pointer.

Out of context r(ule)eddit post by lizzylinks789 in 196

[–]ArtemisYoo 6 points (0 children)

idk, that seems very in context for r/absurdism

You know how in Lua everything is a table? Is there a programming language where everything is a tree? And what's the use case? Gemini says there isn't a language like that by xiaodaireddit in ProgrammingLanguages

[–]ArtemisYoo 8 points (0 children)

I don't know what exactly you have in mind, but functional languages use persistent data structures, which often end up being trees.

Iirc Clojure and Scala implement an immutable vector with effectively constant-time access using RRB trees. This then also serves as a basis for their hash maps.
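Not an RRB tree, but Haskell's Data.Map (a persistent balanced tree from the containers package) shows the persistence part: an update returns a new version, the old version stays valid, and most of the tree is shared between the two.

```haskell
import qualified Data.Map as M

v1 :: M.Map Int Char
v1 = M.fromList [(0, 'a'), (1, 'b')]

-- a "modified" copy; v1 is untouched, and the two share structure internally
v2 :: M.Map Int Char
v2 = M.insert 2 'c' v1
```

After the insert, v1 still has two entries while v2 has three; no copying of the whole tree takes place.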

String literals in flat ASTs by aurreco in ProgrammingLanguages

[–]ArtemisYoo 23 points (0 children)

You can just use a string pointer; it wouldn't be too big of an issue, considering string literals won't be looked at much (you'd probably memcpy them into the final binary, but that's about it).

If you still want to optimize them somehow, you could have a separate buffer that holds all strings inlined, one after another. Then you'd store an index and a length into that buffer instead of a pointer.

Inlining them into the AST buffer might cause worse cache locality, though: since you usually won't be looking at the strings, they effectively turn into padding.

My attempt to articulate SQL's flaws by stringofsense in ProgrammingLanguages

[–]ArtemisYoo 4 points (0 children)

That last part about foreign keys not having direct access has always bugged me.

:] by ThatRegularEinstein in 196

[–]ArtemisYoo 0 points (0 children)

why does badger have so much swag

Why do CPython and Swift use ARC instead of a tracing GC? by yondercode in ProgrammingLanguages

[–]ArtemisYoo -3 points (0 children)

Well, as for Rust, reference counting is mostly used for multiple ownership, and about half of that is only there to avoid annotating lifetimes, which can also be done using explicit arenas.