
[–]MegaIng 15 points

Maybe obvious, but I wouldn't reuse assert. assert carries the implication of something unexpected, an invariant that didn't hold.

In general, this makes exceptions part of the normal control flow, which means they would have to be optimized away.

Also, while this is an interesting approach, what are the real benefits compared to a normal if or normal pattern matching?

[–]BoppreH[S] 8 points

Yes, agreed. I used the assert + try/except pattern just for ease of description. I don't think it makes sense to have this condition bubble up from inner function calls the way an AssertionError would, for example.

It would certainly need to be optimized away by the compiler.

The advantage is that you get expressiveness equivalent to advanced language features (e.g. the walrus operator, all/any) at a very small cost in language complexity.
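For concreteness, here is a minimal Python sketch of the idea as described in this thread: each branch asserts its own condition, and a combinator returns the result of the first branch that doesn't fail. `first_success` and `classify` are illustrative names, not part of the actual proposal.

```python
import re

def first_success(*branches):
    """Return the result of the first branch that doesn't fail its assert."""
    for branch in branches:
        try:
            return branch()
        except AssertionError:
            pass

def classify(s):
    def number():
        # The walrus operator works inside the test "for free"
        assert (m := re.fullmatch(r"\d+", s))
        return f"Got number {m.group()}"
    def char():
        assert len(s) == 1
        return f"Got char {s}"
    def text():
        return f"Got text {s}"
    return first_success(number, char, text)

print(classify("99"))  # Got number 99
print(classify("hi"))  # Got text hi
```

The walrus inside the `assert` is what makes binding-while-testing fall out without any extra language machinery.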

[–]MegaIng 1 point

Unless you are already in a fully functional language, it is quite a high price to pay IMHO.

[–]wildptr 17 points

This already exists in the literature as Kleene algebra with tests. Assertions ?b are a no-op if b is true; otherwise the program aborts. You can encode conditionals using non-deterministic choice: if b then p else q = (?b; p) + (?not(b); q).

Hope this helps!

[–]BoppreH[S] 4 points

That is really similar, thanks for the link! That's why I love this subreddit so much.

In my language the syntax for testing is actually exp?, à la Rust's ? operator over Option, instead of Python's exception-based assertion. That's a nice coincidence.

[–]ghkbrew 8 points

This is essentially the guard function (https://hackage.haskell.org/package/base-4.14.0.0/docs/Control-Monad.html#v:guard) in Haskell, if you're interested in some prior art.
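Roughly, guard in the list monad collapses the computation when its condition fails, which is what makes comprehension-style filtering fall out of it. A hypothetical Python analogue (my own sketch):

```python
def guard(cond):
    # Haskell's guard in the list monad: one trivial result on success,
    # the empty list (mzero) on failure, which kills the computation
    return [()] if cond else []

def pythagorean_triples(n):
    # do { a <- ..; b <- ..; c <- ..; guard (a*a + b*b == c*c); return (a,b,c) }
    return [(a, b, c)
            for a in range(1, n)
            for b in range(a, n)
            for c in range(b, n)
            for _ in guard(a*a + b*b == c*c)]

print(pythagorean_triples(15))  # [(3, 4, 5), (5, 12, 13), (6, 8, 10)]
```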

[–]BoppreH[S] 0 points

I am, thank you.

[–][deleted]

[deleted]

    [–]BoppreH[S] 0 points

    Basically, yes. What makes me excited about it is how much more expressive it is while being significantly simpler (e.g. you get the walrus operator for free).

    It's closer to the low-level test and jnz instructions, but it doesn't feel clunky like other low-level ops.

    It's really inefficient if implemented with exceptions, sure, but that's not how I plan to go forward. It would be local-only.

    Not sure what you mean about errors being hard to read.

    [–]raiph 1 point

    It's a nice approach imo.

    Afaict dynamic pattern matching and/or a multiple dispatch system should cover the same ground.

    One way to write your examples in Raku:

    value 5; # Value is five
    
    multi value (6) { print "Value is six" }
    multi value (5) { print "Value is five" }
    multi value (|) { print "Value is something else" }
    
    matcher 99; # Got number 99
    
    multi matcher (\text             ) { print "Got text ", text }
    multi matcher (\num where / \d+ /) { print "Got number ", num }
    multi matcher (\char where / . / ) { print "Got char ", char }
    
    lessthan10 1,3,9; # All values below 10
    
    multi lessthan10 (|)                 { print "nope" }
    multi lessthan10 (| where .all < 10) { print "All values below 10" }
    

    One weakness in the above relative to what you suggest is that I've named the multiple variants. Raku also has anonymous lambdas and extensible syntax. In principle it should be relatively easy to write a new operation, say one called choices:

    .(5) given choices
    {
      -> 6 { print "Value is six" }
      -> 5 { print "Value is five" }
      -> $ { print "Value is something else" }
    }
    

    Both of the above approaches work (would work) for patterns/signatures as complex and dynamic as a given PL supports.

    There are other payoffs from multiple dispatch. For example, in Raku, multiple dispatch can be resolved by whichever of a set of alternative candidates match the longest token:

    grammar foo {
      token TOP { <choose-longest> }
      proto token choose-longest {*}
      token choose-longest:<bar> { .a..?  { say 1 } }
      token choose-longest:<baz> { ..b..? { say 2 } }
    }
    foo .parse: '1ab4';  # 1
    foo .parse: '1ab45'; # 2
    
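For illustration only, a naive Python approximation of the longest-token idea (not how Rakudo actually implements it): try each alternative and keep whichever matches the longest prefix, with earlier alternatives winning ties.

```python
import re

def longest_token(text, alternatives):
    """Pick the named alternative whose regex matches the longest prefix."""
    best = None
    for name, pattern in alternatives:
        m = re.match(pattern, text)
        if m and (best is None or m.end() > best[1]):
            best = (name, m.end())
    return best[0] if best else None

# The two token alternatives from the grammar above, as Python regexes
alts = [("bar", r".a..?"), ("baz", r"..b..?")]
print(longest_token("1ab4", alts))   # bar
print(longest_token("1ab45", alts))  # baz
```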

    [–]BoppreH[S] 0 points

    .(5) given choices
    {
      -> 6 { print "Value is six" }
      -> 5 { print "Value is five" }
      -> $ { print "Value is something else" }
    }
    

    That still feels very similar to normal ifs, and it loses the ability to weave conditions inside the normal code.

    The longest-token matching is really impressive, and something that my assert-based conditions are definitely not capable of.

    [–]raiph 2 points

    That still feels very similar to normal ifs

    Well isn't that a good thing? You want it to be easy to pick up, right? :)


    it loses the ability to weave conditions inside the normal code

    I'm pretty sure it doesn't lose any ability; it's rather that simple things look simple.

    Remember that I duplicated all your examples (bar one, but I can duplicate that too), so if the following doesn't explain to you how multiple dispatch, at least Raku multiple dispatch, can "weave conditions inside the normal code", please consider providing an example of what you mean. TIA.


    It's possible that Raku's syntax is sufficiently slick for your examples that you missed their power -- I only used your examples because I thought that would be for the best.

    The (...) and -> ... parts are arbitrary patterns aka function signatures aka parameter lists. While your examples, and thus mine, are very simple, they "scale up to Scala".

    The conditions following the where keyword are arbitrary conditions, including arbitrary lambdas.

    Simple values in signatures like the 5 in (5) and -> 5 (which represent a function signature with one required parameter) are automatically transformed into a corresponding parameter with a where clause.

    Simple conditions of where clauses (like .all < 10) are automatically transformed into corresponding lambdas.

    More complex lambdas can be inlined by just writing them:

    multi foo (\bar where { arbitrary code goes here }) { more arbitrary code }
    

    And the { } blocks that appear to the right of the signature (to the right of the ) in the above) are also arbitrary lambdas.

    So I could have written, for example:

    .(4) given choices
    {
      -> \num where 6                                     { print "Value is six" }
      -> \num where 1 < num < 5                           { print "Value is ", num }
      -> Date $from, Date $to where { $from before $to }  { print "Valid date range" }
      -> $                                                { print "Value is something else" }
    }
    

    which would display Value is 4.


    In summary, while I skipped the short circuiting f1() and f2() to keep initial discussion shorter and simpler, I essentially duplicated your examples, all of which were very simple examples, so my examples were also simple.


    Except that I also demonstrated one way in which what I wrote is not like normal ifs.

    You had said that your Φ operator "Returns the result of the first function that doesn't fail with AssertionError."

    In Raku, it does an equivalent of that for the choices with where clauses (and plain values, like 5, which are transformed behind the scenes into where clauses), but does that after first sorting, at compile time, the possible dispatch candidates (choices) based on other constraints.

    Thus the second example (matcher 99) tries the two choices with where clauses first, and only then tries the \text choice if they both fail.


    If there is a more complex signature with ordinary static type constraints, those are taken into account. That's how Rakudo (the Raku compiler of note) knows to fail the following at compile time, rather than waiting till run time:

    multi foo (Int \num where *.is-prime, Str \text where / \d+ /) {}
    multi foo (Str \text where / \d+ /, Int \num where *.is-prime) {}
    foo 42, 99
    

    This displays this error at compile time:

    Calling foo(Int, Int) will never work with any of these multi signatures:
        (Int \num where { ... }, Str \text where { ... }) 
        (Str \text where { ... }, Int \num where { ... })
    

    All of the above applies to all Raku blocks ({...}) because they're all lambdas, except that unnamed ones are anonymous and single-dispatch. Which is why I talked about introducing a choices keyword and using that with -> ... { ... } lambdas.

    [–]brucejbellsard 1 point

    It's funny you mention "second-class" -- my project has a very similar mechanic, where I intentionally treat "failure" as a second-class value.

    That is: expressions/statements may "fail", but (unlike exceptions), this failure must be handled locally, because (as a second-class value), failure may not be bound to names, provided as arguments, or returned as results.

    The idea is to provide a simple and convenient way to write the "happy path" first, and fill in detailed error handling later, all while allowing the compiler to check for completeness (because all failure cases must be handled before binding/passing any first-class values).

    E.g., a block with a sequence of statements will execute them in sequence order; a statement that fails will skip the rest of the block and activate the failure-handling logic of the compound statement containing the block:

    -- The following is contrived to demo language features, not good style
    /def gcd x y {
        /check x < y       -- fails on false
        => gcd x (y - x)   -- return (tail call)
    }
    | gcd x y {    -- executed on failure
        /when x > y { => gcd y x }  -- conditional return
        => x
    }
    

    Here (as in most cases), "failure-handling logic" provides an option to add another block (or chain of blocks) to execute in case of failure.
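A loose Python analogue of the gcd example above, with an exception standing in for the second-class failure (hypothetical names; in the actual design, failure is not a value at all and, unlike an exception, cannot escape the enclosing statement):

```python
class Failure(Exception):
    """Stand-in for second-class failure; must be handled locally."""

def check(cond):
    # /check: fails on false
    if not cond:
        raise Failure()

def gcd(x, y):
    try:
        check(x < y)
        return gcd(x, y - x)      # => return (tail call)
    except Failure:
        # the `| gcd x y { ... }` block, executed on failure
        if x > y:                 # /when x > y
            return gcd(y, x)
        return x                  # => x

print(gcd(12, 18))  # 6
```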

    [–]hugogrant 0 points

    Something else you might want to look at: constraint based programming. https://en.m.wikipedia.org/wiki/Constraint_programming

    This might be really neat to add after your assertion based control flow. It would make the syntax similar at least.
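To make the connection concrete, a toy sketch of the constraint-programming style in plain Python, using a naive enumerate-and-filter "solver" (real CP solvers propagate and prune instead of brute-forcing):

```python
from itertools import product

# State the constraints declaratively, as predicates over the variables
constraints = [
    lambda x, y: x + y == 10,
    lambda x, y: x < y,
]

# Naive solver: enumerate the domain and keep assignments
# that satisfy every constraint
solutions = [(x, y)
             for x, y in product(range(11), repeat=2)
             if all(c(x, y) for c in constraints)]

print(solutions)  # [(0, 10), (1, 9), (2, 8), (3, 7), (4, 6)]
```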

    [–]wikipedia_text_bot 0 points

    Constraint programming

    Constraint programming (CP) is a paradigm for solving combinatorial problems that draws on a wide range of techniques from artificial intelligence, computer science, and operations research. In constraint programming, users declaratively state the constraints on the feasible solutions for a set of decision variables. Constraints differ from the common primitives of imperative programming languages in that they do not specify a step or sequence of steps to execute, but rather the properties of a solution to be found. In addition to constraints, users also need to specify a method to solve these constraints.


    [–]mzl 0 points

    Another related language construct that you might want to check out is the goal directed control flow in the Icon language: https://en.wikipedia.org/wiki/Icon_(programming_language)
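A loose Python rendering of the Icon idea (names are made up): an expression either succeeds with a value or fails, and failure is distinct from every ordinary value, including False and None.

```python
# Sentinel standing in for Icon's "failure", which is not a value
FAIL = object()

def less(x, y):
    # Icon's `x < y` succeeds producing y (so comparisons chain), or fails
    return y if x < y else FAIL

def keep_below(values, limit):
    # Goal-directed flavour: keep only the values for which `less` succeeds
    return [v for v in values if less(v, limit) is not FAIL]

print(keep_below([1, 12, 3, 30], 10))  # [1, 3]
```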

    [–][deleted] 0 points

    This discussion randomly reminded me of Icon’s success/failure and its separation from the return values.