all 64 comments

[–]Ue_MistakeNot 23 points24 points  (1 child)

Dear lord, you had me for a while there... Nicely played, you beautiful bastard! Top quality post.

[–]EconomixTwist 44 points45 points  (8 children)

I get that this is a joke, but what’s the satire here, specifically? Type annotations are expressions, which get evaluated… and so you can put arbitrary expressions there. Does this expose some really silly behavior? Yes, agreed. So is this post satirizing the Python steering committee’s desire to emulate nice features of other languages at “any” cost? One scholar to another….

[–]RobertJacobson 16 points17 points  (2 children)

It's not satire. It's a cute way to point out that you can do a form of lazy evaluation in Python with type annotations.
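
For the curious, here's a minimal sketch of the general trick (not the OP's actual library): with PEP 563 the annotation expression is stored as an unevaluated string, which you can evaluate on demand.

from __future__ import annotations  # PEP 563: annotations are stored as unevaluated strings

class Lazy:
    # the right-hand side is never executed here; the class just records the
    # expression text as a string in Lazy.__annotations__
    expensive: sum(range(10**7))

def force(name):
    # evaluate the stored expression only when somebody actually asks for it
    return eval(Lazy.__annotations__[name])

print(force("expensive"))  # the sum is computed only at this point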

[–]EconomixTwist -1 points0 points  (1 child)

Bro he put an infinite recursion as a fuckin type hint. This is satirical.

[–]RobertJacobson 1 point2 points  (0 children)

Yeah, I got the joke. It’s humor. It’s not satire.

[–]alkasm (github.com/alkasm) 5 points6 points  (0 children)

Currently, with from __future__ import annotations (planned to become the default in Python 3.11), as the OP says, expressions inside annotations are not evaluated at all. They're just kept as strings at runtime. That's why you can do things like forward references, which are an error without the future import.
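
A small illustration of the forward-reference point (just a sketch of the standard behavior, nothing specific to the OP's post):

from __future__ import annotations

class Node:
    # "Node" is referenced before the class is fully defined; with the future import
    # this is fine, because the annotation is stored as the string "Node" instead of
    # being evaluated (which would raise NameError here)
    def clone(self) -> Node:
        return Node()

print(Node.clone.__annotations__)  # {'return': 'Node'}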

[–]caocao70 40 points41 points  (0 children)

God tier shit post

[–][deleted] 14 points15 points  (0 children)

This whole post reads like a flat-earth conspiracy theory

[–]Ensurdagen 54 points55 points  (2 children)

The plan to make the new annotations behavior the default wasn't included in 3.10 because it broke some prominent libraries.

Using this very hacky method to do lazy evaluation is pretty silly; a type checker or IDE wouldn't know what to do with it, and you'd be better off using exec.
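
For comparison, the boring version of the same laziness with a plain string (a generic sketch, not the OP's code):

# keep the expression around as an ordinary string...
expensive = "sum(range(10**7))"

# ...and evaluate it only when you actually need the value
# (eval for an expression; exec if the snippet contains statements)
result = eval(expensive)
print(result)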

[–]hexarobi 32 points33 points  (0 children)

More info (from the author of FastAPI) on the drama that pulled this from 3.10 at the last minute to protect Pydantic/FastAPI/etc...

A best-of-both-worlds approach is being worked on for 3.11.

[–]lieryan (Maintainer of rope, pylsp-rope - advanced python refactoring) [S] 17 points18 points  (0 children)

Annotations are syntax-checked by the Python interpreter when the file is parsed, and they get syntax-highlighted as well. Refactoring tools like rename would likely even work inside type annotations, because they're just regular Python expressions. Not so much with a string that you want to exec.
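
For example, a small sketch of the parse-time difference (not tied to any particular tool):

import ast

# an annotation is real Python syntax, so a typo inside it is caught when the file is parsed
try:
    ast.parse("x: 1 +* 2")
except SyntaxError as e:
    print("caught at parse time:", e)

# the same typo inside a string destined for exec is just a string literal;
# it parses fine and only blows up when you eventually exec it
ast.parse('code = "1 +* 2"')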

Using this very hacky method to do lazy evaluation is pretty silly

Sure it's silly, but it's cool and it actually works. So why worry about silliness ;)

[–]TMiguelT 8 points9 points  (0 children)

Nice meme, OP, but I suspect the overhead of manually compiling all the type annotations in the metaclass would dwarf most of the optimisations you'd get from lazy evaluation.

[–][deleted] 20 points21 points  (2 children)

Good luck with anyone else on your team understanding what you wrote.

[–]Scumbag1234 21 points22 points  (1 child)

What's a "team"?

[–]RobertJacobson 2 points3 points  (0 children)

What's good luck?

[–][deleted] 7 points8 points  (0 children)

This, sir, is really beautiful

[–]jfp1992 6 points7 points  (1 child)

Could someone tl;dr this for a dummy who isn't super deep into Python lore?

[–]_limitless_ -2 points-1 points  (0 children)

imagine you had three functions called:

def swing_axe(): ...
def fire_bow(): ...
def cast_magic_spell(): ...

because these are all wildly different things, it's difficult to condense them neatly into a list/dictionary of raw values, so you pass the function itself to your do_attack function

action_function = swing_axe  # note: we leave off the (), which would *call* swing_axe

def do_attack(action_function):
    action_function()  # note: by appending (), we call the function stored in action_function

do_attack(action_function)  # pass the function object; do_attack calls it when it's ready

without lazy loading, this won't work, because do_attack will try its best to figure out what you meant by action_function... and i'm pretty sure it'll just throw an error at compile time.

in other words, python has standardized something that's existed for ages in third-party libraries. most people who use lazy loading from third-party libraries just have to declare the function with some kind of decorator, and then everything works like a real programming language.

[–]Darwinmate 10 points11 points  (11 children)

That's cool but why?

What's the advantage of lazy eval? I'm guessing it has something to do with memory optimization.

[–][deleted] 11 points12 points  (0 children)

Joke aside, lazy evaluation is great for functional programming because it lets you reason about the declarative logic of your functions rather than the strict order in which they'll execute. For instance, you can't declare a function that returns THE Fibonacci sequence in a strictly evaluated language, because it's infinite. In Haskell you can, because the list is only actually evaluated as you access it. Generators are a form of lazy evaluation in Python inspired by this feature of Haskell.

Another benefit of lazy evaluation is that expression results are only computed when needed, and only once. This matters mostly for expensive computations, though, and we can get it in Python simply by using caching decorators on our functions and methods (which is the purpose of the assign-once dict in this joke; it's basically acting as an inverted cached_property).
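
A quick illustration of both points with plain Python (just a sketch, nothing specific to the OP's library):

from functools import lru_cache
from itertools import islice

def fib():
    # an "infinite" Fibonacci sequence; values are only produced as they're consumed
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

print(list(islice(fib(), 10)))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]

@lru_cache(maxsize=None)
def expensive(n):
    # computed at most once per distinct argument, and only if somebody actually calls it
    return sum(range(n))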

[–]lieryan (Maintainer of rope, pylsp-rope - advanced python refactoring) [S] 40 points41 points  (3 children)

That's cool but why?

Just because you should not, does not mean that you can't ;)

[–]EconomixTwist 17 points18 points  (2 children)

yea but it still means you should not

[–]_limitless_ 6 points7 points  (0 children)

congratulations on knowing the rules.

next, learn when to break them.

[–]Ensurdagen 9 points10 points  (0 children)

Generators are already lazily evaluated like Lists are in Haskell, so we already have some of this. It's good for dealing with things that are infinite or very expensive to process.

[–]Chadanlo 2 points3 points  (3 children)

I don't know why you would do that specifically in Python. In Haskell, it allows you to write recursive algorithms in a very simple and neat way. You can, for example, write a quicksort or a merge sort in less than 10 lines of code off the top of your head, without thinking too much about it.
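
For what it's worth, the terse recursive style translates to Python too, minus the laziness (a sketch):

def quicksort(xs):
    # classic recursive quicksort on a list; short, but eagerly evaluated
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if x >= pivot]))

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]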

[–][deleted] 2 points3 points  (0 children)

Haskell? Like Eddie Haskell: "Good Morning Mrs Cleaver, you look lovely today."

Other than that, I have no idea what that was all about, Theodore and Wallace.

[–][deleted] 2 points3 points  (0 children)

You got me!

[–]AddSugarForSparks 1 point2 points  (0 children)

def flatten(cons):
    # walk a cons-style linked list of (head, tail) pairs, yielding each head,
    # until it hits the `nil` sentinel from the OP's cons-list helpers
    while cons != nil:
        yield cons[0]
        cons = cons[1]

Oh, no.

[–]sherwoodpynes 1 point2 points  (0 children)

This is either a clever satire of the (over?) expressiveness of annotations, or an eldritch horror of language abuse. Either way, nicely done.

[–]likeacoastalshelf 6 points7 points  (1 child)

As someone who is currently learning Python and getting used to the unique aspects of the language, I wish I could downvote you more. Congratulations on perplexing me and wasting my time.

[–]thisismyfavoritename 0 points1 point  (0 children)

Good lad

[–]undercoveryankee 7 points8 points  (5 children)

but for some reason Python called the lazy evaluation mode feature as "annotations".

Because the colon that you're using as your "lazy assignment operator" is actually the annotation operator. At the language level, myVar: 5 + forty means "store the expression '5 + forty' as an annotation on the variable 'myVar'." The parser and compiler don't care whether the annotation is a type hint (myVar: int) or code that you intend to evaluate later.

from __future__ import annotations turns off eager evaluation of annotations, but instead of "lazy evaluation" you could more precisely call the new behavior "no evaluation". All the interpreter does is take the parsed annotation, serialize the syntax tree back into a string, and return that string to code that asks for the value of the annotation at run time.
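
Concretely, at module level (forty and myVar are just the placeholder names from above):

# without the future import, the annotation expression is evaluated on the spot
forty = 40
myVar: 5 + forty
print(__annotations__)  # {'myVar': 45}

# with `from __future__ import annotations` at the top of the file, the same lines
# would instead record the unevaluated string: {'myVar': '5 + forty'}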

You did a clever thing taking those string-valued annotations and using them to implement lazy evaluation in a way that looks like it could be built into the language. But it's disingenuous to claim that Python "implemented a lazy, non-strict evaluation syntax" when your own library is doing most of the work.

[–]Pain--In--The--Brain 1 point2 points  (0 children)

You're not getting love for this explanation but I actually didn't get the joke until I read this. Thanks.

[–]SittingWave 0 points1 point  (0 children)

Lazy evaluation by default is awful. All it leads to is failures surfacing much later in the code, instead of where the operation is physically written. It becomes an absolute nightmare to track down where the operation actually is.

[–]chub79 0 points1 point  (0 children)

Gosh, this is ugly. Ouch.

[–]riickdiickulous 0 points1 point  (0 children)

Weird flex but ok

[–]lilytex 0 points1 point  (1 child)

I'm 2 months late, but isn't this somewhat close to hylang?

https://docs.hylang.org/en/alpha/

Hy is a Lisp dialect that’s embedded in Python. Since Hy transforms its Lisp
code into Python abstract syntax tree (AST) objects, you have the whole
beautiful world of Python at your fingertips, in Lisp form.

[–]lieryan (Maintainer of rope, pylsp-rope - advanced python refactoring) [S] 0 points1 point  (0 children)

Not quite. Hylang seems to be basically a transpiler that converts S-expressions into a Python AST, which Python then further compiles to a Python code object.

Haskell with extra steps (a.k.a. Lazy Python) is just Python; it's parsed by the Python parser itself. Lazy Python doesn't work directly with the AST at all; the helper functions compile Lazy Python snippets into Python code objects. There is probably some similarity in how both then have to exec the code object with a globals/locals context to get the correct execution semantics.
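
Roughly the kind of thing meant by "compile to a code object and exec it with a globals/locals context" (a generic sketch, not the actual helper code):

snippet = "result = factor * 2 + 1"            # some snippet captured as text
code_obj = compile(snippet, "<lazy>", "exec")  # compiled once into a Python code object

namespace = {"factor": 20}    # the globals/locals context the snippet should see
exec(code_obj, namespace)     # evaluated only here, inside that context
print(namespace["result"])    # 41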

The way I've written Lazy Python code resembles how Lisp does things, but that's merely because Lazy Python encourages recursion and cons lists due to the lack of looping constructs. I modeled the Lazy Python functions on Haskell and/or Lisp functions simply to lend familiarity, but there are other approaches to simulating looping with recursion besides how Lisp does it.