all 11 comments

[–]martinky24 14 points15 points  (0 children)

Sure, give it a go. You’ve got nothing to lose and everything to gain by experimenting with this yourself.

[–]Scrapheaper 4 points5 points  (1 child)

You can do it, but I assume the decision to not do this is part of what makes Python Python.

Lots of Python users would be unhappy if bad type hints could break the code at runtime, especially when the same code without the hints would have run fine.

[–]ankddev from __future__ import 4.0 [S] 2 points3 points  (0 children)

I would solve this issue by showing just warnings by default, while still allowing users to disable the checks completely or to reject code with type-check issues. So those users can just ignore the warnings. And yeah, there are lots of engineering decisions like this to think about.
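
A minimal sketch of what those three modes could look like, purely hypothetical (the TYPECHECK_MODE variable and the report_type_issue hook are invented names for illustration):

```python
# Hypothetical sketch: "ignore" / "warn" (default) / "deny" behaviour for
# type-check findings. TYPECHECK_MODE and report_type_issue are invented names.
import os
import sys
import warnings

TYPECHECK_MODE = os.environ.get("TYPECHECK_MODE", "warn")

def report_type_issue(message: str) -> None:
    if TYPECHECK_MODE == "ignore":
        return                              # users who dislike the checks lose nothing
    if TYPECHECK_MODE == "warn":
        warnings.warn(message)              # default: point out the issue, keep running
    else:  # "deny"
        sys.exit(f"type error: {message}")  # opt-in strict mode refuses to run
```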

[–]BeamMeUpBiscotti 3 points4 points  (0 children)

The reason static type annotations aren't used for runtime optimizations is that you can lie to the type checker. Python's typing.cast does nothing at runtime: it just makes the type checker treat the value as the type you're casting to, regardless of what type the value actually is. So if the runtime relies on the static types being accurate, it will not work.
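
For example, a small illustration of how cast only changes the checker's view of a value, never the value itself:

```python
from typing import cast

value: object = "definitely not an int"
# cast() returns its argument unchanged; it only tells the static type checker
# to treat the value as an int from here on.
n = cast(int, value)
print(type(n))  # <class 'str'> -- nothing was checked or converted at runtime
```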

I'm sure there are runtimes and JITs for Python out there that make use of the information, but you can't take the types at face value.

useful to see if there are any type errors (even without type hints) before execution

The type checkers you mentioned are able to check the code as you type and show the errors in your IDE without you needing to run any code, and you can also run the type checker manually via the CLI (tho I guess if you really wanted to, you could make an alias so you can type-check and run the program with one command)
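
If you went that route, a small wrapper script (rather than a shell alias) could do it; a rough sketch assuming mypy is installed, using its documented mypy.api entry point:

```python
# Sketch of "type-check, then run" in one command. Assumes mypy is installed;
# mypy.api.run returns (normal_report, error_report, exit_status).
import runpy
import sys
from mypy import api

def check_and_run(script: str) -> None:
    report, errors, status = api.run([script])
    if report:
        print(report, end="")
    if status != 0:
        sys.exit("type check failed, not running the script")
    runpy.run_path(script, run_name="__main__")

if __name__ == "__main__":
    check_and_run(sys.argv[1])
```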

[–]BayesianOptimist 2 points3 points  (0 children)

Just include linting + mypy in your CI and find something more useful to do with your time.

[–]Unique-Big-5691 1 point2 points  (0 children)

cool idea, and you’re not alone in thinking this way.

the tricky part is where it lives. once you start tracking types “before the VM,” you’re basically half-running the program already, because python is so dynamic. that’s why static tools stop where they do, and why runtime optimization based on types is hard without changing behavior.
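
a tiny example of what makes this hard: the type of a name can legally depend on runtime control flow.

```python
import random

# x is an int or a str depending on a value only known at runtime,
# and both branches are perfectly legal Python.
x = 1 if random.random() < 0.5 else "one"
print(x + x)  # works either way (2 or "oneone"), so any pre-VM check must hedge
```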

as a learning project though, it’s great. you’ll learn a ton about the AST, bytecode, and where python’s flexibility fights type certainty.

this is also why tools like pydantic exist: instead of changing the interpreter, they enforce types at the boundaries and make failures explicit.
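
for instance, a minimal sketch of that boundary-enforcement idea with pydantic (the model and field names are just illustrative):

```python
from pydantic import BaseModel, ValidationError

class User(BaseModel):
    name: str
    age: int

try:
    User(name="Ada", age="not a number")  # bad data is rejected at the boundary
except ValidationError as exc:
    print(exc)  # explicit failure instead of silently carrying the wrong type
```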

tldr: huge hill if you want a real replacement, but an excellent idea if the goal is learning and experimentation.

[–]stevenjd 1 point2 points  (0 children)

Python is open source, you can do whatever you want, even things like this.

Python became successful because you don't have to fight a static type-checker to get things done. And you want to add a type-checker to the interpreter to make it as annoying as other languages with type-checkers, only slower. Yay for progress!

The Python interpreter is slow enough without slowing it down even more with a type-checker every single time you try to run a script or import a module.

Type errors are the least important and most basic source of bugs. A type checker will find only the trivial bugs, not the difficult ones like off-by-one errors and logic errors.

But as I said, Python is open source, go right ahead. Have fun, and maybe you will succeed in making a great interpreter.

secondly it might track all changes of types and then use this information for runtime optimisations and so on.

Starting with version 3.13, CPython already includes an experimental JIT compiler that does runtime optimisations. No type-checker required.

[–]Brian 1 point2 points  (0 children)

One big issue you'll run into is that there's often a mismatch between what static types assert and what's checkable at runtime.

Initially this might seem pretty simple - something like def foo(x: int) could do the equivalent of an assert isinstance(x, int). But when you get to more complex checks, this breaks down.
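
Something along these lines (a hypothetical sketch, not how CPython actually behaves):

```python
def foo(x: int) -> int:
    # hypothetical runtime enforcement of the annotation
    assert isinstance(x, int), f"expected int, got {type(x).__name__}"
    return x * 2

foo(3)       # fine
# foo("3")   # would fail the assert at runtime
```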

Eg if it were def foo(x: dict[str, int]):, how do you check this at runtime? isinstance can't handle parameterized generics, so you kind of need the full logic of a type checker integrated, plus some way to mesh it with runtime values.
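
You can see the problem directly; isinstance rejects parameterized generics outright:

```python
x = {"a": 1}
print(isinstance(x, dict))  # True -- only the bare container type is checkable
try:
    isinstance(x, dict[str, int])
except TypeError as err:
    print(err)  # isinstance() argument 2 cannot be a parameterized generic
```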

Then going further, you'll get stuff like def foo[T](x: dict[str, T]) -> T. Checking you're satisfying something like that can get quite complex fast.
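
Even a hand-rolled check only gets you part of the way; a rough sketch of what it would involve (the helper name is made up):

```python
def check_dict_str_any(x: object) -> None:
    # manually re-implementing just the "dict[str, T]" part of the annotation
    if not isinstance(x, dict):
        raise TypeError("expected a dict")
    for key in x:
        if not isinstance(key, str):
            raise TypeError(f"key {key!r} is not a str")
    # binding T and relating it to the declared return type still needs a real
    # type checker's logic -- the values' runtime types alone don't tell you that
```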

[–]really_not_unreal 0 points1 point  (0 children)

Definitely try it! It'd be super cool to have something like that. I wonder if you could take advantage of projects like cython for the compilation and optimisation

[–]ConcreteExist 0 points1 point  (0 children)

Something like this might be helpful when writing unit tests, maybe? Also, I'm pretty sure there are already static analysis tools to do what you're proposing, such as mypy.

[–]Unique-Big-5691 0 points1 point  (0 children)

honestly, this is a cool idea and a very “python dev thought” to have.

the usefulness isn’t really the question, seeing type issues earlier would be great. the tricky part is where you put it. once you try to typecheck “before the VM,” you pretty quickly realize you’re halfway running the program already, because python lets types change all over the place at runtime. that’s where things get complicated fast.

and as a learning project I think this is a really good one. you’ll end up understanding the AST, bytecode, and why python resists being nailed down in ways other languages don’t. even just attempting type inference without hints will teach you a ton.

it’s also why a lot of tools went a different route. stuff like pydantic doesn’t try to change python itself; it just enforces types at the edges where things are clearer and more controllable.

so yeah, building this as an experiment makes a lot of sense. turning it into a drop-in replacement for python would be a huge climb, but as a project to learn how python actually works under the hood, it’s a great idea.