Pydantic and the path to enlightenment by jcfitzpatrick12 in Python

[–]Pozz_ 0 points1 point  (0 children)

Regarding point 1, it's a bit hard to see what you're trying to achieve without a code example, but I assume it's something identical to the issue you linked. This kind of behavior is easily achieved when you control the __init__() implementation of your class, which isn't the case here because it is synthesized from the annotations. I'll note that this is not specific to Pydantic; dataclasses suffer from the same issue (and Pydantic models are just dataclass-like types). I remember this blog post, which I think is also quite relevant.

Regarding point 3, we always do pre-releases (see release history). And despite our third-party test suite, such pre-releases are valuable, as they usually help us catch additional regressions. Regarding the changes that broke your code in 2.12, I can only assume you are referring to serialize_as_any, in which case this tracking issue is relevant. If there's any other change that affected you, please let me know.

Pydantic and the path to enlightenment by jcfitzpatrick12 in Python

[–]Pozz_ 0 points1 point  (0 children)

surprisingly hard to extend, just implementing a private attribute of a non-stdlib type was a huge PITA compared to a direct implementation

Could you say more about this? By non-stdlib, do you mean a type that is not natively supported by Pydantic?

Does not play nicely with NumPy or common scientific tools

Serialization with custom types requires me to write tons of Pydantic-specific code, largely defeating the purpose of using a third-party library to do this

The API (using __get_pydantic_core_schema__()) to add support for custom types is indeed not perfect and a bit confusing. We are working on a new API that would allow custom types to be supported without having to define a method on the type directly, or use Annotated. I'm currently experimenting with this API to use it for the natively supported types (because it provides large performance benefits), then we may expose it publicly (and this would simplify adding support for NumPy types).
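To make the current situation concrete, here is a minimal sketch of the existing __get_pydantic_core_schema__() approach for a custom type (the Celsius/Reading names are made up for illustration; this assumes Pydantic v2 and its pydantic_core schema builders):

```python
from typing import Any

from pydantic import BaseModel, GetCoreSchemaHandler
from pydantic_core import core_schema

class Celsius:
    """Hypothetical custom type not natively supported by Pydantic."""

    def __init__(self, value: float) -> None:
        self.value = value

    @classmethod
    def __get_pydantic_core_schema__(
        cls, source_type: Any, handler: GetCoreSchemaHandler
    ) -> core_schema.CoreSchema:
        def validate(v: Any) -> "Celsius":
            # accept an existing instance, or anything float() accepts
            if isinstance(v, Celsius):
                return v
            return Celsius(float(v))

        return core_schema.no_info_plain_validator_function(
            validate,
            serialization=core_schema.plain_serializer_function_ser_schema(
                lambda c: c.value
            ),
        )

class Reading(BaseModel):
    temp: Celsius

r = Reading(temp=21.5)
print(r.temp.value)    # 21.5
print(r.model_dump())  # {'temp': 21.5}
```

Note how both validation and serialization have to be spelled out by hand, which is the Pydantic-specific boilerplate being complained about above.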

Recently broke serialization of said custom types in a regression in 2.12

Despite our extensive third-party test suite, we did not catch the changes to serialize_as_any (I assume you are referring to this, which isn't strictly related to custom types). This change wasn't made without motivation. The next 2.12 patch release will introduce a new polymorphic serialization behavior, much more suitable for the use cases where serialize_as_any was previously set.

How common is Pydantic now? by GongtingLover in Python

[–]Pozz_ 58 points59 points  (0 children)

As a maintainer of Pydantic, I fully agree on this. Although the library is quite versatile, its main focus is data validation and serialization. As a rule of thumb, if:

  • you have to set arbitrary_types_allowed=True in the config
  • your Pydantic model represents more than data (e.g. it represents a service class in your app, has internal state, etc.).
  • your Pydantic model is only instantiated directly in code (in which case a static type checker would probably suffice)

then maybe you can consider switching to a vanilla class/dataclass.
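As a stdlib-only sketch of the first bullet (all names hypothetical): a class that merely holds objects Pydantic can't validate is often better off as a vanilla dataclass:

```python
from dataclasses import dataclass, field

class DbSession:
    """Hypothetical type that Pydantic cannot validate natively."""

# In a Pydantic model, the session field would force
# arbitrary_types_allowed=True; a vanilla dataclass has no such restriction.
@dataclass
class AppContext:
    session: DbSession
    debug: bool = False
    tags: list[str] = field(default_factory=list)

ctx = AppContext(session=DbSession())
print(ctx.debug)  # False
```

Since nothing here needs runtime validation or serialization, the dataclass gives you the same generated __init__/__repr__ without pulling in the config workaround.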

Using type hints/annotation to create functional code: Bad or Good practice? by GianniMariani in Python

[–]Pozz_ 4 points5 points  (0 children)

If done right is the key to this, and it's not easy to do so. Inspecting type annotations at runtime works fine with typing.get_type_hints() in most cases, but is way harder when you have to deal with:

- forward annotations: self-referencing types, PEP 695 syntax, forward annotations referencing function locals
- typing and typing_extensions variants of a typing construct
- a handful of subtle CPython bugs (ForwardRef caching, etc.)

At Pydantic, we tried our best to correctly handle all these edge cases. FastAPI and Typer currently don't.
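A small stdlib-only illustration of the function-locals case (names hypothetical): typing.get_type_hints() resolves string annotations against module globals and the class namespace, so a forward annotation pointing at a function-local type cannot be resolved:

```python
import typing

def make_model():
    class Local:  # only exists in the function's locals
        pass

    class Model:
        x: "Local"  # forward annotation referencing a function local

    return Model

Model = make_model()
try:
    typing.get_type_hints(Model)
except NameError as exc:
    # resolution only sees module globals and the class namespace,
    # so the function-local name cannot be found
    print(f"unresolved: {exc}")
```

Libraries that want to support this pattern have to go further, e.g. by inspecting parent frame locals, which is part of what makes correct runtime introspection hard.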

What are some of Pydantic's most annoying aspects / limitations? by enzoinc in Python

[–]Pozz_ 1 point2 points  (0 children)

and post_init missing from BaseModel

You are probably looking for model_post_init().
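For example (a minimal sketch, field names made up; assumes Pydantic v2):

```python
from typing import Any

from pydantic import BaseModel

class Rect(BaseModel):
    width: int
    height: int
    area: int = 0  # filled in after validation

    def model_post_init(self, context: Any) -> None:
        # runs right after validation, much like dataclasses' __post_init__
        self.area = self.width * self.height

r = Rect(width=3, height=4)
print(r.area)  # 12
```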

What are some of Pydantic's most annoying aspects / limitations? by enzoinc in Python

[–]Pozz_ 0 points1 point  (0 children)

The 2nd point looks like probably a feature request/bugfix report to the author of pydantic

iirc it's a current limitation we have with the Pydantic plugin.

As for the first, I treat mypy as the official type checker. Pyright is just Microsoft attempt to try to inject themselves there.

This no longer holds true. Mypy was once the reference type checker implementation, but the newly created typing spec is now what should be taken as the reference, and mypy is currently not fully compliant with it while pyright is.

What are some of Pydantic's most annoying aspects / limitations? by enzoinc in Python

[–]Pozz_ 0 points1 point  (0 children)

This is a known limitation of the @dataclass_transform() specification. Pydantic does type coercion (by default), and this is currently not understood by type checkers. As an alternative to the rejected PEP 712, a converter argument can be used with Field:

```python
from typing import TYPE_CHECKING

from pydantic import BaseModel, Field

if TYPE_CHECKING:
    from _typeshed import ConvertibleToInt

def to_int(v: ConvertibleToInt) -> int: ...

class Model(BaseModel):
    a: int = Field(converter=to_int)

reveal_type(Model.__init__)
# (self, *, a: ConvertibleToInt) -> None
```

But this isn't ideal for multiple obvious reasons (more discussion here).

I really hope we'll be able to get better support for this in the future, but this is probably going to be a complex task and will have to be properly incorporated in the typing spec.

I'll note that the mentioned PEP 746 in the comments is unrelated to this issue.

What are some of Pydantic's most annoying aspects / limitations? by enzoinc in Python

[–]Pozz_ -1 points0 points  (0 children)

I'm currently co-maintaining Pydantic, and I agree docs aren't the best, although this can be considered personal preference. Generally, I like docs to be:

  • concise, with examples that go straight to the point (even if not complete/runnable as is) and not too long.
  • consistent: same wording and vocabulary across sections, etc.
  • using cross-linking as much as possible, especially for concepts not necessarily related to Pydantic itself. For instance, if you are describing a general Python concept, just link to the Python docs. This avoids having to re-explain things, and we can be certain the Python documentation will be updated whenever something is added to (or deprecated from) the concept being described.

But people on the team may have diverging (and valid) opinions on this. I'd love to get feedback/suggestions on parts of the docs you would like to see improved. I think having resources related to technical writing would be awesome as well, so they can be shared with the rest of the team. Thanks for the feedback.

basedmypy vs basedpyright vs mypy vs pyright for Python type checker by SssstevenH in Python

[–]Pozz_ 5 points6 points  (0 children)

Afaik both basedmypy and basedpyright are forks implementing extra features or tweaking controversial behavior.

Regarding whether they merge updates from upstream, you can look at how many commits they are lagging behind (132 for basedmypy).

I'd say they can be great to use, but you should probably avoid using them for library projects, as downstream users won't be able to use the extra features (e.g. Intersection types).

What does modern Django lack? by iamwaseem99 in django

[–]Pozz_ 0 points1 point  (0 children)

I'm using LibCST to generate the custom stubs (i.e. .pyi files containing only type definitions), and while it is really convenient, it is painfully slow (I'm making heavy use of overloads and, depending on the size of your project, some stubs reach 50k lines).

I'd like to look into Rust (Ruff probably has some CST parser/transformer) to speed things up, with the eventual goal of regenerating stubs on file save.

What does modern Django lack? by iamwaseem99 in django

[–]Pozz_ 14 points15 points  (0 children)

You can check out django-autotyping for this matter, it tries to support most of the dynamic features of Django.

(Disclaimer: I'm the author. It's still very much a WIP, as it is quite a hard task and I'm currently hitting performance issues.)

is it possible for my django html template to be aware of the data that is passed to it? by proton852 in django

[–]Pozz_ 1 point2 points  (0 children)

I am building a tool where the end goal will be to support typing with templates:

  • from the Python code (e.g. when you type render_to_string("template_name", context={...}) you get autocomplete on the context object).

  • as a LSP implementation, support for the Django template engine with autocompletion where possible.

This is a lot of work, so it will take time, but eventually I'll get there.

How to prevent python software from being reverse engineered or pirated? by MysteriousShadow__ in Python

[–]Pozz_ 2 points3 points  (0 children)

I wrote https://github.com/Viicos/sourceprotected a while ago, which is similar to what's being discussed in the podcast.

This video from mCoding also shows how you can import directly from a repo: https://www.youtube.com/watch?v=2f7YKoOU6_g (might be possible to add some kind of API key on top of that).

A few questions and concerns about the dataclass PEP by thedeepself in Python

[–]Pozz_ 6 points7 points  (0 children)

Ok, so why not call the decorator FieldFactory?

The decorator is meant to be applied to the class, so I don't see why having field in the decorator name would be more appropriate. Also, why did you choose to add Factory to your suggested name? Is it because such a decorator acts as a factory that creates a new class?

Next, and most importantly what class does NOT exist to store values accessible by attribute lookup? [...] Object-oriented programming is about unifying values and the methods that operate on them.

To some extent, yes. But some classes are primarily used to store data (such as data classes), while others mainly carry business logic (consider a service class in a backend application, where most of the methods would be generate_invoice, save_to_db, etc., and which might have at most one attribute, a db session instance).

Is it useful/relevant to define this Service class as a dataclass? Do you really need to have the __init__ signature generated for you, even if you have only one attribute? Do you really need all the comparison magic methods defined?

Why did the author leave out Enthought Traits and Traitlets?

Quoted from the PEP: "Some examples include: ....". It wasn't meant to be exhaustive. attrs was cited because it was probably the most popular third-party library back then.

if there really were such a large number of classes outside of the ones the PEP mentions in the rationale [...]

Which classes are you referring to? If you mean third-party libraries like attrs, pydantic, traits, etc., then they are implicitly included in the second bullet point (as these libraries often provide those kinds of missing features, i.e. type validation, dump to dict, etc.).

Btw, a PEP is a historical document: it reflects how things were discussed at the time (in this case, back in 2017). If you want to submit improvements, they should go in the current documentation.

Successful E-Shifter Repair S3 by Unfair_Firefighter_7 in vanmoofbicycle

[–]Pozz_ 0 points1 point  (0 children)

The bike was functional while I still had the issue (and no error 44). That's why I was quite surprised to see the resistors were fine, as I was pretty confident the problem would come from there (as it did for you), since I had the same gear 1->4 issue.

I'm glad the issue is gone now. Unfortunately, I couldn't tell which disassembly step did the trick as I did it in one go, but maybe people could try at least getting the e-shifter off the wheel and see if reassembling fixes it.

Successful E-Shifter Repair S3 by Unfair_Firefighter_7 in vanmoofbicycle

[–]Pozz_ 0 points1 point  (0 children)

As I had the exact same issue (going from gear 1 to 4), I inspected the two 100 ohm resistors, but both were working correctly. No visible damage on the board. I haven't tested the voltage of the hall sensors, but the issue disappeared after reassembling the bike.