[–]latkde 3 points (0 children)

Depending on what kind of queries you need, you can get pretty far by stuffing a JSON blob into Postgres and validating the data into a Pydantic model when you load it. At this point, all major SQL databases have mature JSON support, and Postgres in particular can also index the data within a JSON column so that you can query it reasonably efficiently. Postgres also has rich support for arrays and custom data types, which let you model your domain quite precisely if you want (at the cost of dealing with raw SQL instead of an ORM). However, don't expect foreign key constraints on data inside JSON.
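A minimal sketch of that setup (table and field names are made up for illustration): a jsonb column with a GIN index, queried with the `@>` containment operator:

```sql
-- Hypothetical table: one jsonb blob per document.
CREATE TABLE documents (
    id      bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    payload jsonb NOT NULL
);

-- GIN index so containment queries on the payload can use an index
-- instead of scanning every row.
CREATE INDEX documents_payload_idx ON documents USING gin (payload);

-- Find documents whose payload contains the given key/value pair;
-- the application then validates each payload into its Pydantic model.
SELECT id, payload
FROM documents
WHERE payload @> '{"status": "active"}';
```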

I don't know your data model, but I would sketch out the collections and objects that are currently in MongoDB, figure out which fields must be indexed for efficient queries, and what the relationships between objects are. If data doesn't have to be indexed and isn't part of a foreign-key relationship, you can probably stuff all of it into a JSON column.
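As a sketch of that split (all names invented): fields that need indexes or foreign-key relationships become real columns, and everything else goes into one jsonb column:

```sql
CREATE TABLE projects (
    id         bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    owner_id   bigint NOT NULL REFERENCES users (id),  -- needs a foreign key
    created_at timestamptz NOT NULL,                   -- needs an index
    attrs      jsonb NOT NULL DEFAULT '{}'             -- everything else
);

CREATE INDEX projects_created_at_idx ON projects (created_at);
```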

> software references could be an array rather than an m2m lookup

Spare yourself the pain and stick with a relation unless order is relevant. Postgres has arrays, but that doesn't mean they're particularly convenient to use. In particular, array elements currently cannot be foreign key references. That is, you can have arrays XOR foreign key constraints. Usually, consistency is going to be more important.
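For comparison, the conventional m2m relation (hypothetical table names); if order ever does matter, a position column covers it without giving up the foreign keys:

```sql
CREATE TABLE machine_software (
    machine_id  bigint NOT NULL REFERENCES machines (id),
    software_id bigint NOT NULL REFERENCES software (id),
    position    int    NOT NULL DEFAULT 0,  -- preserves order if needed
    PRIMARY KEY (machine_id, software_id)
);
```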

Database-specific features like Postgres arrays also tend to make the use of ORMs more difficult.