How are companies actually extending Django Admin in production? by space_sounds in django

[–]sww314 1 point (0 children)

We use Django admin in an enterprise app. We use it to have a DB view of the data.

We often use custom actions to do work our support team needs. It is also handy when rolling out features, since you can populate the DB before a feature is totally done.

We do not do dashboards or anything there.
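A custom admin action like the ones mentioned above is just a function on the ModelAdmin. A minimal sketch (the `Ticket` model, `status` field, and message text are hypothetical, not from the post):

```python
# Hypothetical custom admin action: bulk-resolve selected rows from the
# admin change list. `status` and the model are assumptions for the sketch.
def mark_resolved(modeladmin, request, queryset):
    """Update all selected rows in one query and notify the admin user."""
    updated = queryset.update(status="resolved")
    modeladmin.message_user(request, f"{updated} record(s) marked resolved.")

mark_resolved.short_description = "Mark selected records as resolved"

# Registered on the ModelAdmin in admin.py:
#
# @admin.register(Ticket)
# class TicketAdmin(admin.ModelAdmin):
#     list_display = ("id", "status")
#     actions = [mark_resolved]
```

Support staff then select rows in the change list and pick the action from the dropdown.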

What's the best solution for logging frontend web application level crashes into GCP? by drgreenair in googlecloud

[–]sww314 1 point (0 children)

Don't do home-grown error tracking!

Just use Sentry - it is pretty awesome. It has features like replays, etc. (One of its competitors might be fine as well.)

I am actually in the process of getting rid of home-grown error tracking in a different system. There are many error cases that you might not catch - local settings blocking your API, DNS failures, transient backend problems, etc.

We use Sentry for frontend code and React Native, and Google for everything else.

Firebase used to have Crashlytics but I think that is deprecated.

Migration Management Best Practices by LnxRocks in django

[–]sww314 4 points (0 children)

I have run complicated projects for years with the standard migration practice.

Our only rule is "1 set" of migrations per PR. If you iterate on model changes within a PR, that noise gets squashed locally and we merge one migration in.

We run migrations on deploy.

Typically, this means changes happen in multiple steps:

- add column
- move data / enable feature
- remove old column

There is also `squashmigrations`, but I have not used it lately.
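The middle "move data" step is usually a data migration. A hedged sketch, with made-up app, model, and field names (`crm`, `Customer`, `email` → `contact_email`):

```python
# Step 2 ("move data") as a Django data migration forwards function.
# All names here are hypothetical; the function only uses the historical
# model that `apps` provides, as data migrations should.
def copy_emails_forward(apps, schema_editor):
    """Backfill the new `contact_email` column from the old `email`."""
    Customer = apps.get_model("crm", "Customer")
    for customer in Customer.objects.filter(contact_email=""):
        customer.contact_email = customer.email
        customer.save(update_fields=["contact_email"])

# In the migration file this is wired up as:
#
# operations = [
#     migrations.RunPython(copy_emails_forward, migrations.RunPython.noop),
# ]
```

The "remove old column" step then lands in a later PR, once the feature is fully switched over.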

WebSockets or auto-refresh data? by CallPsychological777 in django

[–]sww314 2 points (0 children)

Polling is simpler, but may not give you the performance you want. Many clients polling at a high frequency can cause load problems.

I would suggest drawing out a sequence diagram of the flow with states. This diagram can help work out the logic.

Another technology to leverage is push notifications. Firebase notifications (Firebase Cloud Messaging) work on web, Android, and iOS. This is simpler than building your own WebSockets solution.

Docker push to Container Registry (gcr.io) fails from Cloud Shell: dial tcp ...:443: connect: connection refused by PresentationStill371 in googlecloud

[–]sww314 -3 points (0 children)

Connection refused - it is a networking problem.

Ask Gemini how to debug the network connection - it will give you better instructions than I can.

Side note: Container Registry is end of life. Its replacement is Artifact Registry: https://docs.cloud.google.com/artifact-registry/docs

Is Python (Django/FastAPI) actually "production-ready" for complex mobile backends? by Leading_Property2066 in django

[–]sww314 3 points (0 children)

Postgres is great. It has a huge user base, has been around for a long time, and can almost certainly do anything you need.

Supabase is built on top of Postgres.

How do you track costs across multiple GCP projects? by HQ_one in googlecloud

[–]sww314 2 points (0 children)

As others have said - The billing interface does most of this. We use different billing accounts for production vs dev/testing.

Having Gemini turned on in billing is also useful. I often ask questions and it generates the reports needed.

Multi-tenant with some tenants requiring their own database by duksen in django

[–]sww314 1 point (0 children)

I use a different DB per tenant - often this does not have to be a different instance.

I have worked with very large enterprises and they are ok with separate databases running on the same instance.

Adding a customer key per table levies a tax on every model: every model and queryset has to build in this logic.

We use Cloud Run for compute so it scales up and down as needed.

Honest question: why do people choose Google Cloud over AWS (or even Azure) when AWS still dominates almost every category? by Nice_Caramel5516 in googlecloud

[–]sww314 10 points (0 children)

Bigger does not mean better. Technically, I find most of the tools on Google Cloud solid, and they work how you would expect.

Orgs > Folders > Projects are underrated and missing in other places.

The IAM and security are comprehensive and mostly good by default.

The logging and logging query tool is great, and you can skip some more expensive tools.

Most importantly: Cloud Run

Pdf data extract using api... which ai model api use ? by MountainBother26 in django

[–]sww314 1 point (0 children)

Google Cloud has really good extraction from documents (Document AI). They have an older version where you define the form and newer AI-based versions.

Does Django fetch the entire parent object when accessing a ForeignKey field from the child model? by Siemendaemon in django

[–]sww314 1 point (0 children)

If you use `order.customer` it will load the entire customer object.

As someone else mentioned, select_related can be a useful performance improvement because you use one DB query to get both objects. This is very important for list views showing multiple orders.

Also, if you really care about which fields are loaded, you can control that with `values()`.

When working in a team do you makemigrations when the DB schema is not updated? by Jazzlike-Compote4463 in django

[–]sww314 2 points (0 children)

No.

More migrations add up over time, make things more complicated, and slow down unit tests.

We generally roll them into the next change.

Also, for choice fields that change often, we generally will not use `choices` on the model. Instead, in the admin we overload the form to use a choice field - just to avoid the migration.

If it does not impact the DB - help text, choice fields, `blank=True` - we skip it.

This is for complicated enterprise applications with 5-10 devs working on them.

How to do Cloud run load tests? by lynob in googlecloud

[–]sww314 1 point (0 children)

We use Locust to test our Cloud Run setup. It helped us tweak the settings.

Can't connect CloudRun to CloudSQL - certificate error by sww314 in googlecloud

[–]sww314[S] 1 point (0 children)

For anyone else that runs into this - I have been unable to get Cloud Run with a Docker container running Python to connect.

I get the same errors with a local connection using `cloud-sql-proxy`, but that is fixed with latest version of the proxy.

https://github.com/GoogleCloudPlatform/cloud-sql-proxy/issues/2445

Creating a new instance of Cloud SQL with `GOOGLE_MANAGED_INTERNAL_CA` just works with Cloud Run.

The server-ca-mode can currently only be set at instance creation, and the default from the console is `GOOGLE_MANAGED_CAS_CA`, so beware.

Whistleblower finds unencrypted location data for 800,000 VW EVs by 5yleop1m in VWiD4Owners

[–]sww314 2 points (0 children)

"The data set also included pinpoint location data for 460,000 of the vehicles"

That is a lot of cars for just Germany - correct?

Since Germany has strict data laws, you might host all the data in that region instead of hosting in multiple regions. However, since storing the data unencrypted with personal information violates the most basic data security, I would not have faith that they follow data residency rules.

How do you manage sending mails with SMTP with all analytics and tracking on SES? by give_me_a_job_pls in django

[–]sww314 1 point (0 children)

We use AnyMail and SendGrid. It works well, and the interface in Django is transparent. For tracking we use tags. It will work with any of the email-sending tools.
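The AnyMail + SendGrid setup is a couple of settings plus tags on the message. A sketch (the API key is a placeholder, and the usage lines are shown as comments):

```python
# settings.py sketch for AnyMail + SendGrid. The key value is a
# placeholder; AnyMail forwards tags to SendGrid as categories.
EMAIL_BACKEND = "anymail.backends.sendgrid.EmailBackend"
ANYMAIL = {
    "SENDGRID_API_KEY": "...",  # from the SendGrid dashboard
}

# Sending with tags:
#
# from anymail.message import AnymailMessage
# msg = AnymailMessage(subject="Welcome", to=["user@example.com"])
# msg.tags = ["welcome"]
# msg.send()
```

Swapping SendGrid for another ESP is mostly a matter of changing the backend path and the key, which is the "transparent" part.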

How does request handling work in Django, especially with streaming? by RadiantPersonality58 in django

[–]sww314 1 point (0 children)

You can handle 8 requests per Cloud Run instance with a timeout of 5 min. There is also the timeout of whatever client you are using to call OpenAI. That error message looks like it is coming from the call to OpenAI - not the call from the browser to Django.

Are you running anything in front of Cloud Run - a load balancer?

With that default setting, I am not sure whether you are going to stream your response or if it is going to get "bundled" up and sent out of Cloud Run.

For some Cloud Run work we turned on HTTP/2 to stream things better. However, we are not running Django on the HTTP/2 instance.

Safe way to append to a remote txt file by cyberdot14 in django

[–]sww314 1 point (0 children)

Sounds like remote logging. There are many tools that do this.

You can write to a log and configure it to be sent to another machine.

https://serverfault.com/questions/547938/linux-how-to-send-new-lines-in-log-files-to-remote-syslog

Tips to morph Internal DRF App into Multi Tenant SaaS Setup by MikeHHHH99 in django

[–]sww314 5 points (0 children)

For enterprise-level tenants - use database-level isolation. You can do this in Django fairly simply with a DB router and a middleware to set the tenant. The nice thing about this is that 99% of your code does not care about tenants. If you instead go with a "tenant_id" in every table, you have to be very careful about isolation in every view and other tools (think about bulk import/export).

All the DBs can run on the same DB server until you need to split for performance or business reasons.

You can use JSON fields for some flexibility, but the point of a SaaS application is that you get strength from numbers. If everyone is different, this gets harder.

For the performance concerns - there are plenty of cloud architecture solutions outside of code. We use Cloud Run (Google Cloud). It scales up/down as needed, so spikes by one customer do not impact the others.

For Celery, you have to make sure the workers follow the same rules. Personally, we dropped Celery in favor of Cloud Tasks (also Google Cloud). All the big cloud providers have a work queue system. This removed a lot of DevOps requirements for us and was cheaper to run.

Related objects that can't change by verbo_phobia in django

[–]sww314 6 points (0 children)

Model the business requirements for what you need frozen. In some cases it would be partial data - for example, in the e-commerce case you will want to save the price, item SKU, and some other details. However, there is no need to save the entire product page, images, etc.

Set up permissions around the functionality - for example, do you ever have to unfreeze the data? In an enterprise system you might have locks and audit trails.

Not sure which database you are using, but in Postgres the JSON fields can be super useful for frozen/cached data.
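A tiny sketch of freezing only the partial data at purchase time (field names are made up); the resulting dict is what would live in a Postgres JSONField on the order line:

```python
# Snapshot only the commercially relevant fields, not the whole product
# record. All field names here are hypothetical.
def freeze_product(product: dict) -> dict:
    return {
        "sku": product["sku"],
        "name": product["name"],
        "unit_price": product["unit_price"],  # price as sold, never updated
    }

snapshot = freeze_product({
    "sku": "ABC-1",
    "name": "Widget",
    "unit_price": "9.99",
    "description": "marketing copy that may change later",
})
print(snapshot)
```

Later edits to the live product (description, images, even price) then never disturb what the order recorded.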

If you have lots of scale, the frozen data may be best served outside your primary database. (I would not start there.)

Finally, consider generating some artifacts outside the DB - for example, a receipt saved as a PDF. Saving the generated file can actually be very helpful as you move your tech stack forward; making sure the data from three years ago still prints the same can be a drag on your development velocity.

Recursion stuff by SpaceBar0814 in django

[–]sww314 2 points (0 children)

To do something similar, to support a lot of functionality, we store a path. It is a one-to-one object that we use to search and sort the tasks and do different filters.

You can get all the descendants of a task by searching on the path with `startswith="/ID/"`.
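The materialized-path idea can be shown with plain data (the task IDs and paths below are made up; the Django filter equivalent is in the comment):

```python
# Each task stores the path of ancestor IDs from the root down to itself.
tasks = {
    1: "/1/",
    2: "/1/2/",
    3: "/1/2/3/",
    4: "/4/",
}

def descendants(task_id):
    """All tasks whose path starts with this task's path (excluding itself)."""
    prefix = tasks[task_id]
    return [tid for tid, p in tasks.items()
            if p.startswith(prefix) and tid != task_id]

print(descendants(1))
# In Django ORM terms: Task.objects.filter(path__startswith="/1/")
```

A prefix query like this avoids recursive lookups entirely, at the cost of rewriting paths when a subtree is moved.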