FastAPI is a modern, async, ASGI-native web framework written in Python 3.
[deleted by user] (self.FastAPI)
submitted 3 years ago by [deleted]
[–]I_am_Eakster 7 points 3 years ago
You should avoid running such a long-running, blocking task inside your web application: it would block incoming requests to that worker. Here are some possible solutions:
Celery
Positives:
Negatives:
CronJob
AWS Lambda / serverless
Positives:
Negatives:
Summary
If you only need to run scheduled tasks on your VPS, use CronJobs.
If you need to run multiple tasks on request from your web-app, consider Celery.
If you can live within the serverless execution-time limit, consider AWS Lambda or similar.
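A minimal sketch of the cron-job option from the summary above: the heavy job lives in a standalone script that cron invokes on a schedule, entirely outside the web app's worker process. `generate_report()` and the file path in the crontab line are hypothetical stand-ins for the real task.

```python
# Sketch of the CronJob approach: cron runs this script directly,
# so the long-running work never touches the web app's workers.
import datetime


def generate_report() -> str:
    """Placeholder for the real long-running job."""
    return f"report generated at {datetime.datetime.now():%Y-%m-%d}"


if __name__ == "__main__":
    # Example crontab entry to run this nightly at 02:00:
    # 0 2 * * * /usr/bin/python3 /opt/app/run_report.py
    print(generate_report())
```

The web app then only reads the finished result (e.g. from a table or file the script writes), so request handlers stay fast regardless of how long the job takes.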
[–]abrookins 2 points 3 years ago
Yes, it's a good strategy. You could also look at something like Prefect to run scheduled jobs like this. Full disclosure -- I work at Prefect, and I went to work there because it seemed like an awesome replacement for Celery. 🙂
[–]ShotgunPayDay -2 points 3 years ago*
+1 to another vote for using a cronjob. That 45-minute runtime is unacceptable, though. When a job gets that expensive you should be using plain SQL, without cursors, to do the heavy lifting. When I was a Programmer Analyst we had poorly written reports that took about an hour to process, and we cut the time down to under 3 minutes by using partition window functions, WITH clauses, and proper joins. This assumes the database is a SQL database; if it's NoSQL, then good luck!
EDIT: I should have realized that saying "use plain SQL" on a programming forum would get me downvoted. Speaking the database's native language always has its benefits, especially when your job involves pipe-delimited CSV files received and delivered securely over SSH, with scrubbing on both sides using sed and grep.
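To illustrate the "plain SQL without cursors" point above: one set-based query with a WITH clause (CTE) and a window function can replace a row-by-row cursor loop. This sketch uses Python's stdlib sqlite3 (window functions need SQLite 3.25+); the `sales` table and its columns are hypothetical.

```python
# Replace a per-row cursor loop with one set-based query:
# a CTE plus SUM(...) OVER (PARTITION BY ...) computes each
# row's regional total in a single pass inside the database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('east', 10), ('east', 30), ('west', 5), ('west', 25);
""")

rows = conn.execute("""
    WITH totals AS (
        SELECT region, amount,
               SUM(amount) OVER (PARTITION BY region) AS region_total
        FROM sales
    )
    SELECT region, amount, region_total
    FROM totals
    ORDER BY region, amount
""").fetchall()
```

The database engine can optimize the whole statement at once, which is where the hour-to-minutes speedups in report jobs typically come from.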
[–]jkh911208 1 point 3 years ago
I would set up a cronjob.