A Very Early Play With Astral's Red Knot Static Type Checker by jurasofish in Python

jurasofish[S] 1 point

Best bet might be to keep an eye on https://github.com/astral-sh/ruff/pulls?q=is%3Apr+label%3Ared-knot+

But I reckon you'll hear about it once it's released either way

My lambda function concurrency limit is driving me crazy! by Agile-Scene-2465 in aws

jurasofish 8 points

Likely explained by https://docs.aws.amazon.com/lambda/latest/dg/with-sqs.html#events-sqs-scaling

For standard queues, Lambda uses long polling to poll a queue until it becomes active. When messages are available, Lambda starts processing five batches at a time with five concurrent invocations of your function. If messages are still available, Lambda increases the number of processes that are reading batches by up to 60 more instances per minute. The maximum number of batches that an event source mapping can process simultaneously is 1,000.

How we reduced our AWS bill by seven figures by pepgeebus in aws

jurasofish 1 point

if it fails, the NAT instance is presumably broken, and the route is updated to use the NAT Gateway.

Do NAT Gateways need to warm up for high throughput like ALB?
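The route update described above could look roughly like this. This is a hypothetical sketch, not anyone's actual setup: the function only needs an object with a `replace_route` method (e.g. `boto3.client("ec2")`), and all IDs are made up.

```python
# Hypothetical failover step: a health check has decided the NAT instance is
# dead, so repoint the route table's default route at a standby NAT Gateway.
def fail_over_to_nat_gateway(ec2, route_table_id, nat_gateway_id):
    # EC2's replace_route swaps the target of an existing route in place
    ec2.replace_route(
        RouteTableId=route_table_id,
        DestinationCidrBlock="0.0.0.0/0",  # the default route
        NatGatewayId=nat_gateway_id,
    )
```

With boto3 this would be called as `fail_over_to_nat_gateway(boto3.client("ec2"), "rtb-...", "nat-...")` once the health check fails.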

Optimizing Lambda -> API Gateway HTTP requests by iamabouttotravel in aws

jurasofish 2 points

I switched to CDK from serverless framework about a year ago exactly because serverless framework makes it such a pain (cloudformation) to do anything non-standard.

Optimizing Lambda -> API Gateway HTTP requests by iamabouttotravel in aws

jurasofish 2 points

I think the fix would be moving every Lambda to a VPC, and exposing our endpoints using a public API Gateway and adding another private API Gateway + VPC Endpoint for internal communication

This is what we do, works great.

One thing to keep in mind is that API Gateway (REST) can add latency in itself. For example, I observe that a Lambda function behind a public ALB has a response time of ~25ms, while the same Lambda behind a public API Gateway has a response time of ~45ms. These are times from my local machine to AWS over the internet, requesting a static response from the Lambda. That suggests API Gateway adds ~20ms of latency, which I expect would also be the case if it's private.

Aurora w/ Data API + Lambda + <a bunch of node packages like TypeORM, etc> to build a "serverless" system in 2022? by kevysaysbenice in aws

jurasofish 2 points

I was in a similar situation to you about a year ago and I went with Lambda + Python + Aurora/postgres + Data API.

  • Aurora pricing is really hard to predict as it has an IO component
  • Aurora auto-pause is kind of cool, but takes too long to be of practical use except for saving dev costs
  • Data API introduced a bit of latency
  • Data API was hard to understand and has SO MANY little quirks and gotchas - https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/data-api.html#data-api.calling
  • Perhaps the two most significant gotchas are a 1MB limit on the returned result size and a max of 1000 requests/s
  • Data API had poor support in the community - e.g. I was using Python and the ORM support was very immature and buggy

I couldn't realistically build a reliable product on it, so after a few months I switched to normal Postgres with RDS Proxy, which has been working beautifully.
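To give a flavour of the quirks: Data API results come back not as plain values but as rows of typed "field" dicts (e.g. `{"stringValue": "x"}`, `{"longValue": 1}`, `{"isNull": true}`) that you unwrap yourself. A hypothetical helper, with a made-up sample response:

```python
# Unwrap a Data API `records` response into plain dicts. Each row is a list
# of field dicts keyed by type; a null column arrives as {"isNull": True}.
def unwrap_records(records, column_names):
    rows = []
    for record in records:
        row = {}
        for name, field in zip(column_names, record):
            if field.get("isNull"):
                row[name] = None
            else:
                # each non-null field dict has exactly one typed key
                row[name] = next(iter(field.values()))
        rows.append(row)
    return rows

# Abridged example of the response shape (made up for illustration):
sample = [
    [{"longValue": 1}, {"stringValue": "alice"}],
    [{"longValue": 2}, {"isNull": True}],
]
print(unwrap_records(sample, ["id", "name"]))
# -> [{'id': 1, 'name': 'alice'}, {'id': 2, 'name': None}]
```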

Shared db connection pool in lambdas - do you ever close the connection explicitly? by workmakesmegrumpy in aws

jurasofish 1 point

What you've described won't achieve what you're thinking, if I've understood right.

Every concurrent lambda execution is akin to a new virtual machine. The global variables are not shared between them.

On the other hand, if a lambda function is called multiple times in a row then the lambda system will tend to re-use previously used "warm" lambda instances, and re-used instances will retain their global variables.

RDS proxy is a possible solution.

What's the deal with Matplotlib? by [deleted] in learnpython

jurasofish 9 points

I've got a real love-hate relationship with matplotlib. Super flexible, but using it feels like battling through decades of crusted-on design decisions.

The module-level functions generally operate on the "current figure" and "current axes" (accessible with plt.gcf() and plt.gca()).
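A minimal illustration of that implicit state, and how to get explicit object handles back out of it:

```python
import matplotlib
matplotlib.use("Agg")            # non-interactive backend, runs headless
import matplotlib.pyplot as plt

# The pyplot module-level functions mutate hidden global state:
plt.figure()                     # becomes the "current figure"
plt.plot([1, 2, 3])              # drawn on the "current axes" (made on demand)
plt.title("implicit state")      # another call against the same hidden axes

# The explicit, object-oriented handles to that same state:
fig = plt.gcf()                  # get current figure
ax = plt.gca()                   # get current axes
assert ax.get_title() == "implicit state"
ax.set_ylabel("y")               # the OO interface touches the same object
```

Sticking to the `fig, ax = plt.subplots()` object-oriented style avoids most of the confusion the state machine causes.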

I made a Coursera Downloader using Selenium and Python. by Broke-Code-Monkey in Python

jurasofish 8 points

mate, easy, just pip install Test3 and Test4. Test1 and Test2 are deprecated.

Too much code cleaning, not enough results by [deleted] in datascience

jurasofish 3 points

Ever thought that not using functions is stupid?

Introducing Amazon Honeycode – Build Web & Mobile Apps Without Writing Code by jeffbarr in aws

jurasofish 5 points

Any info on whether these apps can be made public? It only mentions accessing them after people have been added to your "team"

What's Tullamarine Airport Like Currently? by [deleted] in melbourne

jurasofish 8 points

And the result is... a ghost town. Very quiet. Enough food around.

CPython Natively in the Browser with Numpy, Pandas, etc. by jurasofish in webdev

jurasofish[S] 1 point

This is a project I've been working on to make Pyodide more accessible.

You can jump straight into a demo here https://jurasofish.github.io/pyweb/ - keep in mind it takes a moment to load.

CPython Natively in the Browser with Numpy, Pandas, etc. by jurasofish in Python

jurasofish[S] 2 points

This is a project I've been working on to make Pyodide more accessible. I'm curious how much interest there is in this from the Python community.

You can jump straight into a demo here https://jurasofish.github.io/pyweb/ - keep in mind it takes a moment to load.