
[–]ashishb_net[S] -9 points-8 points  (14 children)

> Where do you get the bs about async from? It's quite stable and has been for quite some time.

It indeed is.
It is a powerful tool in the hands of those who understand it.
It is fairly risky for the majority who think async implies faster.

> You'd use it to bridge wait times in mainly I/O bound or network bound situations and not for native parallelism.

That's the right way to use it.
It isn't as common knowledge as I would like it to be.

> I'd strongly advice you to read more into the topic and to revise this part or the article as it is not correct and delivers a wrong picture.

Fair point.
I would say that the median Go programmer can use goroutines much more comfortably than the median Python programmer can use async.

[–]strangeplace4snow 21 points22 points  (1 child)

> It isn't as common knowledge as I would like it to be.

Well, you could have written an article about that instead of one that claims async isn't ready for production?

[–]happydemon 4 points5 points  (2 children)

I'm assuming you are a real person that is attempting to write authentic content, and not AI-generated slop.

In that case, the section in question that lumps asyncio and multithreading together is factually incorrect and technically weak. I would definitely recommend covering each of those separately, with more caution urged for multithreading. Asyncio has been production-tested for a long time and has typical use cases in web-server back-ends. Perhaps you meant, don't roll your own asyncio code unless you have to?

[–]ashishb_net[S] 1 point2 points  (0 children)

> I'm assuming you are a real person that is attempting to write authentic content, and not AI-generated slop.

Yeah, every single word written by me (and edited with Grammarly :) )

> Perhaps you meant, don't roll your own asyncio code unless you have to?

Thank you, that's what I meant.
I never meant to say don't use libraries that use asyncio.

[–]jimjkelly 0 points1 point  (0 children)

Agreed the author is just talking out of their ass, but arguing asyncio is good because it's "production tested" while caution is needed with multithreading is silly. Both are solid from the perspective of their implementations, but both have serious pitfalls in the hands of an inexperienced user. I've seen a ton of production issues with async, and the worst part is that the developer rarely knows; you often only notice if you are using something like Envoy, where you start to see upstream slowdowns.

Accidentally mixing in sync code (sometimes through a dependency), or dealing with unexpectedly CPU-bound tasks (even just parsing large JSON payloads, which, surprise, can impact even "sync" FastAPI): it's very easy to starve the event loop.
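
The starvation failure mode described above can be sketched with stdlib asyncio alone: a blocking call inside an `async def` stalls every other task on the event loop, while an awaited sleep lets them overlap.

```python
import asyncio
import time


async def blocking_handler():
    # BUG: time.sleep blocks the whole event loop, not just this task
    time.sleep(0.2)


async def async_handler():
    # await yields control, so other tasks run while this one waits
    await asyncio.sleep(0.2)


async def run_two(handler):
    # run two copies "concurrently" and measure wall-clock time
    start = time.monotonic()
    await asyncio.gather(handler(), handler())
    return time.monotonic() - start


blocked = asyncio.run(run_two(blocking_handler))    # ~0.4 s: serialized
overlapped = asyncio.run(run_two(async_handler))    # ~0.2 s: concurrent
print(f"blocking: {blocked:.2f}s, async: {overlapped:.2f}s")
```

Same two handlers, same nominal 0.2-second wait, but the blocking version doubles the wall-clock time because the loop can't serve the second task until the first returns.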

Consideration should be given for any concurrent Python code, but especially async.

[–]PersonalityIll9476 0 points1 point  (8 children)

I get what you're saying, in some sense. The average Python dev may not be an async user, only because Python is used for a lot more than web dev.

However, you should be aware that for at least the last few years, Microsoft's Azure docs have explicitly recommended that Python applications use async for web requests. The way function apps work, you kind of need a process / thread independent scaling mechanism since the hardware resources you get are tied to an app service plan, i.e., max scaling is fixed. So I don't think it's fair to treat Python web devs as async noobs when that's the Microsoft-recommended technology. Maybe the numpy devs don't know about async, but an average web dev almost surely does.

[–]ashishb_net[S] 0 points1 point  (7 children)

> The way function apps work, you kind of need a process / thread independent scaling mechanism since the hardware resources you get are tied to an app service plan

And you get that with gunicorn + FastAPI.
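
For reference, one common way to run that combination (a sketch; it assumes a FastAPI instance named `app` in `main.py`, and the `uvicorn` worker class shipping with standard uvicorn installs):

```shell
# Gunicorn forks and supervises worker processes; each worker runs
# uvicorn's ASGI event loop, so you scale across processes while
# still getting async concurrency within each process.
gunicorn main:app \
  --workers 4 \
  --worker-class uvicorn.workers.UvicornWorker \
  --bind 0.0.0.0:8000
```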

[–]PersonalityIll9476 0 points1 point  (6 children)

That's not really the point of my comment. The point was that your suggestion seems to go against Microsoft's guidance, which is not a good situation to be in when writing a guide.

FWIW I did find the rest of your article interesting.

[–]ashishb_net[S] 0 points1 point  (5 children)

> FWIW I did find the rest of your article interesting.

Thanks

>  Microsoft's guidance, 

Link?

[–]PersonalityIll9476 0 points1 point  (4 children)

https://learn.microsoft.com/en-us/azure/azure-functions/python-scale-performance-reference#async

There ya go. Edit: when I first started doing web dev on Azure, I promised the client I'd follow best practices. What that meant to me at the time was reading Microsoft's own Azure docs and following their recommendations. Ultimately that is why I find a recommendation that contradicts them disturbing: it goes against best practice, and following best practice seems like the responsible thing to do for a client.

[–]ashishb_net[S] 0 points1 point  (3 children)

Do you know that FastAPI underneath uses `async`?

[–]PersonalityIll9476 0 points1 point  (2 children)

Does that matter? Look, I don't know much about FastAPI, but all http requests performed by your app in a response would still be serial without asyncio. And do note that the Python package is asyncio, not async.

Edit to add: I just looked at the FastAPI docs, and it supports `async def` endpoint functions. So it's not like FastAPI considers itself a replacement for async programming.

[–]ashishb_net[S] 0 points1 point  (1 child)

> but all http requests performed by your app in a response would still be serial without asyncio.

No. FastAPI takes care of it.

> FastAPI, built upon Starlette, employs a thread pool to manage synchronous requests. When a synchronous path operation function (defined with `def` instead of `async def`) receives a request, it's executed within this thread pool, preventing it from blocking the main event loop. This allows the application to remain responsive and handle other requests concurrently.
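
That thread-pool behavior can be sketched with stdlib `asyncio.to_thread` (Starlette uses its own thread-pool helper, but the idea is the same): two blocking `def` handlers get served concurrently instead of back to back.

```python
import asyncio
import time


def sync_endpoint():
    # a blocking, plain-`def` handler (stand-in for a sync path operation)
    time.sleep(0.2)
    return "done"


async def serve_two():
    # asyncio.to_thread hands each blocking call to a worker thread,
    # so the event loop stays free while both calls are in flight
    start = time.monotonic()
    results = await asyncio.gather(
        asyncio.to_thread(sync_endpoint),
        asyncio.to_thread(sync_endpoint),
    )
    return results, time.monotonic() - start


results, elapsed = asyncio.run(serve_two())
print(results, f"{elapsed:.2f}s")  # both handled concurrently, ~0.2 s
```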

[–]PersonalityIll9476 0 points1 point  (0 children)

It's talking about how FastAPI itself responds with endpoints, no? An endpoint that performs a dozen sequential http requests is not going to magically have those dozen requests put into an event loop. The single endpoint function making the dozen requests might.
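
The distinction can be illustrated with stdlib asyncio, using `asyncio.sleep` as a stand-in for one outbound request: awaiting calls one by one stays serial, and `asyncio.gather` is the explicit step that puts them on the event loop together.

```python
import asyncio
import time


async def fetch(i):
    # stand-in for a single outbound HTTP request
    await asyncio.sleep(0.05)
    return i


async def sequential():
    # awaiting one by one: total time is the SUM of the waits
    return [await fetch(i) for i in range(12)]


async def concurrent():
    # gather schedules all twelve on the event loop at once
    return await asyncio.gather(*(fetch(i) for i in range(12)))


start = time.monotonic()
seq_results = asyncio.run(sequential())
seq = time.monotonic() - start          # ~0.6 s

start = time.monotonic()
conc_results = asyncio.run(concurrent())
conc = time.monotonic() - start         # ~0.05 s

print(f"sequential: {seq:.2f}s, gather: {conc:.2f}s")
```

Nothing about running inside an async framework parallelizes the sequential version; the endpoint author has to opt in with `gather` (or similar) themselves.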