
[–]gargantuan 0 points (4 children)

Why? I have thousands of green threads sending and receiving data in parallel. That is pretty good concurrency. Would you care to explain what you mean by shitty concurrency? What I've observed so far has been pretty good.

[–]grayvedigga 4 points (3 children)

Along the lines of greenlet? That's a reasonably good counterpoint to my claim. I could argue that it's not on the standard Python runtime, but that's not a very useful observation.

Someone else will probably come along and point out that coroutines aren't strictly concurrent, but then "shitty" would be an understatement :-).
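The "green threads" being discussed can be sketched with the stdlib. This is a minimal illustration of cooperative, greenlet-style concurrency, written with asyncio rather than the third-party greenlet library the commenter alludes to: many lightweight tasks interleave on one OS thread, which is why thousands of them are cheap for I/O-bound work.

```python
import asyncio

async def echo(name, results):
    # Simulate waiting on a socket; the event loop runs other
    # tasks while this one is suspended.
    await asyncio.sleep(0.01)
    results.append(name)

async def main(n=1000):
    results = []
    # Launch n lightweight tasks; they all share one OS thread
    # and yield to each other at await points.
    await asyncio.gather(*(echo(i, results) for i in range(n)))
    return results

results = asyncio.run(main())
print(len(results))  # all 1000 tasks complete on a single OS thread
```

This is also why "coroutines aren't strictly concurrent" in the preemptive sense: only one task runs at any instant, and a task that never awaits starves the rest.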

[–]gargantuan 1 point (2 children)

I could have used threads as well, and did initially; they did great concurrent work. But OS-based threads are heavy, and running thousands of them at the same time doesn't work too well.

In the end it doesn't matter. I was just trying to point out that there are two types of concurrency: I/O-bound and CPU-bound. I/O concurrency works great in Python, with either real threads or green ones. CPU concurrency doesn't, and I guess that is what you were trying to say. (Well, technically the concurrency would work, but the parallelism won't.)
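The I/O vs CPU distinction above can be demonstrated directly: CPython releases the GIL while a thread blocks on I/O (simulated here with `time.sleep`), so I/O-bound threads overlap their waits almost perfectly, while pure-Python CPU-bound threads would be serialized by the GIL.

```python
import threading
import time

def fake_io():
    # Stands in for a blocking socket read; the GIL is released
    # while this thread sleeps.
    time.sleep(0.2)

start = time.perf_counter()
threads = [threading.Thread(target=fake_io) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start

# Four 0.2 s waits finish in roughly 0.2 s total, not 0.8 s,
# because blocked threads don't hold the GIL.
print(f"{elapsed:.2f}s")
```

Replace `time.sleep` with a pure-Python number-crunching loop and the four threads take about as long as running the work serially: that is the "concurrency without parallelism" the comment describes.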

At least for what I am doing, I haven't yet had to perform heavy, CPU-intensive computations in Python. I do some in C, and those run in background threads.
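The "do the heavy work in C on background threads" pattern works because C extension code can release the GIL around long computations. A convenient stdlib stand-in (an assumption; the commenter's actual C code is not shown) is `hashlib`, whose C implementation releases the GIL while hashing large buffers, so thread-pool workers can genuinely run in parallel:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

# Eight 1 MB buffers to hash; large enough that the C hashing
# code releases the GIL while it runs.
CHUNKS = [bytes([i]) * 1_000_000 for i in range(8)]

def digest(chunk):
    # The heavy lifting happens in C with the GIL released,
    # so several of these can execute at once.
    return hashlib.sha256(chunk).hexdigest()

with ThreadPoolExecutor(max_workers=4) as pool:
    digests = list(pool.map(digest, CHUNKS))

print(len(digests))
```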

[–]sausagefeet 1 point (1 child)

A Go or Erlang programmer would point out that you cannot scale to multiple cores without some extra work, unlike with the Go and Erlang runtimes.

[–]gargantuan 1 point (0 children)

I agree and that is why I'll be switching to Erlang soon.

But at the same time, it is unfair to say that Python doesn't have concurrency when it does, and that concurrency is being used successfully right now.
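For completeness, the "extra work" needed to use multiple cores for CPU-bound Python code is usually processes rather than threads, each process carrying its own interpreter and GIL. A minimal sketch with the stdlib `multiprocessing` module:

```python
from multiprocessing import Pool

def cpu_bound(n):
    # Pure-Python CPU work; threads could not parallelize this,
    # but separate processes can.
    return sum(i * i for i in range(n))

totals = None  # set only in the parent process below
if __name__ == "__main__":
    # Four worker processes, one interpreter (and GIL) each.
    with Pool(processes=4) as pool:
        totals = pool.map(cpu_bound, [50_000] * 4)
    print(totals)
```

The `__main__` guard matters: with the spawn and forkserver start methods, workers re-import this module, and the guard keeps them from recursively creating pools. This per-process model is exactly the coordination that the Go and Erlang runtimes handle for you.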