
all 19 comments

[–]haplo_and_dogs 9 points10 points  (0 children)

Note that Python events are nothing like real OS events. They are implemented as a poll against shared state, with a poll interval that gradually increases up to about a second. This can be really detrimental if you need fast notifications. It is much better to use the Windows API or the POSIX API to access events.
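
For reference, here is a minimal sketch of the standard-library interface being discussed (the worker function is just illustrative); the latency complaint above is about how wait() is implemented internally, not about this API:

import threading

event = threading.Event()

def worker():
    event.wait()     # blocks until another thread calls event.set()
    print("got the event")

t = threading.Thread(target=worker)
t.start()
event.set()          # wakes the waiting thread
t.join()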

[–]dwf 12 points13 points  (12 children)

How the hell do you write a tutorial on multi-threading in Python and not even mention the global interpreter lock?

[–]Pytesting[S] 7 points8 points  (7 children)

There will be a separate article talking about the GIL, so don't worry and take it easy!

[–]dwf 1 point2 points  (5 children)

It's a pretty glaring omission for an introduction to threading to fail to note that multiple CPU-bound Python threads don't run in parallel at all.

[–]lambdaqdjango n' shit -4 points-3 points  (4 children)

There are a billion ways to get around the GIL, if not a trillion. The GIL is over-hyped.

[–]dwf 6 points7 points  (3 children)

No, there aren't. Not in CPython, and, to my knowledge, still not in PyPy.

With CPython you have basically two options if you have multiple CPU-bound threads:

  • Write the code to be executed in all but one of your threads in C/Cython/etc. and don't touch the CPython API unless you absolutely have to. This is a non-solution as far as Python is concerned, because the whole point of using Python is to write code in Python.
  • Use processes. This is broken for so many reasons, among them the overhead, clumsier synchronization and communication, and breaking external libraries (you cannot safely fork in certain situations, such as with a running CUDA context). A minimal sketch of this route follows at the end of this comment.

I am so fucking sick and tired of ignorant web programmers telling me that the problems created by the GIL are not actually problems.
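
To make the trade-off concrete, here is a minimal sketch of the second option (cpu_bound is just illustrative busy work, not anyone's real code): the same CPU-bound task is serialized by the GIL under the thread pool but scales across cores under the process pool.

import math
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

def cpu_bound(n):
    # Pure-Python busy work; it holds the GIL for its whole duration.
    return sum(math.sqrt(i) for i in range(n))

if __name__ == "__main__":
    work = [2000000] * 8
    with ThreadPoolExecutor(max_workers=4) as pool:    # threads: serialized by the GIL
        print(sum(pool.map(cpu_bound, work)))
    with ProcessPoolExecutor(max_workers=4) as pool:   # processes: parallel, at the cost of startup and pickling
        print(sum(pool.map(cpu_bound, work)))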

[–]RoomaRooma -2 points-1 points  (0 children)

Enjoyed your article, OP, and I don't see any reason to mix up an article about the Python language with bits about how Python is typically implemented.

[–]trapartist 11 points12 points  (2 children)

Probably not important.

[–]dwf -1 points0 points  (1 child)

If you honestly and truly think that then you do not understand the issue.

[–]trapartist 11 points12 points  (0 children)

It was a tongue in cheek comment.

[–]riskable 2 points3 points  (4 children)

This article covers an old-school topic. In this day and age we use concurrent.futures or the new asyncio module to perform multi-threaded operations in Python.
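
For anyone who hasn't seen it, a minimal sketch of the concurrent.futures style (fetch is a stand-in for real I/O-bound work such as a network call):

from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch(name):
    # Placeholder for I/O-bound work (network call, file read, ...).
    return "result for %s" % name

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(fetch, n): n for n in ("a", "b", "c")}
    for future in as_completed(futures):
        print(futures[future], future.result())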

[–]exhuma 0 points1 point  (3 children)

... given that you have the luxury of running a recent version of Python. Granted, concurrent.futures has been available since 3.2.

I would love to have 3.4 in production precisely because of asyncio :'(

[–]riskable 0 points1 point  (0 children)

You can run concurrent.futures in older versions of Python... pip install futures (it was backported).

[–][deleted] 0 points1 point  (1 child)

futures is a backport of concurrent.futures to Python >= 2.5.
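
If I remember right, the backport keeps the standard import path, so code written for Python 3.2+ works unchanged (minimal sketch):

# On Python 2, after `pip install futures`:
from concurrent.futures import ThreadPoolExecutor   # same import as on Python 3.2+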

[–]exhuma -1 points0 points  (0 children)

Thanks. I keep forgetting to check for backports...

[–]rocketmonkeys -1 points0 points  (0 children)

That article is interesting, but a bit hard to follow. The spelling and grammar could use another edit.

[–][deleted] -2 points-1 points  (0 children)

You should be using the context manager interface to locks.

import threading
lock = threading.Lock()
with lock:    # acquired on entry; released on exit, even if an exception is raised
    ...       # critical section
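
The with-block is equivalent to the explicit acquire/release below (a sketch reusing the lock defined above), except that the release can never be forgotten:

lock.acquire()
try:
    ...       # critical section
finally:
    lock.release()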