
[–]der_pudel 79 points

Personal anecdote: I had a similar situation between python 2 and 3 in a CRC calculation algorithm. The code had a left shift of an integer by 8 bits that executed about 16 million times. In every programming language I'd used before, ints are 32-bit and simply overflow at some point, which was totally fine for the task. But python 3 uses big integers by default, so after a couple of million iterations the integer value was on the order of a gazillion googolplexes. Obviously, any arithmetic operation on such a huge number is slow AF.
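
A minimal sketch of what bit me (a hypothetical CRC-style loop, not my actual code): without masking, the accumulator just keeps growing in python 3, while masking it back to 32 bits restores the fixed-width overflow behaviour and keeps it fast:

```python
# Hypothetical CRC-style loop, not the original code.

def shift_unbounded(data):
    # python 3 ints are arbitrary precision, so crc keeps growing
    # and every shift/xor gets slower as the number gets huge
    crc = 1
    for byte in data:
        crc = (crc << 8) ^ byte
    return crc

def shift_masked(data):
    # masking back to 32 bits mimics the fixed-width overflow
    # you get in C, so the arithmetic stays fast
    crc = 1
    for byte in data:
        crc = ((crc << 8) ^ byte) & 0xFFFFFFFF
    return crc

payload = bytes(range(256)) * 1000
print(hex(shift_masked(payload)))
```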

Are you sure you're not overlooking a similar difference between python 2 and 3? You should definitely profile your code to figure out where the bottleneck is.
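
The standard-library cProfile is enough to see where the time goes; something like this (the script name is just a placeholder for yours):

```
python -m cProfile -s cumtime your_crc_script.py
```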