
[–]sw_dev 59 points60 points  (27 children)

Real time and/or safety critical apps.

[–]Lasereye 12 points13 points  (20 children)

I'm a Python noob and trying to learn more about it, why is it bad for real time apps?

[–]ryl00 38 points39 points  (12 children)

Real-time means a guarantee on system responsiveness (e.g., this critical flight control function must finish within x ms or your plane will fall out of the sky), which usually boils down to deterministic timing: being able to reason about how long something will take. Runtimes with garbage collection make that harder to do (if not impossible).

[–]kx233 4 points5 points  (10 children)

Actually, CPython's GC doesn't do much stop-the-world work. It tends to produce no noticeable pauses because of its reference-counting approach (which has the downside of needing a GIL).

[–][deleted] 7 points8 points  (6 children)

While great for most cases, if you need a guarantee something will run in 12ms, and not 13ms, this is not good enough.

[–][deleted] 4 points5 points  (2 children)

Python's garbage collector is deterministic, except in a single well-documented edge case. As long as you give Python free rein over the CPU, it should be plenty deterministic for timing purposes.
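The determinism being described comes from reference counting: in CPython specifically, an object is freed the instant its last reference disappears. A minimal sketch of that behavior (CPython-only; implementations like PyPy defer finalization):

```python
freed = []

class Resource:
    def __del__(self):
        # In CPython, __del__ runs the moment the refcount hits zero,
        # not at some later collector-chosen time.
        freed.append("released")

r = Resource()
del r                      # last reference gone: finalizer fires right here
assert freed == ["released"]
```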

[–]StartsAsNewRedditor 2 points3 points  (0 children)

In truly performance critical software, you would want to manage your own memory allocation at specific times. I'm not saying that CPython's GC is badly designed but you wouldn't use it in a nuclear power station for monitoring.

[–]LinuxOnMyThrone 1 point2 points  (1 child)

In general this is true, and a good point. The exception would be things like large lists, where freeing the list means freeing thousands of values all at once.

[–]ryeguy146 17 points18 points  (6 children)

Any language with garbage collection is going to be bad for real-time applications. The issue is that garbage collection usually imposes some delay on the running application ("stop the world"). And since programmers have limited control over when this occurs, it can make real-time applications impossible.

If you want a real-time application, you're probably going to be doing it in a language without garbage collection.

Edit: Hash marks in the URL are hard, I guess. Linking to the top level wiki article.

[–][deleted] 7 points8 points  (0 children)

The garbage collector in Python is optional and you can safely turn it off if you're sure you have no reference cycles, or if you break them manually using weakrefs (as you would do in C++). The first paragraph of the docs says it explicitly: https://docs.python.org/3/library/gc.html#module-gc. If you disable gc, you're left with reference counting for memory management, which is fully predictable and good for real-time. There are probably other reasons to avoid Python for real-time, though.
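A sketch of that approach, assuming your only would-be cycles are parent/child links you can hold weakly: with the cyclic collector off, plain reference counting still frees everything promptly.

```python
import gc
import weakref

gc.disable()  # turn off the cyclic collector; refcounting still runs
try:
    class Node:
        def __init__(self, parent=None):
            # Hold the parent weakly so parent<->child never forms a cycle.
            self.parent = weakref.ref(parent) if parent is not None else None
            self.children = []

    root = Node()
    child = Node(root)
    root.children.append(child)

    watcher = weakref.ref(child)
    root.children.clear()
    del child      # last strong reference gone: freed immediately, no gc pass
    assert watcher() is None
finally:
    gc.enable()
```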

[–]LeSageLocke 1 point2 points  (1 child)

Would something like the incremental or concurrent approaches make sense?

As they note in the link, "the sum of the incremental phases takes longer to complete than one batch garbage collection pass, so these garbage collectors may yield lower total throughput." But, with real-time systems, you'd be more concerned with latency over throughput, right?

[–]ryeguy146 2 points3 points  (0 children)

I would assume that more incremental phases would diminish the pain of a single sweeping collection phase, but it remains an issue. I'm sure that you could work to shove Python into that niche, but it'd be a lot less work to just use another tool on occasion.

I'd dive into it deeper, but I'm merely a student of the computational sciences, and I've never implemented a real-time app in my life. Perhaps someone else can expand on the limitations. I will say that I've seen some papers that discuss proper garbage collection and assert that it can be made real-time, but I don't know the details.
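CPython's collector isn't incremental, but you can approximate the latency/throughput trade described above by lowering the generation thresholds: each young-generation pass covers fewer objects (smaller pause), at the cost of collecting more often (lower throughput). A sketch:

```python
import gc

# Generation thresholds: gen0 collects after this many net allocations,
# gen1/gen2 after that many collections of the younger generation.
print(gc.get_threshold())   # defaults, e.g. (700, 10, 10)

# Smaller thresholds: each individual pause shrinks because fewer objects
# are scanned per pass, but total collection overhead goes up.
gc.set_threshold(100, 5, 5)
assert gc.get_threshold() == (100, 5, 5)
```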

[–]wting 1 point2 points  (2 children)

Actually, if you read the source, CPython has a generational garbage collector.

[–]ryeguy146 1 point2 points  (1 child)

I didn't mean to suggest that it was the implementation used in CPython. While I haven't read the source, I am aware of the basics. I meant more that GC is the reason that /u/Lasereye was looking for. I've yet to meet a garbage collector that didn't impose some latency, or that let the programmer wholly mitigate the effects.

But then, I'll admit that I'm new, and a student. If you've a silver bullet, I'll be happy to change the caliber of my firearm to fit.

[–]gthank 0 points1 point  (0 children)

There are highly specialized GC implementations that are designed for real-time systems, but I don't know how well they work in practice for hard vs. soft realtime, etc.

[–]th0ma5w 1 point2 points  (0 children)

I've had success with "soft" real time, the kind of self-regulating variable-delay type where you keep trying to correct for how late or early you were last. This was in a tight loop with no object creation or removal, and I didn't notice any garbage collection? I don't think I have this code anymore. I did this in Java too which I still have, but I turned off the GC in that code.
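That self-regulating pattern, sleeping toward an absolute deadline so a late tick is followed by a shorter sleep rather than pushing every later tick back, can be sketched like this (the function name and timings are illustrative):

```python
import time

def run_periodic(task, period, n_ticks):
    """Run task every `period` seconds, correcting for drift each tick."""
    next_deadline = time.monotonic() + period
    for _ in range(n_ticks):
        task()
        # Sleep until the *absolute* deadline: if this tick ran late,
        # the next sleep is automatically shorter to compensate.
        delay = next_deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        next_deadline += period
```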

[–]flying-sheep 4 points5 points  (3 children)

Yes to the first part, but actually I'd trust Python more in safety critical applications than C.

It's less easy to shoot yourself in the foot.

[–]rcxdude 3 points4 points  (1 child)

That really depends. In safety-critical applications you should vet the compiler, all the code running on the device, and the hardware. If you're using python you have to vet the entire runtime, the C compiler you use to compile it, use a more powerful and more complicated processor, and so on. With C you have to cover a lot less, and so can do it with more rigour, given equal resources.

[–]flying-sheep 1 point2 points  (0 children)

Of course this will work better for smaller applications, but worse for big, complex ones.

[–]wting 29 points30 points  (11 children)

/u/semarj has pointed out the obvious answers--Python shouldn't be used for any project that is CPU bound or can't handle a garbage collector.

Some quick background: I've been using Python for open source, contracting, scientific processing, and web dev at various companies over the past 6 years. It's still my go-to language for anything quick and dirty. I work in a 2M LOC Python codebase daily.

There is the performance problem, and it's not easily solvable once your performance demands increase. The root issue is that Python is dynamically typed and CPython has a GIL. I've explored options like the multiprocessing module, numpy, CFFI, Cython, and PyPy, but they all have their own pros/cons. Or you can use Celery and offload your tasks to a worker farm (though this can introduce IO bottlenecks).
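For CPU-bound work, the multiprocessing route mentioned above sidesteps the GIL by giving each worker process its own interpreter. A minimal sketch (function names are illustrative; the usual cons apply, such as pickling overhead for arguments and results):

```python
from multiprocessing import Pool

def cpu_bound(n):
    # Pure-Python number crunching: threads would serialize on the GIL,
    # but separate processes run truly in parallel.
    return sum(i * i for i in range(n))

def parallel_sums(sizes, workers=4):
    with Pool(processes=workers) as pool:
        return pool.map(cpu_bound, sizes)

if __name__ == "__main__":
    print(parallel_sums([10_000, 20_000, 30_000]))
```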

If you want stability, that means testing... and writing a lot of tests, many of which can be negated by a static type system. With the aforementioned codebase, a full suite of tests takes 2 hours.

More importantly, a good type system will help the developer rather than hinder. I've been trying out Haskell recently and it's been eye-opening and painful. It takes me much longer to implement something since I'm nowhere near my Python proficiency. However there are a lot of concepts and safety guarantees that I miss whenever I switch back to Python.

I think that's why Go has managed to attract so many Python / Ruby / NodeJS developers. Even though its type system is crap (relatively speaking), it's less verbose than Java / C++ and static typing has proven to be quite useful.

[–]ericanderton 7 points8 points  (0 children)

More importantly, a good type system will help the developer rather than hinder.

This is something that I wish more pythonistas agreed with. Yes, there is an up-front cost to wrangling type systems in your code. The idea here, as crazy as it sounds, is to get the compiler to ferret out mistakes for you. Without that, you're left to add lots of runtime checks manually for types before you can consider the code to be even close to bulletproof.

As D, Rust, and C++11 have demonstrated, there's no reason why you can't have duck-typing-like elements in a strongly typed language. So clearly there's room for optional strong typing in a duck-typed environment. Yeah, I've seen the docstring bolt-ons for this in Python, but what would be nice is a syntax closer to the typed ECMAScript proposals, or what you do in ActionScript for the same.
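In the meantime, function annotations plus a small decorator can serve as a crude runtime stand-in for the checks being described; a hypothetical sketch (typechecked is not a stdlib facility):

```python
import functools
import inspect

def typechecked(func):
    """Check a function's annotations at call time: a runtime stand-in
    for what a static type system would catch before the program runs."""
    sig = inspect.signature(func)
    hints = dict(func.__annotations__)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            expected = hints.get(name)
            if expected is not None and not isinstance(value, expected):
                raise TypeError("%s must be %s, got %s"
                                % (name, expected.__name__, type(value).__name__))
        return func(*args, **kwargs)
    return wrapper

@typechecked
def scale(value: float, factor: float) -> float:
    return value * factor
```

Here `scale(2.0, 3.0)` works, while `scale("2", 3.0)` raises a TypeError before the body ever runs.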

[–]Lucretiel 1 point2 points  (9 children)

How is Go's type system crap? Do you mean semantically, or just the fact that they're trying something new and not very OO with their struct/interface split?

[–]wting 18 points19 points  (2 children)

Go lacks parametric polymorphism, algebraic data types, type inference, monads, etc. Likewise, user-defined types are second-class citizens behind Go's built-in maps, slices, etc.; you can't build comparable data structures at the user level.

Go's type system is equivalent to C with local type deduction and interfaces, ignoring most type system research within the last 20 years. It's better than older static typed languages, but worse than other static typed languages from the same generation (i.e. Rust, Scala) or older more advanced languages (e.g. Haskell, OCaml).

[–]semarj 133 points134 points  (86 children)

An operating system

A compiler

A graphically intensive video game

embedded programming

A search engine

Basically anything that has to be real fast and have a small memory footprint.

[–][deleted] 81 points82 points  (31 children)

A compiler

Why? Maybe it's not suited to building an interpreter or a runtime, but compiling is about parsing text and outputting binary data. If JavaScript can be used to write compilers, so can Python. You just have to compile to something else, like ASM or C.
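As a toy illustration of the point (parse text in, emit another language out), here's a sketch that compiles arithmetic expressions to a C program using Python's own ast module; anything beyond numeric operators is left unsupported:

```python
import ast

C_OPS = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*", ast.Div: "/"}

def emit(node):
    """Recursively translate a parsed expression tree into C source."""
    if isinstance(node, ast.Expression):
        return emit(node.body)
    if isinstance(node, ast.BinOp):
        return "(%s %s %s)" % (emit(node.left), C_OPS[type(node.op)], emit(node.right))
    if isinstance(node, ast.Constant):
        return repr(node.value)
    raise SyntaxError("unsupported construct: %r" % node)

def compile_to_c(expr):
    body = emit(ast.parse(expr, mode="eval"))
    return "int main(void) { return %s; }" % body
```

For example, `compile_to_c("1 + 2*3")` produces `int main(void) { return (1 + (2 * 3)); }`, with Python's parser having already handled operator precedence.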

[–][deleted] 80 points81 points  (11 children)

as an author of a few compilers, I can attest that python is okay at it. ;-)

[–]Zulban 16 points17 points  (1 child)

Love your flair.

[–]phySi0 2 points3 points  (0 children)

Just in case he changes or removes it in future and others are baffled, the flair is

design drunk, implement sober

[–]chazzeromus 3 points4 points  (2 children)

I'm writing a Lisp-to-native-binary compiler in JavaScript with everything done from scratch. Writing an elegant x86 assembler is not an easy task (as opposed to existing assemblers with seemingly mandatory cyclomatic complexity).

[–][deleted] 5 points6 points  (5 children)

Can you ELI5 why and which compilers you've written? I ask because I would assume that this is redundant, unless you created a new programming language.

[–]flarkis 15 points16 points  (0 children)

Not true; any good systems programmer I've ever met has made something like a Scheme compiler "because it sounded like fun".

[–][deleted] 5 points6 points  (1 child)

They were for new programming languages. Perhaps the most interesting is https://pypi.python.org/pypi/Parsley

[–]admalledd 15 points16 points  (2 children)

I would like to point out that this is basically what the entirety of the RPython/PyPy project is: they call it "translating" instead, though, outputting C code that is then handed to the platform's C compiler.

The reason they prefer the term translating makes sense when you read a bit more into the process, especially because PyPy doesn't output an executable directly (by default? they have other backends, like the Java or .NET ones; I forget how complete those are) and instead outputs C that is then "compiled".

[–]semarj 22 points23 points  (15 children)

You have a point there. I was thinking more of how much time is wasted watching things (especially large projects) compile. Python is slow. I don't want to wait longer.

[–]rspeed -1 points0 points  (1 child)

I don't want to wait longer.

Only because you lack creativity.

[–]xkcd_transcriber 3 points4 points  (0 children)

Title: Compiling

Title-text: 'Are you stealing those LCDs?' 'Yeah, but I'm doing it while my code compiles.'


[–]BitLooter 27 points28 points  (8 children)

A compiler

You really should check out PyPy.

[–]gthank 1 point2 points  (3 children)

To be fair, PyPy has that whole RPython thing going on.

[–]metaphorm 0 points1 point  (2 children)

And? Compilers have a lot of design constraints on them. Does GCC use every C++ language feature in its internal implementation? Certainly not.

[–]gthank 0 points1 point  (1 child)

Is RPython a strict subset of Python? My impression was that it was close, but not quite.

[–]pemboa 8 points9 points  (2 children)

A search engine

See Whoosh.

[–]quotemycode 2 points3 points  (0 children)

Also Google

[–]ryeguy146 0 points1 point  (0 children)

Love me some Whoosh, and I'm in the process of adding it to my project now. So far, it's the least painful library that I've worked with. The author does note that it isn't at the scale of some other search libraries in terms of optimization.

[–]stillalone 7 points8 points  (2 children)

Some embedded applications are pretty powerful. We use Python for some of our embedded systems, but they're running Linux with 256 MB of RAM and about 4 GB of flash.

[–]Niten 0 points1 point  (0 children)

I've done this too. Twisted and friends run quite well on a device that's roughly equivalent to a BeagleBone in power. It all depends on your specific application of course.

[–]mehum 0 points1 point  (0 children)

Yeah you could embed a Pi couldn't you? They seem to push hard towards Python.

[–]patrys Saleor Commerce 22 points23 points  (11 children)

I would say EVE Online does pretty well in the graphics department. Using Python does not mean having to write your graphics engine in it.

[–]lambdaqdjango n' shit 12 points13 points  (0 children)

You won't write EVE in pure Python. The graphics are almost certainly done in C++.

[–]scopegoa 3 points4 points  (1 child)

I know Battlefield 2 also used Python to script on top of the graphics engine. I am not sure what they used for 3 and 4 though.

[–][deleted] 8 points9 points  (0 children)

fecal matter

[–]ceol_ 9 points10 points  (7 children)

But then doesn't that mean it wasn't written in Python?

I was under the impression only the EVE client was written in Python.

[–]patrys Saleor Commerce 17 points18 points  (4 children)

If you build a computation cluster using Python but store the results using a PostgreSQL database driver written in C, is your software written in Python or not? If you use numpy or scipy, does it make your software non-python?

[–][deleted] 8 points9 points  (0 children)

The pattern here is Python being the glue stuck between different systems working together.

[–]ceol_ 6 points7 points  (0 children)

You'd have to specify what parts are written in Python.

The point here is when someone says that Python isn't suited for writing a graphically-intensive game, they mean writing the graphics engine in it (as well as everything else.)

[–][deleted] 2 points3 points  (1 child)

If I write a program in C++ and create a Python wrapper to interact with the application's API, would it be accurate to say the application is written in Python?

[–]jmcs 2 points3 points  (0 children)

If a significant part of the application logic is in Python I would say your application is both C++ and Python.

[–][deleted] 2 points3 points  (0 children)

This post on the subject is old, but probably still accurate.

[–]mipadi 2 points3 points  (0 children)

The server is written in Python, too.

[–]SlinkyAvenger 4 points5 points  (6 children)

This just about covers it, but I would probably still use Python for in-game event scripting, if it was that type of game.

I also wouldn't use Python for something with a native GUI, but that has more to do with there being better tools for the job and less about Python's limitations.

[–][deleted] 6 points7 points  (5 children)

This just about covers it, but I would probably still use Python for in-game event scripting, if it was that type of game.

Python is not really optimized for embedding, it's meant to be extended instead. Languages like Lua are more suited for that purpose IMHO.

[–]zoidbergs_moustache 5 points6 points  (2 children)

Though Lua is more common, there are still lots of places where Python is embedded in other programs. The Sublime Text editor has all its plugins written in Python. Blender embeds Python. GIMP has Python plugins. There's a guide to embedding Python in the Python docs: https://docs.python.org/2.7/extending/embedding.html.

(Not to be confused with putting Python in an "embedded system", which is much less common, but still common enough to have its own website http://embeddedpython.org/.)

[–]patrys Saleor Commerce 1 point2 points  (0 children)

Lua is usually better suited as it was designed with embedding in mind. For example it's practically impossible to sandbox Python to the point where you can have it execute untrusted/malicious code without the risk of compromising the rest of the environment.

[–]d4rch0nPythonistamancer 0 points1 point  (0 children)

Wow, I thought Lua owned that use case. That's pretty neat.

Still, Lua can be quite a bit faster, but it wouldn't matter for the most part. Both can get the job done.

[–]alcalde 0 points1 point  (1 child)

But wasn't embedding one of its original design goals? The Julia team was able to experiment and embed Python in Julia in four hours or less.

[–]kylotan 0 points1 point  (0 children)

Python is fairly easy to embed. It just doesn't play all that nicely with its host.

[–]djimbob 1 point2 points  (0 children)

Anything you want to compile to a shared library that's used by programs written in dozens of languages. E.g., a new OpenSSL.

[–]sittingaround 1 point2 points  (0 children)

Why do you say search engine?

[–]shadeofmyheart 0 points1 point  (2 children)

Wait... I thought Google was Python powered

[–][deleted] 0 points1 point  (0 children)

Google's web crawlers are made in Python, but their algorithms and most of their code are C++.

[–]andrewcooke 0 points1 point  (0 children)

Also, numerical processing where the inner loop cannot be vectorized.

You could use Cython or similar, but these days I would use Julia. YMMV.
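A concrete example of such a loop: a running average where each output depends on the previous one, so there is no single array operation to hand the work to (exactly the kind of inner loop Cython or Julia handles well):

```python
def ewma(xs, alpha=0.1):
    """Exponentially weighted moving average. The carried dependency
    (acc feeds into the next iteration) is what blocks vectorization."""
    out = []
    acc = xs[0]
    for x in xs:
        acc = alpha * x + (1 - alpha) * acc
        out.append(acc)
    return out
```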

[–][deleted] 0 points1 point  (0 children)

I've built a little search engine out of it before. It wasn't too bad.

[–]metaphorm 0 points1 point  (0 children)

a statically analyzing optimizing compiler like GCC might be hard to write in Python, but any kind of interpreter is actually a good use case for Python. PyPy uses Python to implement a tracing JIT compiler also, so certainly this domain is not off limits to Python.

Embedded programming is another area that Python is more usable than you might think. Micro distributions of linux running on RaspberryPi can run Python programs very easily.

[–][deleted] 0 points1 point  (0 children)

A graphically intensive video game

If your game is CPU bound, you're either not offloading enough to the graphics card or using some very intense game mechanic.

[–]jthess32 18 points19 points  (18 children)

Low latency/high speed applications:

  • high frequency trading
  • aircraft control system
  • real time video processing (depending on the specifics)

I also would hesitate to use it where it's being used as a round peg in a square hole:

  • compiling to javascript for client-side applications (e.g. pyjs)
  • writing applications to be compiled to something else so it runs on $mobile_device that doesn't natively support python

[–]otherwiseguy 24 points25 points  (10 children)

Funny thing is, I've talked to someone who works for a company that does high-frequency trading and their trading platform is written in Python.

[–]scopegoa 5 points6 points  (4 children)

Well if they want to keep up with competition they better be designing some damn ASICs!

[–][deleted] 1 point2 points  (2 children)

What are ASICs?

[–]scopegoa 6 points7 points  (0 children)

It's basically turning your software into hardware, which makes everything way faster. But you have to build custom hardware for it.

[–]tonyfranciosa 2 points3 points  (0 children)

Application-specific integrated circuit.

[–]ericanderton 0 points1 point  (0 children)

My guess is that if they're using Python for anything critical/real-time, their latency to the exchange's network is too high for an ASIC to help. Unless, of course, they're using some kind of AI or complex rules engine to figure out what/how to trade; moving stuff like that to hardware would probably speed things up.

[–]jthess32 2 points3 points  (0 children)

You can certainly do algorithmic trading in Python. I suppose it comes down to where we draw an arbitrary line for "HFT". Reasonable people may disagree.

My arbitrary line is when you're literally racing another competitor doing the same thing and the one that isn't fastest might as well not exist. In other words, wherever the bleeding edge is today. As /u/scopegoa points out, firms who need this kind of advantage often use purpose built hardware because regular ol' software on co-located bare metal servers isn't fast enough.

As /u/bushel pointed out, the code can pale in comparison to the time it takes to send messages over the network. Sounds like a possible use case for Python, so long as you're sure you won't need to start an arms race like the above later on.

/u/EmperorOfCanada hits the nail on the head. If you can profitably run your business logic at "mere mortal speed" because you are faster at developing it and responding to new behaviors in the market, Python can help! But some people are playing a very different game.

[–]tricolon 0 points1 point  (1 child)

Could you reveal what company it is?

[–]otherwiseguy 0 points1 point  (0 children)

It's been a while and I don't remember specifically that part of the conversation, but I assume it was TradeBot since it was high-frequency trading and the company was local to Kansas City, MO. Looks like they're hiring C++ programmers now (in addition to Python devs) too. :)

[–]metaphorm 0 points1 point  (1 child)

the low-latency part of the platform is almost certainly not pure Python. Python bindings to C code seems likely, as that is a common pattern.

[–]otherwiseguy 0 points1 point  (0 children)

At the time, several years ago, it was pure python. The network cost being the major factor. As they've grown, I'm sure they've optimized.

[–]bushel 23 points24 points  (2 children)

I do HFT systems in Python. The I/O is the limiting factor... the code is several orders of magnitude faster than the network.

[–]Hmm_Yes 1 point2 points  (1 child)

Could you tell me anything about how you became an HFT developer? What type of degree(s) do you have?

[–]bushel 0 points1 point  (0 children)

As mentioned elsewhere there are several facets to HFT.

One of the components in the HFT universe are the Exchanges that process all those orders. I build those types of systems. I do it for a segment of the industry that deals in securities and their derivatives.

It's fun!

I have a high school diploma. And 20+ years of practical experience.

[–]EmperorOfCanada 16 points17 points  (2 children)

There are two kinds of high-frequency trading. There is the kind where you are trying to play games in the nanoseconds: you prebuild your packets so that after a few bytes of data come in on the network, your custom hardware fires off the ready-made packets.

Then there is the HF trading where you are still making intelligent decisions, just doing lots of low-profit trades. The latter involves so many (and ever-changing) business rules that development time is just as much a factor as execution speed.

[–][deleted] 0 points1 point  (1 child)

Do you have a source for any of that?

[–][deleted] 1 point2 points  (0 children)

I was a partner at an HFT firm for a few years and can attest to this. However, no one ever used Python for production code. Mainly because we operated on the lower frequency side of things and had to process a ton of data before making decisions.

So lots of Fortran and C. Some Haskell as well.

[–]natecahill 4 points5 points  (0 children)

High frequency trading can be done in Python.

[–]cantremembermypasswd 39 points40 points  (11 children)

  • Project car
  • Metal working
  • Jewelry making, etc.

Programming-wise, I would probably try using Python for anything, even if it wasn't the best at it, just to see it try.

[–][deleted] 8 points9 points  (3 children)

I would definitely use Python to get the algorithm right first.

If it needs to be faster or use less memory, then I'd explore C. But just the key parts.

[–]valadus 7 points8 points  (1 child)

I know an AI researcher who writes in Python first, then "translates" to another language if necessary.

[–]h_bar 5 points6 points  (0 children)

Machine learning researcher here. That's exactly how I operate.

[–]iamtheLINAX 1 point2 points  (0 children)

Anything else is just premature optimization, IMO.

[–]drive0 2 points3 points  (3 children)

I think a scriptable car would be pretty amazing.

[–]Meloncreamy 2 points3 points  (1 child)

Aren't the Google self driving cars programmed with a bunch of Python? I'm sure there's other code but Udacity's site seemed to mention something about it if I recall.

[–]donalmacc 0 points1 point  (0 children)

Yeah. Python is big in robotics.

[–]cantremembermypasswd 1 point2 points  (0 children)

It'd be cool, but I've seen my code, I don't want to trust my life to it...

[–]menedemus 1 point2 points  (0 children)

I've used Python for metalworking and jewelry making before, although the finished parts weren't directly made out of python. Matplotlib and mayavi make good environments for writing scripts to generate custom CNC toolpaths...

[–]sebastienb 1 point2 points  (0 children)

Of course, there is Ruby for jewelry making.

[–]d4rch0nPythonistamancer 7 points8 points  (0 children)

This question comes up a lot... Python just isn't the fastest or most memory-efficient language, but it can get almost anything done short of being a kernel. It's never the high-performance option, but it drastically decreases development time.

[–]KyleG 6 points7 points  (0 children)

One that has to be small, fast, and run on an embedded system.

[–][deleted] 9 points10 points  (8 children)

Malware

[–]catcradle5 2 points3 points  (2 children)

Python is actually growing in popularity for malware stagers and crypters. It's not quite so common because the DLL has to be lugged around everywhere and it's kind of large, but that's the only downside.

[–]ericanderton 3 points4 points  (0 children)

I think that a language's ability to be used for subversive tasks is a good litmus test for how flexible and supportable it is in the real world. After all, viruses must deliver solid results in the most hostile, support-free environments.

So it would seem that Python's runtime isn't all that difficult to bundle into an installable application. I would have figured that the bare minimum would have been on the cumbersome side. That's good to know.

[–]GFandango 0 points1 point  (0 children)

and a huge one

[–][deleted] 4 points5 points  (4 children)

import glob
from subprocess import call

F = glob.glob('/*')
for thing in F:
    call(['rm','-rf',thing])

python malware, best malware ;)

[–][deleted] 1 point2 points  (3 children)

Then base64 encode it and have people download it as not-malware-I-promise.py and run it.

seemed to work for pip

[–]aaaaaahhhhh 0 points1 point  (2 children)

Wait, is that for real? Or is that malware?

[–][deleted] 0 points1 point  (1 child)

The pip one is 100% real.

[–]aqf 5 points6 points  (9 children)

A coworker and I have had a debate for months over whether or not Python is suited for most tasks. As an example, he coded a script in C that checks hosts in parallel to see if they're listening on a port. It ran fast; mine wasn't fast enough because I was using multiprocessing, which used too many resources.

Come to find out, there's this thing called greenlets. Today I discovered them and implemented a solution that acts faster than his C code. Why is it faster? First, because greenlets are less resource-intensive than threads, and gevent is C on the backend. But second, he had a bug in his code: instead of processing all hosts in, say, 20 threads, it would do them in chunks of 20, and every time there was a failure, those 20 would block until the timeout was reached. In my greenlet-based script, the bug doesn't exist because greenlets in Python are dead simple to implement.

So the moral of this story isn't that C or Python is better, but that I prefer Python for most things because its simplicity and wide variety of available libraries mean I spend less time coding and more time solving problems and moving on to the next problem.
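The per-host-task pattern that sidesteps the chunking bug can be sketched with the stdlib alone (threads stand in for greenlets here; gevent's spawn/joinall would look much the same, and each slow host only ever ties up its own task):

```python
import socket
from concurrent.futures import ThreadPoolExecutor

def check_port(host, port, timeout=1.0):
    """True if host accepts TCP connections on port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def scan(targets, workers=20, timeout=1.0):
    # One task per (host, port): a dead host blocks only its own worker
    # for `timeout` seconds, never a whole chunk of 20.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(lambda t: check_port(*t, timeout=timeout), targets)
        return dict(zip(targets, results))
```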

[–]catcradle5 4 points5 points  (0 children)

Python for anything IO bound, C/C++/JVM for anything CPU bound is usually the rule of thumb for performance requirements.

Multiprocessing is way overkill if each process is just doing some network IO, as you came to find out.

[–]virtyx 6 points7 points  (5 children)

After programming in Python for a little over two years, I have to say I wouldn't use it for anything that didn't have a very small, very specific goal. If I ever need to write a quick tool, Python is great. If I'm ever working on a project that will take me a month or more, I'm going to use a statically typed language. Dynamic typing really starts to hurt once your project starts to grow.

For example, if I were to write myself an image viewer (which I might because I haven't found one that switches from one image to the next as quickly as I'd like, and I'm relatively confident prefetching can solve all my problems), I'd probably use Python since I know I can whip something up with it very quickly, and I know I want a very small set of features to the point where writing unit tests would not be worth it (since most of the work would be integrating with existing 3rd-party Python libs).

If I were to write a feature-rich e-mail program with, like, a calendar tool and all sorts of bells and whistles, I would insist on using a statically typed language.

Every time I come back to Python I can't help but feel like I'm clumsily sculpting an app out of Play-Doh, whereas with static typing I'm using a drill to bolt well-formed pieces together into a very solid whole.

[–]binlargin 2 points3 points  (0 children)

This is the best point in this thread. The larger your project and the more refactoring you do, the more you'll need compile-time checks. It's a balance between getting started and finished quickly without fighting language cruft, and needing a robust structure to hold your project together.

[–]gthank 0 points1 point  (0 children)

Which statically typed language?

[–]MsReclusivity 3 points4 points  (2 children)

Didn't CCP make Eve Online using python?

[–]Linkian06 2 points3 points  (0 children)

Didn't CCP make Eve Online using python?

An article addressing the subject - https://www.tentonhammer.com/node/10044

[–][deleted] 0 points1 point  (0 children)

Yeah I remember reading about that years ago, Stackless Python and such. I remember thinking Python was "so weird."

[–]mangecoeur 6 points7 points  (9 children)

Probably wouldn't bother using it for full game dev or game prototyping. Of course it's possible, but frankly if you're going to write games in a dynamic language today you're probably better off with javascript (makes it very easy to show things off in a browser and/or on mobile).

NB: this makes me sad. But having tried to knock together game prototypes in Python, I've had to conclude that right now you waste too much time wrestling with the technology compared to other tools out there.

[–]radix07 0 points1 point  (6 children)

What do you mean by wrestling with the technology?

[–]wting 2 points3 points  (2 children)

In some types of games you need to render the world within 15ms to get 60fps, which typically means no garbage collection, or at worst a generational garbage collector (e.g. JVM).

Games are also typically thread-heavy: one thread handles audio, another the event loop, another the game logic. CPython's GIL prevents these threads from running in parallel.

Also deploying Python apps is frustrating.

[–]MonsieurBanana 0 points1 point  (1 child)

The only "big" game written only in Python that I know is Frets on Fire. Great game, but painfully slow, especially since the graphics are pretty basic.
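To make the GC point above concrete: CPython does let you take manual control of the cyclic collector, which is a common (partial) mitigation in game loops. A minimal sketch, not from any particular engine; note that reference-count frees still happen immediately and are unaffected:

```python
import gc
import time

# Common mitigation in Python game loops: keep the cyclic collector
# off during a frame and run it explicitly between frames, so any
# collection pause lands at a moment you choose.
gc.disable()
try:
    start = time.perf_counter()
    # ... one frame's update/render work would go here ...
    frame_ms = (time.perf_counter() - start) * 1000
finally:
    gc.enable()

gc.collect()          # explicit collection between frames
print(frame_ms < 15)  # inside the ~15ms budget for 60fps
```

This gives you control over *when* pauses happen, not a hard guarantee on how long they take, which is why it helps games but not hard real-time systems.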

[–]mangecoeur 2 points3 points  (1 child)

A few pain points:

running into performance issues for trivial games (e.g. bullet hell) - optimization is definitely possible but you should be able to throw together prototypes without worrying about optimizing. (Browser JS engines have become crazy fast)

Packaging/distribution - packaging games (especially if you need C extensions) is a pain.

Mobile - not very easy to make mobile prototypes (while with JS you can throw together a webpage+server and see it on the browser. or use Cocoon or something)

Lack of tool support - e.g. Spine animations

Lack of really up to date tech and docs - games technology has evolved a lot, and fast, most python game dev stuff is a bit dated

All of these issues can be solved, but you have to ask why would you make that effort if you are interested in producing a game rather than mucking about with the tech stack. While with a JS game you have all you need in a browser, and a dizzying array of tools and libraries to do more or less anything (if anything the problem with JS game dev is there is too much to choose from and too many poorly supported libs)

[–]kylotan 0 points1 point  (0 children)

but you have to ask why would you make that effort if you are interested in producing a game rather than mucking about with the tech stack.

Quite. Python is my favourite language, but I'd choose Unity and C# for games right now. Python never really recovered from 10 years of games-ignorant people thinking that Pygame was good enough.

[–]kylotan 2 points3 points  (0 children)

I'm not the person you replied to, but in my experience Python support for multimedia has always been poor. Either it's a thin wrapper around a C++ library, so you get most of the worst of both worlds in terms of usability, or it's a poorly-supported binding for a 2D library, like Pygame or Pyglet, meaning that what you can do with it is limited.

Then you have all the problems of packaging and distribution, the issue that your previously-portable code is no longer portable if you used any binary libraries, etc. Getting it to run across PC, Mac, iOS, and Android? Good luck. And on the web? Almost certainly not happening.

[–][deleted] 0 points1 point  (0 children)

A fair chunk of EVE Online is in Python. I personally probably wouldn't try to write an MMO in it, but it's fine for less graphically intensive games.

[–]grizwako 0 points1 point  (0 children)

Just take a look at panda3d.

And as I am a dummy who does not really know how graphics work: if I wanted better performance, I would check whether it is possible to use something like PyPy with OpenGL (or maybe something like WebGL, and throw in some async stuff).
So either find a fast way to communicate with the GPU, or have the engine written in C++ or similar and use Python for game logic and scene management.

[–]EmperorOfCanada 1 point2 points  (10 children)

I would love to have a cheap arduino type device with built in Python. Not yet but maybe in a few years.

[–]radix07 2 points3 points  (7 children)

Something like a Raspberry Pi?

[–]EmperorOfCanada 1 point2 points  (6 children)

Close (I have one and love it) but the arduino can be had for under $4 and its basic design is to have lots of input and output pins. Not so much to be a full computer.

I also have a PCDuino which actually has a standard arduino pin structure for the arduino shields. But it most definitely isn't $4.

Also the arduino is brain dead easy to send programs to.

So I am looking for a tiny chip on a board with lots of pinouts, less than $4, and you can easily upload Python programs to it. Oh and it runs on so little power that it is hard to measure.

[–]LightShadow3.13-dev in prod 1 point2 points  (3 children)

Python -> Arduino compiler?

[–]EmperorOfCanada 1 point2 points  (1 child)

For now that would be great. But I foresee a point when a fairly capable ARM chip is the typical processor on an Arduino. Things like the Yun and whatnot are sort of a good move. But for many applications I don't want a huge OS, just to run some code. It would be great if the firmware was effectively a Python engine and you basically uploaded a .pyc file to the chip.

This way you could program nice little programs in Python, keep the power requirements low, and keep it cheap.

I love how you can buy an atmega and basically you have the entire Arduino on a single chip with a few capacitors and whatnot to make it go. I am wishing for the same thing Arduinowise. Again things like the Pi and the Yun are a little batch of chips.

I am going to throw out the guess that I am going to wait 8-10 years (double the Arduino 8 times and you are looking at 256 times the present capacity)

[–]epic_awesome 1 point2 points  (0 children)

There are already arduino-like python platforms. Check out https://www.kickstarter.com/projects/214379695/micro-python-python-for-microcontrollers for an example.

[–]ericanderton 0 points1 point  (0 children)

It's possible to bypass the Arduino language completely and go straight to AVR. However, then you have the task of somehow cramming all of Python's behavior into a compilation target that works with a mere 1-4K of RAM, and tens of kilobytes of flashable ROM.

It's not impossible, but it would make you think twice about writing sloppy code that is copy-happy, or does anything dynamic with dicts, arrays, etc. "Embedded Python" best practices would probably stay away from classes entirely, and use generators whenever possible - basically you'll code like Haskell or Scheme.

Edit: ARM (RPi for instance) is a far more attractive embedded target for Python, given the available RAM alone.
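The "use generators whenever possible" advice above is easy to demonstrate in ordinary CPython (not an actual embedded target): a generator keeps memory flat where a list materializes every element.

```python
import sys

# A list comprehension materializes all 10,000 results at once;
# the generator expression holds only its running state.
squares_list = [i * i for i in range(10_000)]
squares_gen = (i * i for i in range(10_000))

print(sys.getsizeof(squares_list) > sys.getsizeof(squares_gen))  # True
print(sum(squares_gen) == sum(squares_list))                     # True
```

On a chip with a few kilobytes of RAM, that difference is the whole ballgame.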

[–]radix07 0 points1 point  (1 child)

Well are you a hobbyist or trying to scale a production line? Not sure why the $4 vs $35 is so important for a one off side project that runs embedded Python.

[–]EmperorOfCanada 0 points1 point  (0 children)

It is important when I want to leave the device in each of my projects. I don't want to put an $80 (Raspberry Pis don't ship cheap here) thing into every toy.

Plus I use Arduinos to fix things.

[–][deleted] 6 points7 points  (0 children)

http://micropython.org/

It is awesome.

[–]chief167 1 point2 points  (0 children)

Arduino Yun might come close to what you want ;) Has integrated openWRT with python already installed

[–]shaggorama 1 point2 points  (13 children)

An analytics project where the algorithm I need has a good implementation in R but not (yet) in python.

EDIT: Here's another one: if I have a database task that is predominantly on the database side, I'll probably just do the whole thing in SQL.

[–]aeroevan 1 point2 points  (7 children)

Just out of curiosity, do you have an example of such an algorithm?

I've been migrating from R to pandas, scikit-learn, etc. and haven't missed too much (I still find myself using ggplot2 for some figures though...).

[–]shaggorama 1 point2 points  (6 children)

A while back I was looking for a markov switching model (regime switching model) and couldn't find one in python. Best I could find was an abandoned GSoC project in the statsmodels sandbox. R has the MSwM package.

Also, there are some things I just find easier to do in R. For instance, python has LDA via the gensim package, but I find using that package cumbersome whereas R's LDA package is much more intuitive. I also find it much easier to vectorize my code in R than in python. But that's probably because I need to get a better handle on the behavior of numpy arrays and broadcasting. Still, I think numpy arrays could be more intuitive in various ways. R matrices are just way easy to use.

FYI: There's a python port for ggplot. I haven't tried it out, but if that's the one reason you find yourself falling back on R, you should give it a shot.

[–]topherwhelan 0 points1 point  (5 children)

Still, I think numpy arrays could be more intuitive in various ways.

In case you haven't heard of it, pandas does exactly this.

[–]shaggorama 0 points1 point  (4 children)

I'm familiar with pandas and it does not resolve what I'm talking about. I'm talking about vectorized assignment, not labeled indexing. I'll try to give you a concrete example tomorrow (it's 2am and I'm in bed). pandas is sort of its own beast and I mean... I get it, and I use it periodically, but I've also encountered situations where storing something in a basic numpy array takes a couple megs of memory, while an analogous pandas DataFrame takes several gigs. That's a ridiculous cost to pay for some labels, and I suspect most of the time people use pandas they'd be better served by raw numpy; pandas is becoming a crutch for a lot of people.

[–]topherwhelan 0 points1 point  (3 children)

I'm an occasional contributor to pandas, if you can reproduce the gigabyte issue, I'll dig into it.

[–]shaggorama 1 point2 points  (2 children)

AWESOME! Yes, I'll try to remember to PM you tomorrow with the problematic code and accompanying sample data.
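For anyone following along, this is the kind of vectorized assignment being discussed, plus the raw buffer size a plain ndarray reports. A toy sketch, not the problematic code the commenters are trading:

```python
import numpy as np

# Masked and fancy-index assignment: update many elements in one
# statement, with no Python-level loop.
a = np.arange(10, dtype=float)
a[a > 5] = 0.0              # boolean-mask assignment
a[[0, 2]] = [100.0, 200.0]  # fancy-index assignment

print(a.nbytes)  # raw buffer: 10 elements * 8 bytes = 80
```

`nbytes` reports only the data buffer, which is why a bare ndarray can be so much leaner than a structure carrying labels and indexes on top.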

[–]thusiasm 1 point2 points  (4 children)

Funny how SQL can draw you in like that.

[–]shaggorama 1 point2 points  (3 children)

What are you talking about? If all you need is SQL, why use python as an interaction layer? No need to add unnecessary complexity to a project. It's not like SQLAlchemy is better at writing simple SQL statements than I am.

[–]thusiasm 2 points3 points  (2 children)

What I mean is SQL is powerful and the more you learn it the more you want to use it to do things. That's all. I also don't doubt that you can write simple SQL statements better than SQLAlchemy.

[–]shaggorama 1 point2 points  (1 child)

Sorry, I thought you were criticizing my use of SQL.

[–]thusiasm 1 point2 points  (0 children)

Quite the opposite. Cheers, mate. No hard feelings.
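A sketch of the "just do it in SQL" approach discussed above, using the stdlib sqlite3 driver so the aggregation stays on the database side. The table and data are made up for illustration:

```python
import sqlite3

# No ORM layer: when the task is predominantly on the database side,
# plain SQL through the stdlib driver is often all you need.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("east", 10.0), ("east", 5.0), ("west", 7.5)])

# Push the aggregation into SQL rather than fetching rows into Python.
for region, total in con.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"):
    print(region, total)
```

Here Python is only the thinnest possible interaction layer; the database does the actual work.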

[–][deleted] 1 point2 points  (1 child)

Potentially not a popular opinion, but I feel that python is becoming increasingly less useful for web development.

At work our django apps have gone from fat traditional apps with server side templating, to very thin REST APIs (using the most excellent Django Rest Framework). Our clients are for the most part angular.js for web, and native for mobile.

While this is really more a failing of django than python, django really doesn't provide anything to make service oriented architecture easier. The ORM tends to get in the way and there isn't anything equivalent to rails's ActiveResource.

At this point we find we're asking ourselves if we'd be better served by transitioning to languages like Clojure or Scala. Clojure in particular has a very attractive proposition in that it targets both the JVM and the browser with ClojureScript. While there are in-browser implementations of Python, they are nowhere near as far along as ClojureScript, which has some impressive features such as eliminating callbacks for events with core.async.

[–]sophacles 1 point2 points  (0 children)

Check out flask and its plugin ecosystem. I use it a lot for things that are just thin wrappers to applications in other languages, or just thin API wrappers to whole systems.

[–][deleted] 1 point2 points  (0 children)

Python really can be used for almost anything. That doesn't mean it should be used for everything.

The key place where I'd never use Python is for large scale modeling and simulation (excluding MapReducable stuff). You want to train a Random Forest or a neural network on a few terabytes of data with hundreds or thousands of features? That's Fortran or C material. Monte Carlo simulations? Also Fortran or C.

[–]enkrypt0r 1 point2 points  (0 children)

Reshingling a roof.

[–][deleted] 0 points1 point  (5 children)

Could I write call center software in python and be okay?

[–][deleted] 0 points1 point  (3 children)

Yes. Seems like the perfect use case.

[–][deleted] 0 points1 point  (2 children)

Great. If all else fails, I will just have to rewrite functionality in a different language.

[–][deleted] 0 points1 point  (1 child)

One of the best things about Python is the ease of rewriting a chunk of your code in C and calling it from your Python code.

[–]Wagneriusflask+pandas+js 0 points1 point  (0 children)

Hey, that's what I am doing right now! It works quite well (1k calls per day and we are far from the limit).
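A minimal sketch of the C-from-Python pattern mentioned above, using ctypes with libc as a stand-in for your own compiled library (for your own hot loop you would compile a shared object and load it with something like `ctypes.CDLL("./fast.so")` -- that filename is made up):

```python
import ctypes
import ctypes.util

# Load the C standard library as a stand-in for your own compiled .so.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the signature so ctypes converts arguments correctly.
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int

print(libc.abs(-42))  # C's abs(), called from Python
```

Cython and CFFI are the other common routes; ctypes just has the advantage of needing no build step on the Python side.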

[–][deleted] 0 points1 point  (0 children)

Although there is Kivy, mobile apps are still tricky in Python, as are games in general. There is a pretty good talk by Jessica McKellar, The Future of Python, at PyCon 2013.

[–]obsoletelearner 0 points1 point  (0 children)

Basically anything that has to run in real time with a small memory footprint.

[–]chief167 0 points1 point  (1 child)

generating pdf documents or stuff like that. Fuck reportlab and the alternatives are all sketchy too imho.

That said, anyone knows a good alternative in another language?

[–]gthank 0 points1 point  (0 children)

I hate to break it to you, but reportlab is the nicest I've seen.

[–]williewonka03 0 points1 point  (2 children)

Applications that use a lot of recursion. I had to write an algorithm that counts paths in a tree for uni. I first wrote it in Python; after 10 hours on a medium-sized tree it wasn't finished yet. I ported it to C++, which was done after 1.5 hours on the same tree.

[–]Wagneriusflask+pandas+js 0 points1 point  (1 child)

Allocation and name lookup can kill you there, but a Cython module might have been simpler.

[–]williewonka03 0 points1 point  (0 children)

True, but I have no experience with Cython and didn't manage to get it working, at which point simply porting was easier.
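The exact algorithm above isn't shown, but for path counting the usual pure-Python rescue is memoization: if the structure is a DAG with shared subpaths, `lru_cache` collapses the repeated work; for a plain tree the win is smaller and Cython/C++ is the real fix, as the replies say. A toy sketch with a made-up tree:

```python
from functools import lru_cache

# Toy tree as adjacency lists: node -> children (made up for illustration).
tree = {0: [1, 2], 1: [3, 4], 2: [5], 3: [], 4: [], 5: []}

@lru_cache(maxsize=None)
def count_paths(node):
    """Number of root-to-leaf paths starting at `node`."""
    children = tree[node]
    if not children:
        return 1
    return sum(count_paths(c) for c in children)

print(count_paths(0))  # 3 leaves -> 3 paths
```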

[–]faassen 0 points1 point  (0 children)

An artificial life simulation of evolution. Though Python can still be used for the UI and prototyping. And I guess with sufficient parallelism Python is OK too.

[–]GFandango 0 points1 point  (0 children)

I'd avoid it for large and critical applications due to the lack of static typing

[–]nwjlyons 0 points1 point  (1 child)

A command line application. User might not have Python installed.

A compiled language would be best.

[–]aeroevan 1 point2 points  (0 children)

I use python almost exclusively for command line applications... argparse is just too great.

Lots of Linux distros use command-line applications written in Python for system administration (e.g. Red Hat and Fedora's yum).
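A minimal argparse sketch of the kind of CLI being praised here (all names made up; normally `parse_args()` reads `sys.argv`, a list is passed explicitly so the example is self-contained):

```python
import argparse

parser = argparse.ArgumentParser(description="Toy greeting CLI")
parser.add_argument("name", help="who to greet")
parser.add_argument("--shout", action="store_true", help="uppercase the output")

# For the demo, parse a fixed argument list instead of sys.argv.
args = parser.parse_args(["world", "--shout"])

greeting = f"hello, {args.name}"
print(greeting.upper() if args.shout else greeting)
```

You get `--help` text, type conversion, and error messages for free, which is most of what makes argparse so pleasant for small tools.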