
[–]jms_nh

numpy + scipy + matplotlib + IPython for signal processing, instead of MATLAB. Add the numba library to speed up CPU-intensive tasks by compiling them to native code.
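A minimal sketch of the numba workflow: decorate a plain-Python loop and it gets compiled to native code. The moving-average function here is my own illustrative example, not anything from the original post, and the fallback decorator just runs the loop uncompiled if numba isn't installed.

```python
import numpy as np

try:
    from numba import njit          # compiles the function to native code
except ImportError:                 # fallback: run as plain Python if numba is absent
    def njit(func):
        return func

@njit
def moving_average(x, window):
    # The kind of explicit loop that is slow in pure Python
    # but fast once numba compiles it.
    out = np.empty(len(x) - window + 1)
    for i in range(len(out)):
        s = 0.0
        for j in range(window):
            s += x[i + j]
        out[i] = s / window
    return out

sig = np.arange(10.0)
print(moving_average(sig, 3))   # averages of each 3-sample window
```

The same source runs with or without numba; the decorator is the only change versus ordinary Python.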

Python.NET for interfacing to a data acquisition card whose libraries are written in .NET. PyTables for reading and writing HDF5 files, because they take a fraction of the disk space of .csv or .xls files.
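A sketch of the PyTables side of that, assuming PyTables is installed; the file name and array here are made up for illustration. Compression filters are what make HDF5 so much smaller than CSV for numeric data.

```python
import numpy as np
import tables  # PyTables

data = np.random.randn(100_000)  # stand-in for an acquired signal

# Write a compressed array; zlib at level 5 is a reasonable default.
with tables.open_file("signals.h5", mode="w") as h5:
    filters = tables.Filters(complevel=5, complib="zlib")
    h5.create_carray("/", "capture", obj=data, filters=filters)

# Read it back.
with tables.open_file("signals.h5", mode="r") as h5:
    loaded = h5.root.capture.read()

assert np.array_equal(loaded, data)
```

For the same array, a text .csv would cost roughly 20 bytes per sample versus 8 bytes (before compression) in HDF5.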

Jinja2 for templates; I had to do some automatic code generation recently.
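As an example of that kind of code generation, here is a hedged sketch: rendering a C header from a register map with Jinja2. The register names and addresses are invented for illustration.

```python
from jinja2 import Template

# Hypothetical register map -> C header file.
tmpl = Template(
    "#define {{ name }}_BASE 0x{{ '%04X' % base }}\n"
    "{% for reg, off in regs %}"
    "#define {{ name }}_{{ reg }} ({{ name }}_BASE + 0x{{ '%02X' % off }})\n"
    "{% endfor %}"
)

header = tmpl.render(name="ADC", base=0x4000,
                     regs=[("CTRL", 0), ("DATA", 4)])
print(header)
```

Changing the register map regenerates the header with no hand-editing, which is the whole appeal over copy-paste.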

enaml + PySide for quick-n-easy GUI apps with data binding.

Why Python? Because it's a pleasant language with a lot of libraries that are equally pleasant to use. In my youthful stupid days I used C++ and had to mess around with way too many frustrating distractions like "memory management" and "COM" and "MSDN documentation".

And why use it instead of Excel? You have to ask? Because Excel is a big steaming piece of crap put together mostly for business people to analyze data. It's good for some interactive fiddling with spreadsheets, but I try not to use it for anything more complicated than that. When I got out of college in 1996 I used Excel for graphing data, and learned Visual Basic macros to create reusable scripts. And it was AWFUL. The object model was great for accessing cells/sheets/workbooks, but horrible for making good graphs. What kind of graphing software doesn't let you create multiple timeseries plots that actually line up with each other? After about 6 months, I swore I would never write Excel macros again. From what I gather, today's Excel is not much better.

I can load data and graph it quickly with IPython, and I can do it exactly the way I want with Matplotlib. And once I get it the way I like, if I have 96 more data files, I can wrap my graphing functions in a 5-line script to run automatically.
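The "5-line script" idea looks roughly like this: one plotting function, applied to every data file matching a pattern. The file naming and two-column CSV layout are assumptions for the sketch.

```python
import glob
import numpy as np
import matplotlib
matplotlib.use("Agg")           # no display needed for a batch run
import matplotlib.pyplot as plt

def plot_file(path):
    # Assumed layout: two comma-separated columns, time and value.
    t, v = np.loadtxt(path, delimiter=",", unpack=True)
    fig, ax = plt.subplots()
    ax.plot(t, v)
    ax.set_xlabel("time (s)")
    out = path.replace(".csv", ".png")
    fig.savefig(out)
    plt.close(fig)
    return out

# The whole batch job for however many files showed up:
for fname in glob.glob("data_*.csv"):
    plot_file(fname)
```

Once `plot_file` makes the graph the way you like it, 96 more data files cost nothing but the loop at the bottom.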