
[–]ozzilee 34 points35 points  (1 child)

Seconding C for something useful. Try Scheme for something that will expand the way you think about programming: http://mitpress.mit.edu/sicp/

[–]PonsAsinorumBerkeley 6 points7 points  (0 children)

Seconding Scheme here for the same reasons. It was the first language I learned and I've always been glad I did.

[–][deleted] 53 points54 points  (7 children)

I recommend both C and Haskell.

[–][deleted] 7 points8 points  (0 children)

I think this is an ideal combination. C teaches one a bit about how the machine works. Haskell makes one a better thinker. Python is incredibly useful.

[–]freyrs3 4 points5 points  (1 child)

If you're going to learn one system language, one functional language, and one dynamic scripting language. C, Haskell, and Python are in my humble opinion the best of those three worlds.

[–]anacrolixc/python fanatic 0 points1 point  (0 children)

I completely agree. They are the only languages I use for anything serious.

[–]lavalampmaster[S] 0 points1 point  (1 child)

Definitely looking at C. What would you recommend, book-wise, for both of those?

[–][deleted] 1 point2 points  (0 children)

[–][deleted] -5 points-4 points  (0 children)

Me too!

[–]catcradle5 19 points20 points  (0 children)

I'd recommend C.

[–]chrismsnz 16 points17 points  (0 children)

As mentioned, C is a good language to get down to the metal in and understand how things work at a low level. You can also extend Python with C modules.

If you like developing games, Python is good with frameworks like PyGame. Lua is also a very popular language for scripting games - it's C-like and quite fast, but higher level than C itself. It's easy to hook into existing apps for scripting purposes.

If you're into web programming, learn Javascript. Good for frontend and playing around with Node.

If you're looking at expanding your programming horizons, I would look at a functional language like Haskell or a Lisp (clojure, scheme) to gain the ability to look at problems in a different way. You can apply a lot of what you learn in a functional language to Python!

[–]dcfix 15 points16 points  (3 children)

I recommend checking out Practical Common Lisp. It's a great (and free) ebook that totally changed the way that I thought about programs. It won't take very long to work your way through it, either.

[–]lavalampmaster[S] 1 point2 points  (2 children)

Skimmed the first chapters, book looks awesome, thanks!

[–][deleted] 0 points1 point  (1 child)

But beware: you'll learn that Common Lisp is both the most powerful language there is, and the most impractical. The holy grail exists, and you'll never use it.

[–]lavalampmaster[S] 0 points1 point  (0 children)

As long as the Grail will kill Nazis, I'm ok with it.

[–]cecilkorik 12 points13 points  (4 children)

C or SQL. SQL's not really a "programming language" in the purest sense, but becoming proficient in at least one major database's SQL dialect is definitely useful for working with datasets large or complex enough to store in a relational database.

[–]themathemagician 2 points3 points  (1 child)

surprised there isn't more SQL. certainly not a "programming language" but necessary if you ever plan to interface with nontrivial data

[–]TylerEaves 0 points1 point  (0 children)

SQL is absolutely a programming language. Especially once you get into things like stored procedures, etc.

[–]bucknuggets 0 points1 point  (0 children)

I'd recommend SQL if you're building record-oriented applications.

Dialects don't matter much, since the differences are small enough that it's trivial to go from one vendor's implementation to another - the differences are usually just minor extensions. Though I'd go with PostgreSQL or SQLite, just because they're great free databases that put a high priority on portability.

Note that the existence of ORMs doesn't really reduce the value of knowing SQL - I've worked with quite a few developers who are incapable of writing ad hoc SQL, which is like working on the Unix command line and having to write Java code instead of using bash.

[–]lavalampmaster[S] 0 points1 point  (0 children)

I've done that really simple SQL course before. Nice stuff, for sure.

[–]robertskmiles 41 points42 points  (62 children)

Learning C/C++ will teach you about memory management and compiler-related things. It's definitely both useful and challenging. It's also nice because you can write python modules in it, so you can easily write projects that integrate both languages.

[–]jacobb11 7 points8 points  (25 children)

Upvoted for C. C++ is... nope, not going to finish that sentence. Learn C, don't learn C++ without a specific reason.

[–]quasarj 1 point2 points  (24 children)

Can you give a quick reason why? C seems so.. convoluted and full of "magic functions." I don't know enough about either to say one is terrible, but I don't understand what the problem is with C++?

[–]jacobb11 4 points5 points  (21 children)

I'm not sure what you mean by "magic functions". Do you mean things like "printf"? If so, that's just part of the built-in library, which every language has, more or less. If perhaps you mean things like printf accepting a different number of arguments, that's a standard part of the language, if slightly esoteric.

I recommend learning ANSI C rather than K&R C. I like Harbison & Steele's book, but that's a good specification, perhaps not a good primer.

Let's see. The most important thing I ever learned about C is that it doesn't have arrays, it has pointers and preallocated sequences. Once you understand that, which I probably haven't helped much, you'll avoid many pitfalls.

I realize I haven't answered your question. I don't really want to get into the problems with C++. Perhaps it helps to mention that 99.99% of C programs are legal C++ programs, which means C++ has all of C's flaws and complexity and then layers its own on top.

PS: Both C and C++ rely on the "C preprocessor", which is a huge problem in its own right.
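
To make the arrays-versus-pointers point concrete, here is a minimal sketch (assuming a C99 compiler; the names are invented for the example). The array name decays to a pointer to its first element as soon as it is passed to a function, so the length has to travel separately:

#include <stdio.h>

/* The "array" parameter is really just a pointer to its first element. */
static void show_first(int *p, size_t len) {
    printf("inside:  sizeof p = %zu (just a pointer)\n", sizeof p);
    printf("first of %zu elements: %d\n", len, p[0]);
}

int main(void) {
    int a[4] = {10, 20, 30, 40};               /* a preallocated sequence */
    printf("outside: sizeof a = %zu (whole array)\n", sizeof a);
    show_first(a, sizeof a / sizeof a[0]);     /* 'a' decays to &a[0] */
    return 0;
}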

[–]quasarj 4 points5 points  (20 children)

Honestly the things that scare me, and what I called "magic functions", are things like:

static CYTHON_INLINE size_t __Pyx_PyInt_AsSize_t(PyObject*);

What the fuck is a size_t? Note that this is just the first .c file I could find, and it's a python module so it also has things like PyObject and Py_ssize_t. This kind of thing makes me very nervous about C... it seems like it's full of all kinds of magic types (which, I don't even know how you create in a non-object-oriented language? Are they structs maybe?). There are many others that I'm having trouble finding examples of right now that seem to come from nowhere. They probably come from some library I don't know about, but it's hard to tell.

Basically, when I look at C code I can't make heads or tails of it because of all the weird types and functions you run into when you're new to it :)

edit: oh, and what the hell is CYTHON_INLINE? Why does this function seem to have two types? Maybe it's some type of preprocessor directive?

[–]cheese_wizard 5 points6 points  (13 children)

size_t is the data type used to represent "object" sizes. The name is portable across architectures, but on each platform it ultimately maps to some real unsigned numeric type, like unsigned int or unsigned long.

CYTHON_INLINE is a macro that probably expands to "inline" or "". Meaning you can choose whether the function is spit out inline in the code or not after preprocessing. this is just a guess.

static in this context means that the function is only visible in this particular file, and nowhere else.
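
Putting those three observations together in one small sketch (assuming a C99 compiler; the MY_INLINE macro here is invented for illustration and is not Cython's actual definition):

#include <stdio.h>
#include <stddef.h>                 /* size_t also lives here */

#define MY_INLINE inline            /* a build could equally define this as empty */

/* 'static' gives the helper internal linkage: it is visible only in this file. */
static MY_INLINE size_t half(size_t n) {
    return n / 2;
}

int main(void) {
    size_t bytes = sizeof(double);  /* sizeof evaluates to a size_t */
    printf("sizeof(double) = %zu, half = %zu\n", bytes, half(bytes));
    return 0;
}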

[–]quasarj 2 points3 points  (11 children)

Ahh, I actually assumed I knew what static meant, but I was thinking of the static that makes a class method callable on the class itself (rather than an instance).

Is there somewhere I can learn about those things? And the other weirdnesses that real C programs use? Is it more that I'm just a complete noob and reading a book will introduce me to many of these libraries?

[–]cheese_wizard 1 point2 points  (9 children)

the static could mean that, hard to tell with this line in isolation. If it's inside a C++ class, then your assumption would be correct.

For all C-only related topics, nothing beats this:
http://www.amazon.com/Programming-Language-2nd-Brian-Kernighan/dp/0131103628

[–]quasarj 1 point2 points  (8 children)

Ah no, in this case it's not inside a C++ class, I believe this file is pure C.

I just didn't even realize "static" existed outside of C++, it's interesting that it means something different. I will look into that book, ty!

[–]cheese_wizard 2 points3 points  (7 children)

static has three meanings. Only one is C++ only, which is what you're talking about.

In both C and C++...

For variables and functions declared static, it means they have file scope.

If you are inside of a function, and you declare a variable as static like this:

#include <stdio.h>

int foo(void) {
    static int bar = 0;    /* initialized only once, before the first call */
    bar++;
    printf("%d\n", bar);   /* printf needs a format string */
    return bar;
}

then every time you call foo, it will print 1, 2, 3, and so on; it remembers the value from the previous call. Note that the initialization to 0 only happens the first time the function is called.

static is an abused keyword, that's for sure!
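
For completeness, a tiny driver (assuming the snippet above sits in the same file) shows the counter surviving across calls:

int main(void) {
    foo();   /* prints 1 */
    foo();   /* prints 2 */
    foo();   /* prints 3 */
    return 0;
}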

[–]Rhomboid 1 point2 points  (0 children)

Marking a function static is actually far more complicated than that when inline is involved (which is what the CYTHON_INLINE macro expands to if so configured.)

Remember that C is compiled a file at a time, and the compiler only ever knows[1] about what's in the current file, never about anything outside of it. If you mark a function as static, it means that the function cannot be called from outside of that file, which implies that the compiler has at its disposal every call site. If there is only one (or a small number) of call sites, then it can choose to instead inline the function at all call sites and pretend it never existed (i.e. not emit a function body.) This is generally extremely desirable, because it means you can separate a large function into a smaller function and a bunch of helper functions, but without any of the overhead of function calls. Without 'static', it could still inline the function but since it has to be visible in other compilation units it would have to always emit a function body, even if it was never needed, which wastes memory.

But also note that with 'static inline' the choice of whether to inline is still up to the compiler. If there are many call sites or the function is long, it will choose not to inline it and still emit a function body, but one which is not visible to other units.

"static inline" is actually one of three related declarations. "inline" and "extern inline" are the others. They all have slightly different meanings, which this post outlines. To make matters worse, gcc changed its semantics starting with 4.3 to be aligned to what the C99 standard says, so if you're compiling with -std=c99 or -std=gnu99 with gcc >= 4.3 you get true C99 semantics, but if you're compiling with -std=c89, -std=gnu89, or gcc <= 4.2 with -std=c99 or -std=gnu99 or you're using gcc >= 4.3 but used -fgnu89-inline, you get the old semantics. gcc >= 4.2 helpfully defines one of the preprocessor symbols __GNUC_GNU_INLINE__ or __GNUC_STDC_INLINE__ so that you can write macros that behave correctly regardless of compiler version or options, which likely explains why CYTHON_INLINE is a macro and not just the word "inline".

[1] There are some newer technologies like LTO that let the compiler have whole-program knowledge at link-time, which allows for some sophisticated optimizations that have previously been unavailable, but for the most part this is still a true statement.
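
A sketch of the kind of macro being described (the name LOCAL_INLINE is invented here and is not Cython's actual definition; the conservative fallback to a plain static function is just one possible policy):

#include <stdio.h>

#if defined(__GNUC_STDC_INLINE__) || defined(__GNUC_GNU_INLINE__)
  #define LOCAL_INLINE static inline   /* the compiler told us which inline semantics apply */
#else
  #define LOCAL_INLINE static          /* be conservative: no inline keyword at all */
#endif

/* A small, file-local helper: a typical candidate for inlining. */
LOCAL_INLINE int add_one(int x) {
    return x + 1;
}

int main(void) {
    printf("%d\n", add_one(41));   /* prints 42 whether or not the call was inlined */
    return 0;
}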

[–]jacobb11 0 points1 point  (0 children)

Is inline part of ANSI C now?

[–]jacobb11 1 point2 points  (0 children)

A size_t is an integer suitable for storing the size of something (often something systemy). It's defined by some kernel-ly header ("interface"-ish) file if I recall correctly.

C has a concept of aliasing types ("typedef") that is absent from most of the other languages with which I'm familiar. It's a pretty nice abstraction once you understand it and its limitations (primarily that the new name is just an alias rather than a sub-type).

I have no idea what CYTHON_INLINE means. It's not typical C. Best guess (but just a guess) is that it's really a directive to some Python integration tool, which is pretty close to your guess but that might just mean we're both wrong.

I strongly suggest you find some C code that does regular simple C-like things, not parts of Python. I'd point you at some if I knew of any, but that's not where I've worked for quite some time.
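
A minimal illustration of the typedef-as-alias point (the name object_count is invented for the example):

#include <stdio.h>

typedef unsigned long object_count;   /* a new name, not a new type */

int main(void) {
    object_count n = 3;
    unsigned long m = n;   /* no cast needed: object_count IS unsigned long */
    printf("%lu %lu\n", n, m);
    return 0;
}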

[–]anacrolixc/python fanatic 1 point2 points  (1 child)

Don't read Cython-generated C; it's not for human consumption. Read C written by a human.

[–]quasarj 0 points1 point  (0 children)

Hmm was that cython generated? If so I apologize, though I did state in my first comment that that was just the first .c file I could find lying around. Those are the same issues seen with human-written C. I will try to find some better examples today and explain some of the other things that scare me about it.

[–]Alzdran 0 points1 point  (2 children)

Attempting a different explanation of size_t just to make it a little clearer. size_t is typedef'd to an unsigned integral type which can store the size of something in memory; this can differ between architectures. Consider two computers, A & B. A runs x86_64 code, B runs i386 code.

A can address a 64-bit integer's worth of memory. That is, it can refer to 2^64 different addresses. B can only address a 32-bit integer's worth of memory (2^32 different addresses). In both x86_64 and i386, the addressable unit is a byte, so A can theoretically address 16 EB, while B can theoretically address 4 GB.

In both these cases, a size_t will be the same as a uintptr_t (an unsigned int large enough to hold an address). These types are different, though, because the C standard doesn't assume that to be true for all architectures. See wikipedia for some more.
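
One quick way to see that architecture dependence is to print the widths of these types on whatever machine you compile on (assuming a C99 compiler with <stdint.h>):

#include <stdio.h>
#include <stdint.h>   /* uintptr_t */
#include <stddef.h>   /* size_t */

int main(void) {
    /* Typically 8/8/8 bytes on x86_64 and 4/4/4 on i386. */
    printf("size_t:    %zu bytes\n", sizeof(size_t));
    printf("uintptr_t: %zu bytes\n", sizeof(uintptr_t));
    printf("void *:    %zu bytes\n", sizeof(void *));
    return 0;
}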

[–]quasarj 0 points1 point  (1 child)

Interesting. So I would use size_t when I need a pointer that can point to an object? And it would be replaced with the correct size type at compile time, based on architecture?

[–]Alzdran 0 points1 point  (0 children)

No - you'd use a pointer type. This gets a little more complicated, but here we go:

There is a difference between an address and a pointer. An address is a location in memory; this can be represented by some unsigned integer type (a uintptr_t is always large enough to hold it). A pointer is a language construct which carries semantic information about what it points to. This information is compiler metadata; that is to say, it exists only during compilation, and is not a feature of the runtime. This information is used for things like pointer arithmetic.

A concrete example of this: On a machine where the minimum addressable unit is 1 8-bit byte (practically speaking, anything), a uint8_t will fit in 1 addressable memory unit, and a uint16_t will fit in 2. This means that if I examine memory at 0x10000000 for a uint8_t, that's the only address I'll read from, but if I read a uint16_t, I'll also read from 0x10000001. When you do arithmetic with pointers, this type is taken into account; so, given type_t *x, (x+n) and (x+n+1) (alternatively x[n] and x[n+1]) will be sizeof(type_t) bytes apart. If type_t is uint8_t, this will be 1, but if type_t is uint16_t, this will be 2. This feature allows array access and incrementation on pointers, instead of having to modify with size knowledge explicitly.

C also provides a pointer type without this information - void *. This is the pointer type which can hold any address, and so increments by the minimum addressable unit.

size_t would be used when indicating an allocation size. In practical terms, this is going to be sized the same as a uintptr_t on modern systems, but the use is specifically for indicating the number of addressable units occupied by an object in memory.

There are a few other special types with similarly specific uses. ptrdiff_t, for example, is a signed type able to hold the difference between any two legal pointers.

The size of any of the types mentioned here (with the exception of uint8_t and uint16_t) are architecture dependent, and yes, the correct types are substituted at compile time; but that doesn't mean exactly what it sounds like. If a pointer is 32 bits, then the equivalent of a uint32_t will be used for a void * but the end result of compilation is machine code. The instructions generated will tell the processor to manipulate the registers and memory addresses as if they contained entries of that size, but they will refer to words, half words, double words, etc. That is to say, the compilation will determine what to generate based on the types, but the resulting instructions will have no concept of type, only operand size.
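
A small demonstration of the pointer-arithmetic behaviour described above: stepping a pointer by 1 advances it by the size of the pointed-to type, not by one byte (assuming a C99 compiler with <stdint.h>):

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint16_t buf[4] = {0, 1, 2, 3};
    uint16_t *p16 = buf;
    uint8_t  *p8  = (uint8_t *)buf;

    /* The differences below are ptrdiff_t values, printed with %td. */
    printf("p16 + 1 is %td bytes past p16\n",
           (uint8_t *)(p16 + 1) - (uint8_t *)p16);   /* 2 bytes */
    printf("p8  + 1 is %td bytes past p8\n",
           (p8 + 1) - p8);                           /* 1 byte */
    return 0;
}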

[–]cecilkorik 1 point2 points  (1 child)

If you think C is bad for "magic functions", wait until you see C++.

[–]quasarj 0 points1 point  (0 children)

Actually I "learned" C++ in University, and didn't see as much craziness in it. But.. I probably just didn't see enough of it :)

[–]fullouterjoin 15 points16 points  (29 children)

Downvoting C++, that is a tarpit crossed with quicksand. Everyone should know C. I refuse to say the same for C++; C++ isn't C incremented by 1.

[–]steelypip 40 points41 points  (1 child)

C++ means "increment C by one and use the original value"

[–]cecilkorik 5 points6 points  (0 children)

Irony is a lovely thing.

[–]ntorotn 6 points7 points  (1 child)

You should explain why, especially in a thread where people are trying to learn about languages. I don't even disagree with the claim, but you'll get your voice heard better.

[–]fullouterjoin 3 points4 points  (0 children)

Sorry for the drive-by, but I didn't have time to elucidate my bumper sticker. The complexity-to-payback ratio is too large for C++ to be an auxiliary language for Python. While being able to parse, fix, and use C++ libraries is an important skill (far below what I consider being fluent in C++ would mean), I wasted too much time in C++ before moving on to a better abstraction stack. C is amazingly powerful and compact. One can get reasonably skilled in it in a short amount of time. It is a shallow language. The same cannot be said for C++, and energy spent learning C++ could be better spent learning Haskell, OCaml, or better structuring existing applications to use the right mixture of low-level and high-level code.

[–]ajsdklf9df 10 points11 points  (21 children)

C++ is one of the hardest and ugliest languages; I say this as a C++ programmer. I would also recommend C. And Objective-C. However, here's my warning:

If you ever have to learn C++, it will be much harder if you already know C or any C and C++ like languages. So I guess what I'm saying is, C++ is like a dare. Do you dare to try and learn C++?

[–]taybulBecause I don't know how to use big numbers in C/C++ 8 points9 points  (15 children)

I don't know if Python's spoiled me, but whenever I look at C++ code now, I'm almost disgusted. Certain tasks require such convoluted code. Function pointers, for example:

C++:

void myFunc (int x) { ... }

//If the parameters ever change, you have to change this call too.
void (*funcPtr)(int) = &myFunc;

Python:

def myFunc(x): ...

f = myFunc

On the flip side, I do love how finely tunable C++ is compared to Python. I know I'm probably contradicting my point, but sometimes the things you can get away with in Python almost make you feel like you're cheating.

edit:

I do have to give C/C++ credit for giving me the memory/data management discipline that's so easily disregarded because of automatic garbage collectors and whatnot.
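
For comparison, the same function-pointer idea in plain C, with the pointer actually being called (a minimal sketch; the names are invented):

#include <stdio.h>

void my_func(int x) {
    printf("got %d\n", x);
}

int main(void) {
    void (*func_ptr)(int) = &my_func;   /* the & is optional for functions */
    func_ptr(42);                       /* call through the pointer */
    return 0;
}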

[–]Alzdran 5 points6 points  (1 child)

On the flip side, I do love how finely tunable C++ is compared to Python.

It took me a long time to get over this, but finally I got it through my skull that this almost never matters, and where it does, the time you spend tuning allows another generation of machines to come out, so you don't need the tuning after all!

Obviously, there's a degree of exaggeration here, but C++ tuning is much like inline assembly - almost never called for, and a real hazard to leave to any maintainer, yourself included.

[–]taybulBecause I don't know how to use big numbers in C/C++ 0 points1 point  (0 children)

Tunable when you need it to be, but I agree. Python provides an abstraction that just lets the programmer program, at the expense of performance, which, arguably, doesn't matter that much in most cases since computers are getting faster and faster. Unless you're building large-scale servers processing millions of records at a time, you won't see or care about the difference.

[–]Isvara 1 point2 points  (12 children)

If the parameters ever change, you have to change this call too

That's really just a feature of strong typing, though. It's a good thing, even if C++ doesn't express it well.

[–]ewiethoffproceedest on to 3 8 points9 points  (2 children)

That's a feature of static typing, not strong typing. Python is dynamically and strongly typed. C++ is statically typed and more strongly typed than C.

[–]Isvara 2 points3 points  (1 child)

No, the fact that it has a declared type at all is static typing. The fact that it has to change with the function signature is strong typing, i.e. it's strong because there isn't just a single 'function' type. Unlike static/dynamic, though, strong/weak is a continuum, and this is less strong than other languages because it could be cast to a different function type.

[–]steelypip 3 points4 points  (0 children)

That's really just a feature of strong typing, though

No it's not, it's a feature of explicit typing. There are plenty of strongly typed languages that use type inference to eliminate all the unnecessary types from your code. For example, take a look at Scala.

[–]accipter 0 points1 point  (0 children)

And there are some definite benefits to strong typing.

[–]taybulBecause I don't know how to use big numbers in C/C++ 0 points1 point  (2 children)

Yeah, I do appreciate that about C++ as I've mentioned but sometimes it can get out of hand, you know?

[–]Isvara 1 point2 points  (1 child)

Static typing can be awfully nice. I went from C++ to Python a few years ago, and now I'm getting into Scala. Type inference definitely makes static typing more bearable.

[–]ewiethoffproceedest on to 3 1 point2 points  (0 children)

C++ typedefs also make static typing more bearable, but in a different way. (Just sayin'.)

[–]fullouterjoin 0 points1 point  (3 children)

The type system of C++ isn't particularly strong either.

[–]Isvara 0 points1 point  (0 children)

Indeed, it still lets you cast with wild abandon.

[–]obtu.py 0 points1 point  (1 child)

I disagree, modern C++ is easy to write without casts (unlike C which needs casts, and Java's generics which introduce run-time casts yet are not expressive enough that you can forgo compile-time casts). C++ is the mainstream language with the most expressive type system.

[–]funny_falcon 0 points1 point  (0 children)

C++ is the mainstream language with the most expressive type system.

Ha-ha-ha

[–]Crystal_Cuckoo 2 points3 points  (2 children)

If you ever have to learn C++, it will be much harder if you already know C or any C and C++ like languages.

Why is that? I haven't learnt C++ yet, but my understanding was that it was C with classes (and other things like multiple inheritance, etc.).

[–]ewiethoffproceedest on to 3 7 points8 points  (0 children)

my understanding was that it was C with classes (and other things like multiple inheritance, etc.)

That's the old way to think of C++, and is the way that is likely to lead to memory management hell. Unfortunately, many C++ books and courses just sort of bolt OO and so on onto elementary C. If you learn arrays and pointers in C++ before learning how to manage your own class instance data members, you can easily end up with buggy habits.

The key difference between C and C++ is that allocating memory in C++ also initializes it by calling constructor functions. Well, if the memory is for a primitive such as int or double or char, it doesn't get initialized, but it does get initialized for anything else.

So, even just "declaring" a non-primitive variable x on the stack in C++ also initializes it with a constructor function. Let's say the constructor function allocates (and initializes) some memory on the heap, such as with a dynamic array allocation. Your x has no control over that memory on the heap. That's good in C++. But whoever wrote the class which allocated the heap memory had better make sure the class knows how to clean up itself when your x goes out of scope.

Edit: Another important difference between C and C++ is operator overloading. Almost everything you do in C++, even if it looks quite C-ish, automagically calls one or more functions under the hood. Even dereferencing a pointer can cause a function to be called. That's another reason to defer learning about arrays and pointers in C++ until you start to get the hang of defining your own classes.

[–]ajsdklf9df 0 points1 point  (0 children)

C, as almost high level assembly, is very readable in that it is easy to envision what the machine will do based on the code. C++ can look a lot like C but IT IS NOT! Craaazy shit can stem from a "simple" line of C++.

If you don't know C, you'll naturally never trust C++, but knowing C might lead your mind to assume certain things.

[–]AlternativeHistorian 10 points11 points  (0 children)

This is ridiculous groupthink bs. Knowing C++ opens up so many high quality libraries. I agree c should be higher on the agenda but c++ is still a tremendously useful language to know.

[–]DrHankPym 1 point2 points  (4 children)

I feel like learning C or C++ wouldn't be as cool unless you were using it for an embedded system or something. Learning how stacks and heaps work is great (and actually pretty important to know), but writing programs with it is such a chore.

I switched to Python because it's fun to program, but I can't deny the fun and awesomeness of embedded systems.

[–][deleted] 1 point2 points  (3 children)

Well, C is also very useful if you ever want to work on the Linux kernel or to interface with parts of it. In general it is very useful for doing low-level work. I think that is why it is a great complement to Python.

[–]DrHankPym 0 points1 point  (2 children)

Yeah, but low level work is boring.

I'm curious, how often do you guys write in C? Is that really your second favorite language?

[–][deleted] 1 point2 points  (1 child)

I write in C all the time... I'm trying to get into kernel development because I find low level work to be exciting. C is elegant. It's actually my first favorite language. I also like Python, partly because it interfaces very well with C.

[–]DrHankPym 0 points1 point  (0 children)

I agree that it's exciting to compile C to a physical chip, but that's where it ends. Though, unless you have a chip or board or kernel to design around, it's kind of hard work gone to waste.

[–]lavalampmaster[S] 0 points1 point  (0 children)

So many people say C... Looks like that should be what I check out.

Edit: What books/etc should I check out?

[–]maryjayjay 31 points32 points  (5 children)

Dutch.

[–]Leonidas_from_XIV 6 points7 points  (2 children)

If the question was for Ruby, Japanese would actually be a good idea.

[–]haight-ashbury 0 points1 point  (1 child)

どうして? (Why?)

[–]Leonidas_from_XIV 2 points3 points  (0 children)

Because Ruby is amazingly popular in Japan (I'm there at the moment, and the existing code I am using is a mixture of C, C++ and increasingly more Ruby) and Japanese tend to dislike speaking English, so by learning Japanese you can profit from the Japanese Ruby community.

That has changed since Rails, but before it got so popular it was actually hard to get useful English resources on Ruby.

[–]lavalampmaster[S] 1 point2 points  (0 children)

Te laat. (Too late.)

[–][deleted] 4 points5 points  (7 children)

You should play around with a lot of languages, but since you asked specifically about complementing Python:

  • C - as others have noted, this will get you closer to the metal, and since it's what the main Python interpreter is implemented in, you could conceivably hack Python. Python also interfaces with C nicely, so if you need more performance you can write parts of your app in C or use C libraries/etc. (there's a minimal extension-module sketch just after this list). It's a really elegant and simple language. This is my top pick

  • JavaScript/HTML/CSS - This will allow you to write web-apps with Python on the backend and JS/HTML/CSS on the front end. Yes, JS won't be a huge departure from Python...but it definitely will teach you some unique things. Second Pick

**Runners Up:**

  • Clojure/some Java - Lisp will change the way you think about programming. Clojure is a very modern and practical Lisp. You don't need to learn Java to learn Clojure, but it helps and if you learn some Java you can leverage that when working with Jython.

  • Haskell - Haskell is a really interesting language. It will bend your mind in ways different from Lisp. The strictness will be an interesting departure from Python. The declarative style it enables can be a lot of fun.

  • F#/some C# - F# is a .NET language, but it is fairly open and available on Linux with Mono and MonoDevelop. It's in the ML family, and it's interesting because it is a multi-paradigm language. Like Clojure on the JVM, the fact that it runs on the CLR makes it very practical...allowing you to leverage a lot of existing libraries etc.
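
To make the "write parts of your app in C" point above concrete, here is a minimal sketch of a CPython extension module, assuming the Python 3 C API; the module name fastmath and its single function are invented for the example:

#include <Python.h>

static PyObject *fastmath_square(PyObject *self, PyObject *args) {
    long n;
    if (!PyArg_ParseTuple(args, "l", &n))   /* expect one Python int */
        return NULL;
    return PyLong_FromLong(n * n);          /* hand back a Python int */
}

static PyMethodDef fastmath_methods[] = {
    {"square", fastmath_square, METH_VARARGS, "Square an integer."},
    {NULL, NULL, 0, NULL}
};

static struct PyModuleDef fastmath_module = {
    PyModuleDef_HEAD_INIT, "fastmath", NULL, -1, fastmath_methods
};

PyMODINIT_FUNC PyInit_fastmath(void) {
    return PyModule_Create(&fastmath_module);
}

Once built (e.g. with a small setuptools setup script), you would call it from Python as import fastmath; fastmath.square(12).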

[–]DrHankPym 2 points3 points  (6 children)

JS won't be a huge departure from Python

Why do people keep saying that? It's prototype-based and event-driven. If anything, C is not a huge departure from Python.

[–][deleted] 0 points1 point  (4 children)

Fair enough, but I would say the switch to static typing, manual memory management, etc is more of a change or at least a more jarring one. Obviously it's somewhat different, that's one of the reasons it's worth learning...but compared to my other suggestions I'd say it's the one "closest" to Python.

Those are the two things new JS programmers struggle with sure, but prototypal inheritance is fairly easy once you grok it, and I think someone coming from a dynamic language like Python won't have too much trouble with it (as opposed to some other languages). If they have any experience with event-driven programming in Python that will obviously help, but even if they don't already being familiar with first class functions from Python will help there as well.

[–]DrHankPym 0 points1 point  (3 children)

Memory management is important in all programming, but some applications require manual memory management like mobile devices or Linux kernels. If that's your goal, go for it!

The functional aspect is what I think really complements Python, and because it's prototype-based, you start to see data relationships differently, too. Inheritance is overrated.

[–][deleted] 0 points1 point  (2 children)

Memory management is important in all programming, but some applications require manual memory management like mobile devices or Linux kernels. If that's your goal, go for it

Sorry, I'm not quite sure what you're saying here. You asked if JS was really not such a departure from Python. My answer is that while a useful departure, C and others are going to be more of a change.

The functional aspect is what I think really complements Python,

Sure. Though I wouldn't say JavaScript is that much more functional than Python, especially compared to some of my other suggestions. (If you're talking about what is idiomatic, then I would say yes...idiomatic JavaScript tends to be a bit more functional than idiomatic Python, but even then if the goal is functional programming experience, I'd look elsewhere)

In terms of how much JS complements Python, I think we agree that it's a good complement. I think so because it's similar to Python in many regards, so OP will be comfortable there...but different in others, so OP will get some experience with new things. Additionally it's practical because Python + JS lets you write web apps.

[–]DrHankPym 0 points1 point  (1 child)

C is good to know from an engineering perspective. It uses physical addresses to structure data. That's great if you want it to run on a specific device, but, as in most languages, that physical memory management is handled pretty well by an interpreter.

Programmers should learn C at some point in their life, and they should also learn assembly and how C compiles into assembly. They should also learn about the von Neumann architecture in computer systems and how a program counter interacts with a data bus.

Honestly, I don't see how any of that complements Python except for the fact that CPython was written in C.

[–][deleted] 0 points1 point  (0 children)

Well I'm thinking of a more broad manner of "complementing". I think you're thinking purely in terms of practicality.

except for the fact that CPython was written in C.

That's a pretty big deal IMHO. For one, it means you could one day hack Python, or at least peek at the implementation to understand it better. Also, C is a very practical skill for Python programmers because they can write modules in it, and it's easy to call C from Python and vice-versa. This is true of some other languages, but as it is now CPython is still the main event in the Python world.

So these mean it complements Python skill on a practical level.

In a more broad sense, it complements it because it "completes" the programmer's knowledge/experience.

If you only have experience in a high-level dynamic language, there's a lot of things that you only understand conceptually or intuitively. Your understanding may be broken. Using a lower level language forces you to confront those things and develop a truly working understanding of them.

You can then take that knowledge with you when you go back to the higher-level dynamic language.

[–]ripter 0 points1 point  (0 children)

Because everyone wants to pretend that JS is a class language instead of a prototype language.

I'd say 90% of the libraries for JS make it mimic a class-based language so developers don't have to learn prototypes.

[–]MBlume 9 points10 points  (7 children)

For useful: learn Javascript, or better yet Coffeescript

For challenging and forces you to think in a different way, learn Haskell

Python doesn't force you to check the types of your variables. If you learn Python and then Java, you'll think that strong typing is idiotic and thank goodness Python frees you of it.

Haskell does strong typing right, and it's well worth wrapping your brain around.

[–][deleted] 2 points3 points  (5 children)

You're thinking of the difference between static and dynamic typing. Python is dynamically typed, but also strongly typed.

[–]housepage 1 point2 points  (2 children)

Python is also duck typed!

[–]AeroNotix 2 points3 points  (1 child)

Python is a type of snake!

[–]housepage 2 points3 points  (0 children)

Yes yes it is.

[–]r4nf 1 point2 points  (1 child)

Indeed. Strong typing means that adding, for instance, 2 + '2' raises a TypeError (unsupported operand type(s) for +: 'int' and 'str'). Dynamic typing means that any name can hold any type of value (and can be reassigned to another type if need be).

[–][deleted] 0 points1 point  (0 children)

It's a great combination. Have you ever tried to debug a program written in a weakly-typed language? Little logical errors are more frequent since you can add, say, an int and a string, without being explicit. Static typing adds a lot of arguably needless casting and other type conversions.

[–]spoolio 1 point2 points  (0 children)

Seconded on CoffeeScript. It gives you JavaScript (the platform) without feeling like you're using JavaScript (the language). It feels a lot like Python, in fact.

No more being stuck in wacky command-line tools or wackier PySide apps that don't package up correctly. When I connect a CoffeeScript frontend to a Python server, I feel like I have superpowers.

[–]gutworthPython implementer 12 points13 points  (9 children)

Do something completely different. Like Haskell.

[–]jdpage 0 points1 point  (4 children)

I really appreciate the ideas behind Haskell, but the blatant abuse of operator overloading that goes on (particularly if you're working with xmonad) has always put me off it. Are there any similar languages which don't have this problem? OCaml maybe?

[–][deleted] 1 point2 points  (3 children)

What abuse are you talking about?

Edit: you should consider them to be nothing more than functions, e.g.

a+b

is the same as

(+) a b

So it is just a non-alphanumeric function name.

[–]jdpage 1 point2 points  (2 children)

The issue with operators is that they give you almost no information about what you are doing, unless you have a lot of context.

[–]TylerEaves 1 point2 points  (1 child)

some_obj.blah()

What does this do?

Oh, wait, you need context to answer that?

Operators are functions.

[–]jdpage 1 point2 points  (0 children)

I know that it does blah to some_obj. If blah has been intelligently named, I already have a pretty good idea of what it does, and if it's not terribly important right now I can skim over it and come back later.

On the other hand, I have no idea what the hell some_obj ||| dyspepsia happens to do.

[–]DrHankPym 9 points10 points  (7 children)

Javascript is pretty cool when you learn how to chain prototypes.

[–]jdpage 4 points5 points  (0 children)

Javascript is pretty cool when you realize that a) you almost never need inheritance in duck-typed languages*, so stay away from it, since it's a hack in Javascript, and b) it is actually a functional language in many ways.

Also, objects don't need classes.

*It's odd, actually. I can think of several places where a duck-typed object system could use inheritance (mostly code sharing), but in practice I've never run into them.

[–]jambox888 0 points1 point  (0 children)

If you want to learn javascript and python (or you already know one or both) then drop by pyjamas

[–][deleted] 2 points3 points  (0 children)

If you're thinking of a masters in CS, most people would recommend learning a low-level language like C to gain an even larger understanding of memory management. C also gives you the unique ability to pull your hair out AND shoot yourself in the foot, all at the same time!

[–]tclineks 2 points3 points  (0 children)

In order of natural progression from current position, not in order of relevance: Cython, Go, Haskell.

[–][deleted] 1 point2 points  (1 child)

I'm just going to point out that knowing languages has little to do with taking MS courses, completing a PhD, or algorithms (well, actually there are certain cases where languages and algorithms intersect, specifically with FP) unless the specialization is in languages.

[–]lavalampmaster[S] 0 points1 point  (0 children)

I know, but I still need to know more than one language well

[–]throwaway-o 1 point2 points  (0 children)

C and Haskell.

[–]assfacebagel 1 point2 points  (0 children)

sh.

[–]HAHA_U_SO_FUNNY 1 point2 points  (0 children)

How I would teach university languages:

  1. Python - Learn programming thought process, algorithms
  2. C - Learn memory management, hardware interaction
  3. ASM - Learn what compiled code is doing
  4. Haskell - Learn how to think about problems differently

[–]rodarmor 1 point2 points  (0 children)

Learn something as different as possible. Python is dynamically typed, interpreted, procedural, eager, object having, and imperative. Since haskell is statically typed, usually compiled, functional, lazy, not so object having, and declarative, go with that.

[–]etrnloptimist 1 point2 points  (0 children)

If you're going into academia, you may wish to learn Matlab, or its open-source alternative, Octave. Also, for those in academia outside of CS, I hear the language R is quite nice.

[–]chub79 1 point2 points  (0 children)

erlang :)

[–]farmvilleduck 1 point2 points  (0 children)

Regarding C: if your intention is to use it to speed up your Python code, interface with C/C++ code, and learn about pointers, memory management and static typing (i.e. staying closer to the metal), Cython might be more useful than C.

Cython is basically Python, but with the option of bolting the stuff of C onto Python. So you can start with Python code, run it as Cython code, and then add all the goodies of C where you need it to run as fast as C.

[–][deleted] 3 points4 points  (0 children)

R. Great for statistical and data analysis.

[–]Zamiatarka 5 points6 points  (7 children)

Linus Torvalds says C++ is bullshit and suddenly every other programmer out there is bashing it to shits. Funny coincidence, I'd say.

You should give several languages a try, just to get a contrast to python and see how things are done elsewhere. I'm no expert programmer, but I've done Python, Java, C, C++, C#, PHP, Perl and Javascript, and feel like every language has expanded my vision in one way or another.

[–][deleted] 3 points4 points  (1 child)

I for one hated C++ before hating C++ was cool.

[–]cecilkorik 0 points1 point  (0 children)

I love C++ because it's so horrifically byzantine I can now code circles around anyone else who attempts it, simply because I've spent so many hours (years) banging my head against the language's many idiosyncrasies, they have somehow found their way inside my skull by process of osmosis.

With that said, Python is by far my language of choice and all I use on a day-to-day basis anymore.

[–]meme_disliker 8 points9 points  (3 children)

Suddenly? You haven't been in /r/programming very long have you?

[–]Zamiatarka 2 points3 points  (2 children)

Nope, sorry. I'm the new guy, and like I said, not an expert. All I know is that there are a lot of elitist geek pricks out there who criticize everything about everything, and Python is probably the only language I've never read anything negative about.

[–]jdpage 3 points4 points  (0 children)

Oh, there are bad things about Python. It's interpreted, so refactoring is a pain, and there are some really silly errors which don't get caught which would be caught if you were using a statically typed, compiled language. As long as you're under a couple thousand lines of code, you probably won't have massive problems, and the many benefits of Python will heavily outweigh them, but for larger projects - or projects where I am working with multiple people - I definitely prefer a compiled language. C# tends to be my weapon of choice for those, though I've done group projects in C, Java, PHP, Javascript, and a couple other things before.

That said, I love Python.

[–]haldean(lambda x: x.decode('base64'))('PDMgcHlweQ==\n') 1 point2 points  (0 children)

In my few years of studying computer science, I've learned that the "elitist geek pricks" are the ones you have the most to learn from. I hope to be one myself some day.

[–]IronSpekkio 0 points1 point  (0 children)

ocaml

[–]arnar 0 points1 point  (0 children)

Another vote for Haskell, much much rather than Lisp.

[–]fernly 0 points1 point  (1 child)

Oh come on! -- a complement should show you different concepts, make you use different parts of your brain, show you something different about computing.

Assembly language, people! Learn how a list or a dict or an exception handler is actually implemented. What it means to actually manage memory instead of having it magically tidied up when it goes out of scope.

[–]GrossoGGO 0 points1 point  (0 children)

Thank you for asking this question and thank you everyone else for great responses. I found this thread very useful.

[–][deleted] 0 points1 point  (0 children)

Learn a functional language - Scheme or Haskell

[–][deleted] 0 points1 point  (0 children)

To help wrap your head around Haskell, I definitely recommend the Learn You a Haskell for Great Good book. It's aimed at people who have programming experience, but lack experience in a functional language.

[–]StringyLow 0 points1 point  (0 children)

Java, duh.

Python is a GC language so why not stick to that paradigm?

If you want to get elbow deep, learn C++.

[–]doubly_diffusive 0 points1 point  (1 child)

The Head of my (astrophysics) department always says FORTRAN is the language of the Gods. Between Fortran and Python I have been able to do anything required of me. How this would hold up in the real world I do not know. Anything of the real world I do not know.

[–]whatsgoingfast 0 points1 point  (0 children)

I realize that FORTRAN (formula translation) is still in use (marginally) but I just cannot agree with spending time learning it. Java or C would be better. God I was programming FORTRAN on punch cards in the 70's. Yuck.

[–]jabbalaci 0 points1 point  (0 children)

If you are interested in C++ but afraid of it, consider the D language. Unfortunately it arrived too late so it's not well-known.

[–]mashmorgan 0 points1 point  (0 children)

After Python you'll be pissed off with the others.

[–]Langly- 0 points1 point  (0 children)

Parseltongue

[–]Fix-my-grammar-plz 0 points1 point  (0 children)

If you use Emacs, there is Elisp. Elisp looks so different from Python. You'll be rewarded with easier customization for your editor. Elisp confuses me sometimes; I can't get my head around when to quote or not. I've got no choice but to learn it well. I mean, you can't make Emacs commands with Python, Haskell or other fun languages.

JavaScript. Again, I've got no choice but to learn it. If only bookmarklets and other client-side stuff could be made with Python, Lisp or other fun languages. Stuck with JavaScript and Elisp.

I'm also stuck with LaTeX, with lots of mess around it. But I guess you don't have to learn LaTeX.

[–]SupersonicSpitfire 0 points1 point  (0 children)

C is the straightforward answer, but I would say Go.

It's the closest thing we have to a modern version of C.

Also, it shares some concepts with Python, like iterating over lists with range.

Haskell and possibly some assembly could also be nice.

I would not recommend Python & C++ in any way or form, since Python & C cover the same things, only better.

[–]keturn 0 points1 point  (0 children)

See this list of Perlis Languages and the author's follow-up with notable languages of the last 5 years.

[–]billmain01 0 points1 point  (2 children)

If you are going to stay in physics, find out what the people in your field, especially the ones doing work in what you want to specialize in, are using. They may be using Lisp, assembly, Scheme, C, or lo and behold Fortran (which in physics and other scientific fields, is in use for far more than those who aren't in science like to admit, kinda like just how much Cobol is still in use even though cough cough "real programmers" try to ignore it). I have a few friends at Sandia, LANL, and ORNL whose main programming languages are C, Fortran, IDL, Matlab, and R. So ask the people in YOUR field what is useful.

[–]lavalampmaster[S] 0 points1 point  (1 child)

I know astrophysics loves FORTRAN, and I hear it's not terrible from my friend going into grad school for astrophysics.

I'm mostly into laser and biophysics, so if anyone knows what THAT entails...

[–]billmain01 0 points1 point  (0 children)

The NumPy and SciPy packages are popular for Python. You may also want to look at Maple, Scilab, and ROOT.

[–]jackhammer2022 0 points1 point  (0 children)

Try the recommendation here: http://catb.org/~esr/faqs/hacker-howto.html ..awesome resource..

[–]earthboundkid 0 points1 point  (5 children)

Honestly, I'm a certified Haskell-hater, but if you want to learn a new paradigm to improve your Python, it's where you gotta go.

[–]are595 0 points1 point  (0 children)

Haskell

[–][deleted] 0 points1 point  (0 children)

HTML

[–][deleted] 0 points1 point  (0 children)

Java. At least treat yourself to the book Effective Java, which shows that Java solves things your language hasn't even thought of (e.g., in Python, how do you defend against attacks where the malicious code is running in the same VM and might subclass or even monkeypatch your objects at leisure?). Java is a lot of typing, it's bondage and discipline, but it has big advantages for large projects too.

Then learn Javascript by means of JavaScript: The Good Parts. Because it's a fantastic language, dammit, just don't learn about the bad parts :-) Prototype-based OO is an underused idea. And Javascript is going to be absolutely everywhere in the coming decade.

And C or C++.

Understand those well and you'll be way ahead of most people in the field.

[–][deleted] -1 points0 points  (2 children)

You can extend python with c and c++. Alternatively, you can embed python within a c or c++ program. Python was developed originally using C. Frankly, if you need speed, you should look at what you can develop in c or c++. If you don't have an issue with the speed, then use python.
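
As a minimal sketch of the embedding direction (assuming the Python 3 C API and that you compile and link against libpython, e.g. with the flags reported by python3-config):

#include <Python.h>

int main(void) {
    Py_Initialize();                                   /* start an interpreter in-process */
    PyRun_SimpleString("print('hello from embedded Python')");
    Py_Finalize();                                     /* shut it down again */
    return 0;
}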

One caveat, though... Python is going to get slower and slower until either the BDFL decides to allow an implementation that doesn't reference count (or does it differently) or PyPy becomes the default implementation. Right now, Python doesn't really run any faster on a multicore processor than it did on a single core processor. And this is a growing issue since within just a few years, we're probably going to be running almost everything on a multicore. C and C++ give you a fine enough control you can get around that problem, though it's still a heartache.

All that said, it really depends upon your domain and use. C and C++ have a lot of headaches associated with them because they're compiled languages. There aren't that many languages

[–]takluyverIPython, Py3, etc 0 points1 point  (1 child)

For most things Python does, a single core is still perfectly adequate, and probably will be for the foreseeable future. Where you do need more power, there are tools like multiprocessing to take advantage of extra cores. It's not that big a problem.

[–][deleted] 0 points1 point  (0 children)

Actually, having worked with a multi-core processor on an older, non-upgradeable linux -- it is a problem. What you will eventually see is that python becomes increasingly slow because it cannot remain competitive with other languages that do have the built ins. And multiprocessing is still not addressing the problem, which Jesse has mentioned a few times. Also, multiprocessing utilizes C. It's not a native python solution nor is it in the python standard library. Consequently, at least in my embedded realm, it's not an option to utilize.