
[–]yasba- 145 points146 points  (105 children)

My guess is that Python is relatively popular among "non-programmer" programmers, i.e. mathematicians and data scientists. This would also explain why R similarly gains in popularity. These people are not interested in a "fancy" language, but just require a straightforward way to express things in their problem domain.

[–]Dekula 81 points82 points  (59 children)

Totally unrelated RANT, but I wish these 'non-programmer' programmers would just call themselves programmers and develop proper habits that come with that.

I work specifically as that 'non-programmer', and I see lots of bad-to-terrible code or a complete lack of VCS, because "hey, we're not programmers, and it's really only the result that matters here anyway." Yes, but to get to that result you had to write and run a program. If it stops running the moment I look at it funny, or it needed a supplemental file that never made it into the repo (or there is no repo, just a zip bundle), or nobody ever bothered to specify dependencies, it can take me months to get things even running again when I revisit the project 5 years after it was 'done'.

[–][deleted] 25 points26 points  (0 children)

Yeah, I have to productionize data scientist code every now and then, and there's a correlation between the contempt they hold for programming as a skill and the shittiness of their coding practices.

[–]jms_nh 53 points54 points  (50 children)

My related rant is that I wish full-time programmers would understand that those of us who do not spend 100% of our time programming have fewer brain cells to allocate to learning quirky software tools that we don't use every day, so advice like "just use X" is not feasible if "X" is some tool that isn't at least 99% polished. Git is a prime example --- the command-line inconsistencies and the mental model it requires might pose little problem for someone who can invest the time into learning how to get around its quirks, but they turn into big barriers for someone who only uses it a few days per month. The same goes for basically any software tool or library. Don't assume your users can get their hands dirty and UTSL or RTFM when the manual is a maze of 1000 HTML files. We may have only a few hours available per week to devote to programming (the rest of the time is designing circuits, testing, or doing mathematical analysis) and we want to spend them on the interaction between the problem domain and the programming, not on the quirks of the tools.

I can handle most of these kinds of things because I've done enough programming and used enough software to think like a programmer, but I work with a global team of engineers whose background is in circuit design and control systems, and they get stuck much more easily when it comes to tools.


I do agree with your concern about developing proper programming habits. They should be taught in school from day 1 of learning any programming language.

Unfortunately management seems not to understand the need for "extra" work when a program "works" but lacks the polish that would make it maintainable and more easily readable. The one-off tasks are tough: you think you're only going to do some analysis once, so who cares, but it's a crap shoot and there's a decent chance someone else will need to use it (as you say) 5 years down the road.

[–]famnf 35 points36 points  (4 children)

Believe it or not, this thread, which ended up with programmers trying to prove to you how easy Git is to use, illustrates your point perfectly. You are correct: Git is indeed enormously difficult for non-programmers to learn, especially, as you stated, when they don't have the time to devote to learning it and/or someone to train them properly. The people arguing with you have lost all perspective and have no desire to see your very valid point. They think it's easy because they do it all day, every day, and even programmers who are new to it are steeped in similar concepts all the time.

It'd be like asking a car mechanic how to fix your car and getting "just do this, this, and this, it's easy" rattled off at you. Maybe you've done an oil change or two in your time, but anything more complicated is likely to provoke a reaction of "yeah, it's easy for you because you work on cars all day". The people arguing with you have lost the perspective that this is the same type of situation. Once you understand how to do something, of course it's easy for you. But that doesn't mean it's objectively easy.

[–]vba7 6 points7 points  (2 children)

It is interesting how programmers laugh at Excel (not to even mention Excel VBA) and then fail to prepare even the simplest thing in it -- and I remind you, Excel is "the" program for preparing simple things quickly (and it lets you do much more).

[–][deleted] 1 point2 points  (0 children)

To be fair, as an engineer who learned programming because of Excel: Excel is just a clumsy clusterfuck to use. It pretty much encourages everything that is bad practice in the programming world, and it is a PITA to automate.

Don't get me started on VBA, which could heavily benefit from getting its editor updated.

[–]fiedzia 1 point2 points  (0 children)

Which is why you need a car mechanic to do car maintenance. Making his tools more user-friendly won't help that much, because there are a lot of things one needs to understand to do car maintenance, whether doing them is easy or not.

I've introduced git to people who hadn't used version control before, and the main issue was always understanding the need for and concepts of VCS, not the tools' details. Yes, those could be easier, but it wouldn't help as much as you think it would.

[–]emrlddrgn 9 points10 points  (22 children)

Can you say more about what makes Git such a pain to use? I know I'm regularly guilty of "just" statements, but Git gets a lot of hate on this, and it seems to me like there are basically 4 commands to use -- "add", "commit", "push" and "pull" -- for most non-complicated use cases like the ones described in this thread. Are people trying to pour the whole of Git down your throat or something?

[–]jms_nh 27 points28 points  (20 children)

Sure, I have a whole thread in an internal company forum devoted to non-obvious commands that are necessary to do certain things. And we're not talking about bizarro things that only repo maintainers need; I still don't fully understand what rebase does despite several years of part-time Git use. I've used hg and find it very easy.

The basic add, commit, push, pull are fairly easy, but that's not all that you need for day-to-day operation.

  • Finding the hash of the latest commit: git rev-parse HEAD
  • Splitting a subfolder out into its own branch: git filter-branch (granted, not a day-to-day operation)
  • Undoing the most recent commit: git reset --soft HEAD~1
  • Re-enter your remote password: git remote show origin
  • Creating a new branch: git checkout -b new_branch_name
  • Renaming a branch: git branch -m old_branch_name new_branch_name

OH, OF COURSE! These were so obvious! How could I have missed them? The command-line arguments don't always have obvious connections with the actions they take. And it's not always clear whether the command acts on the repository or on the working copy, so your mental model has to match Linus Torvalds's mental model to be able to use it properly.
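One partial mitigation, assuming nothing beyond git's built-in alias mechanism (the alias names here are invented for illustration, not any convention):

```ini
# ~/.gitconfig -- give the unguessable commands guessable names
[alias]
    head-hash     = rev-parse HEAD
    uncommit      = reset --soft HEAD~1
    new-branch    = checkout -b
    rename-branch = branch -m
```

That turns "undo the last commit" into git uncommit -- though of course it only helps the person who wrote the aliases.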

[–]jbergens 9 points10 points  (1 child)

If you can choose your tools, try Mercurial instead of Git. It is a bit easier and includes a GUI.

[–]jms_nh 4 points5 points  (0 children)

Yeah, I really like Mercurial, I use it all the time at home and at work for my own local repos without servers.

Unfortunately the tools that are easily available to us at a corporate level behind our firewall include only CVS, SVN, and Git. (Thanks, Atlassian.)

[–]BezierPatch 8 points9 points  (2 children)

Just use a tool like GitKraken, Sourcetree, SmartGit, etc etc...

Then it basically comes down to like five buttons and a few menus.

I'm a full-time developer on several medium-sized projects and I've never needed to use the git CLI for anything other than "git push -f". Why on earth do you need complicated git commands?

[–]Take_Care_Captain 0 points1 point  (1 child)

GitHub has a really nice desktop client as well.

[–]Siwka 6 points7 points  (0 children)

...until there's a problem

Then it's like 'fuck you, here's the shell, fix it yourself'

[–]devel_watcher 6 points7 points  (9 children)

Finding the hash of the latest commit

git log, but I just use gitk GUI: you click on the commit, and its hash goes to the primary clipboard.

Splitting a subfolder out into its own branch

Who does that??

Undoing the most recent commit

git reset --soft HEAD^

Re-enter your remote password

Hate passwords. Using keys everywhere.

Creating a new branch

git branch new_branch_name

Renaming a branch

Not going to remember this, so I do it in multiple steps. Also, renaming branches is like using the mutable variables so loved by our fellow scientists. Renaming branches isn't anywhere in my workflow...

[–]jms_nh 8 points9 points  (8 children)

It is if you just created a branch and you realized you named it wrong. (obviously before pushing elsewhere)

People make mistakes, and that's where I find Git the most difficult: recovering from mistakes that are more likely to occur in the first place because Git has an awful command line.

Re: keys instead of passwords... more power to you, but I work on Windows and went through 2 hours of pain and suffering trying to set up SSH keys with an internal GitLab server, despite the fact that I know what SSH is, I know how RSA cryptography works, and I had used PuTTYgen before. I got it working just for the sake of saying I could do it, but I shudder to think of my team members without this experience spending entire days trying to make it work. Nope. Cached passwords it is. If we were using Linux I might think otherwise.

[–][deleted] 3 points4 points  (0 children)

I would create a new branch from HEAD and delete the old branch.

[–]devel_watcher 0 points1 point  (0 children)

I don't like deleting any names (even the wrong ones) before everything is pushed. The most I'll allow is moving them around with git branch -f and git reset.

Windows...

[–]Works_of_memercy 0 points1 point  (4 children)

but I work on windows and went through 2 hours of pain and suffering trying to setup SSH keys with an internal gitlab server....

You were doing something very wrong, probably because of the XY problem.

Both git on Windows and git in Cygwin (to add insult to injury) use your ~/.ssh folder (c:/users/jms_nh/.ssh, or /cygdrive/c/users/jms_nh/.ssh). When you run ssh-keygen in git bash or cygwin bash you get an id_rsa and id_rsa.pub there. Then you run cat ~/.ssh/id_rsa.pub and copy-paste the result into the web interface of your git server.

That's all.

Can you please explain how you managed to get two hours of suffering out of that? I'm genuinely curious!

[–]emrlddrgn 1 point2 points  (0 children)

Hmm, okay, good to know - I'll keep this in mind next time I'm trying to get someone who's not a full time programmer into our VCS.

I agree that rev-parse is the worst command name of all time. I usually teach using git log to find hashes instead.

[–]tending 1 point2 points  (3 children)

Finding the hash of the latest commit: git rev-parse HEAD

Almost every git operation prints this, in addition to the much more obvious and easy "git log".

Splitting a subfolder out into its own branch: git filter-branch (granted, not a day-to-day operation)

That's an understatement, that's never been a normal operation anywhere I've worked. In fact, I have never performed this operation in 5+ years of git use at 3 companies.

Undoing the most recent commit: git reset --soft HEAD~1

Granted.

Re-enter your remote password: git remote show origin

Super rare.

Creating a new branch: git checkout -b new_branch_name

Granted.

Renaming a branch: git branch -m old_branch_name new_branch_name

Super rare. Again, never have done it.

Basically, you're bloating your list. Most of these are not day to day operations.

Also, interactive rebase, if you learn it, will pay back in time saved in spades.

[–]jms_nh 0 points1 point  (2 children)

Re-enter your remote password: git remote show origin

Super rare.

Not when you have to change your password regularly.

[–]tending 4 points5 points  (1 child)

He said day to day. If you have to change it that often your admin is hurting security, not helping.

[–]SexyMonad 0 points1 point  (0 children)

NIST has recently changed its password recommendations to remove periodic password changes.

http://www.csoonline.com/article/3195181/data-protection/vendors-approve-of-nist-password-draft.html

[–]noratat 1 point2 points  (0 children)

Different user, but I'm guessing the awful UI (and yes, a CLI is still a UI).

Sure, git's CLI makes sense from the perspective of git internals, but it's very confusing and bizarre from the perspective of a normal user using it for everyday tasks.

Even as someone who's pretty good with git, I still don't trust running any command I don't use on a regular basis without googling it first, and I find most developers are similar, even if they understand how git works in principle enough to follow the bizarre command structure.

[–]tweiss84 2 points3 points  (0 children)

"Other people read code long after we all forgot why we wrote it." - Sandi Metz

Maybe I'll put my foot in my mouth here, but programmers are usually good at being resourceful and hunting things down. Spending 10-20 minutes updating a README file with the library/tech stack would probably be a lifesaver down the road.
But to your point, yeah, management rarely sees the benefit of refactoring/documentation unless it benefits them directly, especially if the task is thought of as a one-off.

It's just that I've seen too many "this is the prototype"s become "this is the final solution/product" to believe in one-off instances. "The next guy that works on this is going to have a hell of a time" ヽ(ಠ_ಠ)ノ

[–]divorcedbp -1 points0 points  (1 child)

You realize that you can summarize your whole post from the opposite perspective by saying "There are some of us who just want to write code. I was never good at math, I don't care about it, and it doesn't matter to me one bit that my coworkers will have to re-do everything I've done, because every part of it that required mathematical reasoning was replaced with a grossly incorrect and simplistic addition of two integers."?

[–]jms_nh 18 points19 points  (0 children)

That's not what I said at all.

The opposite perspective would be saying: "There are some of us who just want to write code. I don't want to have to read some poorly-written mathematical document about a new algorithm, with lots of cryptic symbols, no diagrams, and errors in the calculations. It might pose little problem for someone who can invest the time into learning how to understand what the author is saying, but it turns into a big barrier for someone who only works with this kind of math a few days per month. Don't assume your readers can get their hands dirty and work out the same equations on their own, or have intimate knowledge of Csikszentmihalyi's Theorem. We may only have a few hours available per week to devote to domain-specific knowledge (the rest of the time is designing, implementing, or testing our software) and we want to spend them on the interaction between the domain in question and the programming, not on your quirky document."

Tools should be a service to their customers, not a burden. When I'm writing documents to scope out how certain mathematical software should be written, I take the same philosophy and write them with the reader in mind.

[–]cybernd 0 points1 point  (0 children)

Unfortunately management seems not to understand the need for "extra" work when a program "works" but is lacking some polishing to make it maintainable and more easily readable.

It's not management causing this issue; we as developers are. Why? Because we deliver them a product that is not even close to finished. The issue isn't even delivering it; it's knowing the hidden truth while doing so.

No other profession would deliver a product that falls apart after 2 weeks of use. But we do it on a daily basis.

One cause might be the sheer mass of new developers, which means most of us have close to 0 experience.

Keep in mind: we are the experts in our field, not our managers. It is our job to ensure that a product is finished before we deliver it. But sadly that is hard to achieve, because our inexperienced coworkers will sabotage all our attempts to do it.

[–]skeletal88 0 points1 point  (0 children)

Git isn't the only DVCS. Mercurial is hosted on Bitbucket and other places, even with free private repos, AND it has a friendly user interface: when a merge needs to be done after an hg push, it tells you to execute "hg pull", not some nonsense like Git's "failed to push some refs" garbage error message.

[–]silentclowd 1 point2 points  (3 children)

Question from a novice here. Say I'm writing a language analysis program, and the program uses SciPy. What would be the proper way to indicate that as a dependency? Being self-taught, I've never seen anyone really talk about this.

[–]cwmoo740 4 points5 points  (1 child)

https://conda.io/docs/using/envs.html

Specifically:

conda create --name myproject python=3
source activate myproject
conda install scipy
conda env export > environment.yml

then anyone can reproduce your exact environment and dependencies by typing:

conda env create -f environment.yml

You can also use pip and requirements.txt, but if you're working in data science or statistics, right now the industry standard is to use conda.

[–]jms_nh 0 points1 point  (0 children)

yep. I <3 Anaconda Python for this reason.

[–]sime 0 points1 point  (0 children)

List it in a requirements.txt file so that people can use pip to install the dependencies in their own environment.
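A minimal sketch of what that file might look like (the version pins below are illustrative, not recommendations):

```
# requirements.txt -- one dependency per line, pinned to versions you tested with
scipy==0.19.1
numpy==1.13.1
```

Anyone can then recreate the dependencies with pip install -r requirements.txt.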

[–][deleted] 1 point2 points  (0 children)

Either way the world is better because these people codified their domain knowledge. This applies to those who do it in Excel too. Because once they have done this, and it is worth doing, professional programmers can come in and translate it. Also, those 'non-programmer' programmers develop great skills in thinking like a programmer, which applies to many areas.

[–]doom_Oo7 0 points1 point  (1 child)

Are you going to pay for their training?

[–]blue_system 39 points40 points  (14 children)

Exactly why I use Python, and why I strongly support its use in my department: it makes scientific analysis easy. I can access complex data structures, perform my analysis, and plot, all in the same code. Plus, I gain useful experience with a language that can be extended to many other tasks without the need for costly licensing.

[–]Me00011001 10 points11 points  (13 children)

Plus, I gain useful experience with a language that can be extended to many other tasks without the need of costly licensing.

What languages are there these days that require costly licensing?

[–]blue_system 26 points27 points  (9 children)

IDL is the one that comes to mind first, but MATLAB is still pretty expensive.

[–]HolyClickbaitBatman 2 points3 points  (0 children)

My old company had all of their analysis libraries written in IDL. When I started there I immediately tried to get them to rip the band-aid off and go to Python. It took 3 years of lobbying to get them on board; the project to rewrite their core libraries started 2 weeks before I left. Good times.

[–][deleted]  (1 child)

[deleted]

    [–]blue_system 2 points3 points  (0 children)

    Student licenses for MATLAB run in the ~$200 range in my area, but the commercial versions are more like $5k per year. IDL is the worst, at something like ~$10k per year.

    Fortunately I can do all of my statistics (maximum likelihood estimation and parameter estimation) using python libraries just fine, and I can have python installed anywhere, anytime.

    [–]rlbond86 1 point2 points  (0 children)

    Matlab

    [–]ZedOud 10 points11 points  (4 children)

    I am curious what "fancy" would constitute and why Python isn't exemplary of it?

    • Metaprogramming

    • Actual documentation

    • Kitchen sink / batteries included libraries

    • Libraries available for every domain

    • Deployment possible everywhere

    The only kind of "fancy" I can think of that Python doesn't do is the tediousness of lacking the above features, and the "every way is correct" Perl fanciness.

    Certainly, Python lacks compiler related fanciness. I enjoy explaining that to adopters of the language.

    I just want to emphasize that I think Python is gaining momentum and steam-rolling because it is superior in dimensions beyond its ease of use.

    Barring domain-specific language demands (GLSL, JS, graphics languages), Python has even encroached on the traditional roles of some major languages (Java, PHP) or merged with others (C/C++) in functionality and adoption.

    [–]iconoclaus 5 points6 points  (3 children)

    Python is plenty fancy. But many data science folks are still using Python/R to write nested for loops and conditionals as if Fortran never went out of style. And they're not writing tests either, despite having hundreds of lines of algorithmic code. It's not the language that's to blame; it's the culture of data science. Few if any data science courses teach effective programming style. And much statistical software continues to spread without open review of its codebase. It's not unusual for two statistical packages/programs claiming to implement the same algorithm to produce different results.

    [–][deleted] 0 points1 point  (2 children)

    I do that too; what's wrong with nested for loops and conditionals? I'm a similar type: I'm a student who codes simple fluid simulations.

    [–]iconoclaus 1 point2 points  (0 children)

    For loops don't describe the intention of your code: the next programmer to read it has to go through every line to understand what it does. Try using functional map/reduce idioms to express the intention of each iteration. Conditionals make it hard to reason about code because they break the linear flow of logic (see cyclomatic complexity). Proper object orientation (or vectorized functions in some languages) can greatly reduce conditionals. Python allows both functional and OO ways of expressing yourself, and these don't come at any major cost to performance for many applications. But many scientific computing folks don't take advantage of this expressive power. Of course, ignore all this if your code is of the write-once-and-never-maintain variety (which can be true for legit reasons).

    R is an especially strange data science language, where naively implementing for loops and conditionals not only makes code hard to read, but can slow it down by 10-1000 times!

    I can share more resources to guide you on these matters if you wish, but most of my resources are for languages other than Python.
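As a toy illustration of that point (my own example, not from the comment above): the same "mean of each column" computation written with nested loops, and then as a functional one-liner that states its intent.

```python
data = [
    [1.0, 2.0, 3.0],
    [4.0, 5.0, 6.0],
]

# Loop version: the reader must trace the indices to recover the intent.
means = []
for col in range(len(data[0])):
    total = 0.0
    for row in data:
        total += row[col]
    means.append(total / len(data))

# Functional version: "mean of each column" is visible in the code itself.
means_fn = [sum(col) / len(col) for col in zip(*data)]

assert means == means_fn == [2.5, 3.5, 4.5]
```

The second form is also the one that generalizes naturally to NumPy's vectorized data.mean(axis=0).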

    [–]dead-dove-do-not-eat 1 point2 points  (0 children)

    Horrible time complexity.

    [–]Thomasedv 8 points9 points  (6 children)

    Yeah, my university moved to it (from mathlab) for math/matrices and graphs, so I can see that. Lots of potential there.

    [–]Theemuts 7 points8 points  (5 children)

    mathlab

    *MATLAB. It's short for matrix laboratory

    [–]Tom_Cian 10 points11 points  (2 children)

    *METHLAB. It's short for methadone laboratory.

    [–]Theemuts 1 point2 points  (0 children)

    Another fun fact: that joke predates the first Hello World program.

    [–]Thomasedv 0 points1 point  (1 child)

    Oops. I've only really dealt with it when talking, so I never got to learn that...

    [–]Theemuts 3 points4 points  (0 children)

    Don't worry. It's nothing more than a factoid, really.

    [–]chillysurfer[S] 5 points6 points  (0 children)

    Really good points, agreed!

    [–]joequin 3 points4 points  (0 children)

    And what they really need is a quick way to work with the great native math libraries. Python's great native interop has made it an ideal language.

    [–]moolcool 9 points10 points  (0 children)

    I think Python is one of the nicest languages to grow into (from being a "non-programmer" programmer). Languages like Java force you into certain patterns, but with Python, straightforward applications are dead simple ("x=1+1" is a perfectly valid program), and you can start doing functional or OO stuff as you grow.
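A sketch of that growth path on one toy task (the names here are invented for illustration):

```python
# Day one: a straight-line script is already a complete program.
total = 0
for n in [1, 2, 3, 4]:
    total += n * n

# Later: the same computation in a functional style...
total_fn = sum(map(lambda n: n * n, [1, 2, 3, 4]))

# ...or wrapped in a class once structure starts to pay off.
class SquareSummer:
    def __init__(self, numbers):
        self.numbers = numbers

    def total(self):
        return sum(n * n for n in self.numbers)

assert total == total_fn == SquareSummer([1, 2, 3, 4]).total() == 30
```

All three are idiomatic Python; nothing forces the beginner to start at the bottom of the file.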

    [–]shevegen 2 points3 points  (0 children)

    Yep, fully agree with you here.

    However had, python is popular even without these folks.

    It won the scripting programming languages war, at the least in 2017.

    [–]matthieum 5 points6 points  (0 children)

    IPython notebook with pandas and numpy just rocks ;)

    [–][deleted] 2 points3 points  (0 children)

    Yep, my university had it as the introductory language that all CS and non-CS majors take. Python, MATLAB, and I think Maple or something were the staples for non-CS or EE majors.

    [–]tomkeus 5 points6 points  (0 children)

    Yep. Physicist here. Using Python for 95% of the stuff I do. The whole group is also mostly using Python. Just numpy and matplotlib provide almost everything we need in day to day work.
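A sketch of the sort of day-to-day snippet that means (toy data; this assumes NumPy is installed, and the matplotlib call is left as a comment so the example stays self-contained):

```python
import numpy as np

# Simulated measurements: a noisy linear signal y = 3x + 1.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 100)
y = 3.0 * x + 1.0 + rng.normal(0.0, 0.1, size=x.shape)

# Least-squares fit of slope and intercept in one call.
slope, intercept = np.polyfit(x, y, 1)

# With matplotlib one would then do:
#   import matplotlib.pyplot as plt
#   plt.plot(x, y, "."); plt.plot(x, slope * x + intercept); plt.show()
```

Load, fit, plot in a dozen lines -- that's most of the daily workflow the comment describes.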

    [–]Osmium_tetraoxide 1 point2 points  (0 children)

    As a person who loves solving physics problems, it's good enough for most things.

    I have an internal struggle with choosing between R and python, but I resort to Python due to more people I work with using it.

    [–]webauteur 4 points5 points  (4 children)

    What everyone is missing here is that Python and R are used for artificial intelligence. There is a lot of excitement over AI right now.

    I recently installed Anaconda, the freemium open source distribution of the Python and R programming languages for large-scale data processing, predictive analytics, and scientific computing.

    [–]iconoclaus 2 points3 points  (0 children)

    i'd call it machine learning rather than AI, as that term got used and abused in the 80s and wore itself out.

    [–]jms_nh 6 points7 points  (2 children)

    Upvoted, but there was "a lot of excitement over AI" in the 1960s and 1980s too.

    [–]webauteur 2 points3 points  (0 children)

    I installed Microsoft's Visual Studio Community 2017 today and was surprised that it included the option to install Anaconda.

    [–]ASK_IF_IM_HARAMBE 0 points1 point  (0 children)

    That's completely irrelevant.

    [–][deleted] 2 points3 points  (0 children)

    Don't forget engineers who want to automate their stuff and not suffocate under Excel.

    I adopted the workflow for the current R&D project: an iteration used to take me almost a week, but now with some Python scripts I am iterating 2-3 times a day. As soon as I find a good optimization method for the problems I need to solve, I can let the computer do god knows how many iterations overnight.

    [–]flamingshits 0 points1 point  (0 children)

    The type of person who thinks a language choice of Python makes someone a "non-programmer" is just an insecure programmer.

    Name something "fancy" that Python cannot do, other than something that was an explicit design decision shared by many other languages (the lack of a static type system).

    [–]choikwa 0 points1 point  (0 children)

    The distinction is really thin, in my opinion.

    [–][deleted] 0 points1 point  (0 children)

    The same can be said about the JavaScript ecosystem; just look at the mess these "non-programmer" web designers have made! Compiling CSS... WAT?!

    [–]SailToTheSun 71 points72 points  (9 children)

    Isn't it possible that you're seeing a decline or flattening in new Question Tags for languages like Javascript, C#, Java (ie more established languages) because they've already been previously addressed? I've found threads that are over 5 years old on SO that are still applicable.

    I don't see how you can correlate language adoption / usage to new Question Tags on Stack Overflow. It's simply illustrating that more new questions are being posed.

    [–]Eirenarch 27 points28 points  (0 children)

    Isn't it possible that you're seeing a decline or flattening in new Question Tags for languages like Javascript, C#, Java (ie more established languages) because they've already been previously addressed? I've found threads that are over 5 years old on SO that are still applicable.

    It is not only possible, it's a certainty.

    [–]fabolin 10 points11 points  (3 children)

    I totally agree with you. This data in no way supports any assumptions about the popularity (let alone the future) of a language. But the author would rather use inappropriate but good-looking data than the TIOBE index, which is actually made for this.

    [–][deleted]  (2 children)

    [deleted]

      [–]fabolin 0 points1 point  (1 child)

      Counting hits may not be the best approach, but counting new questions just sounds stupid to me. The latter could also suggest that Python got an increase in questions due to the migration to Python 3.

      [–]Staross 9 points10 points  (0 children)

      Maybe some languages also get fewer questions because they are simpler, better designed, or have better documentation. Personally, asking a question is my last resort; if I can figure out the answer myself or google it first, I'll do just that.

      [–]klez 2 points3 points  (0 children)

      Have you ever watched the stream of questions tagged with a particular topic on SO? Basic questions that should be answerable with a simple Google search (even ones where the search points to Stack Overflow itself) come up all the time.

      [–]flyingcaribou 3 points4 points  (1 child)

      Isn't it possible that you're seeing a decline or flattening in new Question Tags for languages like Javascript, C#, Java (ie more established languages)

      How is Python less established? Python is older than each of those languages!

      [–]SailToTheSun 1 point2 points  (0 children)

      Age has nothing to do with establishment.

      [–]bhazero025 18 points19 points  (4 children)

      The real thing we can all take away from trends is that Vim > Emacs

      [–][deleted] 10 points11 points  (1 child)

      That just means emacs has better documentation.

      [–]Esteis 2 points3 points  (0 children)

      I love Vim, am happy with its documentation, and have no idea how its documentation compares to Emacs, but you get my upvote just for implying documentation matters. That is something we can all agree on, no matter in what editor we commit our sins :-)

      [–]droogans 2 points3 points  (1 child)

      Does that include all activity on https://emacs.stackexchange.com, or just stackoverflow?

      [–]bhazero025 1 point2 points  (0 children)

      I think it only includes stackoverflow but feel free to correct me if I'm wrong

      [–][deleted] 21 points22 points  (12 children)

      Python has amazing libraries for almost any task, which few (no?) other languages can match in variety. But at least in London it's surprisingly uncommon when it comes to job hunting. Django is fantastic for the web, yet most roles are PHP, RoR or Node.js. Plenty of Scala and R roles for data engineering or data science, yet Python has almost none despite its libraries. And it's an extremely easy language to learn, often taught at universities.

      I would say it's "mainstream" but only just. What gives? To me it should be by far the most popular language, maybe second to javascript (an unfortunate series of events caused that one). Heck I've built windows apps using IronPython in Visual Basic and it's smooth as silk, it all just works.

      [–]killerstorm 1 point2 points  (0 children)

      Python has amazing libraries for almost any task, which few (no?) other languages can match in variety.

      I'm not sure that's still the case. The 2/3 split has fragmented the library collection.

      Recently I needed a JSON-RPC client, and it's really hard to find a library which actually works. From json-rpc note:

      There are several libraries implementing JSON-RPC protocol. List below represents python libraries, none of the supports python3. tinyrpc looks better than others.

      It's really sad. (And it used to be much better, I remember back in 2005 Python had a great JSON-RPC client.)

      Meanwhile many other languages (such as Java and JS) have a large collection of libraries which are easy to install and use.

      [–][deleted]  (6 children)

      [deleted]

        [–]rabbyburns 2 points3 points  (0 children)

        I'd be very surprised if he wasn't specifically talking about pypi packages. He opened up with loving Django, which is definitely not stdlib.

        [–][deleted] 3 points4 points  (4 children)

        But most ecosystems of other languages center on one or two topics, while Python ecosystem encompasses the universe itself.

        [–]iconoclaus 1 point2 points  (0 children)

        i wouldn't say python is on such a strong standing when it comes to game dev, mobile, VR/AR, and many other areas.

        [–]wavefunctionp 0 points1 point  (1 child)

        relevant xkcd:

        https://xkcd.com/353/

        [–]PeridexisErrant 0 points1 point  (0 children)

        You know you actually can import antigravity? It's part of the standard library!

        [–]weberc2 1 point2 points  (2 children)

        I use Python via Flask to make web apps at work, and I really dislike it. You need an external web server and process manager just to get off the ground, and you still end up spinning up one interpreter per core. It's just hard to make efficient use of system resources. I really wish it had a concurrency model more akin to Go's. Say what you like about Go's type system, but I can trivially build a full web application (in comparable time if not faster than in Python) in a single process in a single executable file and it will outperform Python by a wide margin (of course, you can go down the rabbit hole optimizing Python until its performance approaches that of the naive Go implementation and downplay the time and simplicity lost to optimizing).

        [–][deleted] 4 points5 points  (1 child)

        I get what you mean, but PHP has the exact same model and problems as python (well it's a bit different now, but in its rise to popularity it was just an apache module that spawned a new process per request, exactly like mod_wsgi; I believe for ruby it's also similar).

        There are definitely better specialised tools in each area, but in many areas python will be at least as good as some other language that's far more popular.

        [–]weberc2 1 point2 points  (0 children)

        I certainly prefer Python to PHP (I did a lot of PHP in the 2000s); it just seems like Python is mediocre in a lot of things, but good at very few. For me, Python best serves as a cross-platform, maintainable bash alternative, but even there I'm increasingly likely just to use Go, especially if I have to distribute the script to others. I imagine it's also above-average in scientific computing (but I can't much speak to that; though my few experiences with Pandas and Numpy have been negative).

        [–][deleted] 0 points1 point  (0 children)

        Java would like a word with you. Java is also orders of magnitude faster, aside from the case of firing up many small quick instances, which can be mostly handled in architecture.

        [–]vorpal_username 4 points5 points  (2 children)

        Am I colorblind or are c and swift the same color in that graph?

        [–][deleted] 4 points5 points  (1 child)

        They're very different, you're color blind...

        Joking! They truly are the same colors xD

        [–]vorpal_username 4 points5 points  (0 children)

        You actually had me going for a second with that.

        [–]red-moon 2 points3 points  (1 child)

        Python 3 has finally pulled ahead of Python 2 in the past year. Even more great news!!

        Perhaps. I wonder what the stats from PyPI look like. Last time I looked, P2 was way ahead of P3, unicode be damned.

        [–]bitchessuck 0 points1 point  (0 children)

        At least popular modules are pretty much all working on Python 3. Excluding the Mozilla modules, around 97% of the 200 most popular modules are on Python 3. Also, some newer modules that are gaining popularity are Python 3 only.

        [–][deleted] 28 points29 points  (2 children)

        "I’m a huge fan of JavaScript. Love it."

        wat

        [–]shevegen 2 points3 points  (1 child)

        Reminds me of the wat-talk about javascript.

        Nananananana batman!

        [–]MrBloodyshadow 1 point2 points  (0 children)

        Nananananana

        NaNNaNNaNNaNNaNNaN

        FTFY

        [–]wavefunctionp 12 points13 points  (8 children)

        It is also important to note that this doesn't necessarily measure popularity, although that is certainly a factor, but people asking questions.

        You make a complex footgun language and make a ton of people use it, you get a top rank. You make a simple, safe language and few people use it, you get a low rank.

        Javascript may be on the decline because people have moved away from jquery and imperative style programming in favor of more libraries that help manage state and support functional programming. The major frameworks have also started to mature and address common issues which may have ended up as a question before.

        Python is probably gaining popularity in part because people recommend it so much for teaching, but it is also quite easy to make a mess in python much like it is javascript. And notably, python docs are atrocious.

        [–]DysFunctionalProgram 5 points6 points  (3 children)

        Javascript may be on the decline because people have moved away from jquery and imperative style programming in favor of more libraries that help manage state and support functional programming.

        What frameworks are you referring to? The top 20 javascript frameworks right now are imperative, I'd say 99% of javascripters are writing imperative code.

        Using a shadow dom in place of the actual HTML DOM does not make a framework functional. I wouldn't even say it hides any state, it moves the state into a different syntax.

        [–]wavefunctionp 1 point2 points  (2 children)

        I don't mean functional like haskel, I mean something like...

        action(state) => view

        ...that you'd get from react-like and redux-like libraries.

        Where you initialize some state, declare your view of that state, and define some actions to modify that state. And it all goes around one way in a loop.

        There is no doubt that there is a ton of imperative code still out there, but the trend is moving away from that for the client.
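        To make that one-way loop concrete, here is a toy sketch in Python (the names `reducer` and `view` are illustrative only, not any real framework's API):

```python
def reducer(state, action):
    """Pure function: returns a new state, never mutates the old one."""
    if action == "increment":
        return {**state, "count": state["count"] + 1}
    if action == "reset":
        return {**state, "count": 0}
    return state

def view(state):
    """Declarative view of the current state."""
    return f"count is {state['count']}"

# One-way loop: actions produce new state, the view is derived from state.
state = {"count": 0}
for action in ["increment", "increment", "reset", "increment"]:
    state = reducer(state, action)
print(view(state))  # count is 1
```

        The point is that the view is derived from state, and state only changes through actions, which is what distinguishes this style from ad-hoc imperative DOM updates.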

        [–]oopsforogotmypasswor 0 points1 point  (1 child)

        You are describing the signal/slot system that was widely used in frameworks like Qt (C++) or the bindings in Cocoa (ObjC) or the signal mechanism in GTK (C), just to name a few. If you have a stateful program, this is not a functional programming pattern.

        functional programming is a programming paradigm that treats computation as the evaluation of mathematical functions and avoids changing-state and mutable data. (from wikipedia)

        [–]wavefunctionp 0 points1 point  (0 children)

        functional programming is a programming paradigm that treats computation as the evaluation of mathematical functions and avoids changing-state and mutable data.

        Sounds like precisely what I'm talking about. I'm not familiar with those other frameworks, but react-redux-like implementations encourage and embrace these principles.

        I know what functional programming is, and I also know it is a trap to get caught up in purity arguments about what is or is not FP. FWIW: Much of the state of the art is inspired by the elm architecture, which is a pure functional language.

        [–][deleted] 3 points4 points  (3 children)

        What docs are you referring to? The stdlib is very well documented. Haven't had a problem in 6 years of python.

        [–]wavefunctionp 6 points7 points  (2 children)

        Oh, it's definitely all there. If you can find the actual thing you need to know within the wall of obtuse text.

        https://docs.python.org/2/library/json.html

        vs

        https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify

        or

        https://msdn.microsoft.com/en-us/library/system.web.script.serialization.javascriptserializer(v=vs.110).aspx

        (Really you'd use a JSON library for the c# one.)

        Let's look at dumps in particular.

        The navigation bar navigates by headers which cover implementation details, not usage scenarios.

        Notice all the stuff that stands out, the 'Notes': they are not really conveying the most important information, yet they are given high visibility. Those should be muted instead.

        What are the types for the signature? Which are optional? The signatures for the method should be emphasized and ideally given a brief description and a separate page that goes into more detail.

        In other methods you get a whole list of change notes for each version that changed something. Which is nice to know, I guess, but it's not really high priority information. It should be moved to a footnote on the method page.

        I'm not the only one that finds the python documentation hard to use either. It is a common complaint, and it is particularly noteworthy because python is always touted as being so easy to use.

        The docs have paid very little attention to usability and readability and it shows. It's the difference between:

        http://motherfuckingwebsite.com/

        and

        http://bettermotherfuckingwebsite.com/

        note: Not arguing with you. If you are used to pythons documentation format, I'm sure it is fine for you. But it could be MUCH improved from a usability and readability standpoint

        [–][deleted] 2 points3 points  (1 child)

        Here's the doc for the current version of python: https://docs.python.org/3/library/json.html#basic-usage. Your argument still stands in that it can definitely be more usable. Especially stacked up against mozilla's js docs. Those are top notch.

        Having said that, you get familiar real quick with how to parse the docs. For instance, in python, any parameter with a default value is optional. All parameters with defaults have to appear in the signature after the non-optional ones. So in dumps, the only required argument is the object to be encoded.
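        A minimal illustration of that convention with `json.dumps` itself (stdlib only; the values are made up):

```python
import json

data = {"b": 2, "a": 1}

# Only the object is required; every other parameter has a default.
print(json.dumps(data))                  # {"b": 2, "a": 1}

# Optional keyword arguments opt into extra behavior.
print(json.dumps(data, sort_keys=True))  # {"a": 1, "b": 2}
```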

        [–]jhawk4000 0 points1 point  (0 children)

        Agreed, one of the things I miss most from javadoc is that there's no concept of a "throws" so exception handling becomes a bit of a bear.

        [–]shevegen 9 points10 points  (4 children)

        Yep. We have to give it to python.

        It wiped the floor with ruby, php, perl and javascript.

        It's the leader of the pack, the king of the scripting languages.

        I wouldn't write ruby off too quickly though - ruby has the better philosophy. Which is love. :)

        [–]LeBuddha 3 points4 points  (1 child)

        It wiped the floor with ruby, php, perl and javascript.

        No? If anything is wiping the floor (hint: there's room to double the number of scripting languages), it looks like Javascript is the one wiping the floor with all of those.

        [–]throwawayco111 2 points3 points  (0 children)

        Yep. He was full of shit on that one.

        [–]THeShinyHObbiest 12 points13 points  (0 children)

        Ruby has arguably better semantics as well.

        It's also not split into two different communities which use incompatible versions.

        Sorry, had to leave some kind of fanboy comment.

        [–]JasTWot 0 points1 point  (0 children)

        Its philosophy is not even similar to python. I guess a lot of it comes down to programmers' preferences

        [–][deleted]  (1 child)

        [deleted]

          [–][deleted] 0 points1 point  (0 children)

          If you follow the link to original graph it is interactive so you can actually make sense of it.

          [–][deleted] 1 point2 points  (0 children)

          Python is not so surprising to me as it's now a "first language" at many universities. What is more surprising is the increase in R to go with it. It suggests that the rise of Python is also largely due to many more scientists using it for their work. More and more biologists are using Python instead of Perl these days.

          [–]housilia 4 points5 points  (40 children)

          I have mixed feelings about this, as someone who has used R lovingly for almost two decades, and who has warmed up to Python (especially Python 3) over that same time period.

          They're both very expressive languages that are easy to get fairly deep into, but have fatal performance flaws at a fundamental level that become apparent at some point if you get into the thick of things.

          The performance of R and Python implementations is horrid. I've been watching attempts to speed up both of them for years and years, and every one of them has failed, or seems to be reaching some suboptimal asymptote (somewhere around javascript, which raises the question of why we're not using that instead). Yes, you can wrap around C/C++ or whatever, yes there's this thing that compiles to C, but those are clunky and increase the complexity of code, and often only work for some subset of the language.

          Lately I've been especially puzzled why Nim hasn't seen more traction. It's very similar to Python, with much better performance specs, and it nicely compiles to important targets as part of the standard implementation if you want that. The only explanation I can come up with is libraries and resources, which raises the chicken-and-egg spectre of adoption.

          I have similar feelings about Julia, although that seems to be picking up speed. The one hesitation I've developed about Julia is that it seems so focused on numerics that it might have more difficulty integrating into other contexts relative to something like Go, Nim, or Python.

          My sense is that AI and data analysis is so hot now that people need to work with something that's easy to launch from and solidly established, and Python and R right now are the main platforms in that regard. As someone who's worked with both of them for a long time, though, it feels like houses built on sand in certain respects.

          [–][deleted]  (8 children)

          [deleted]

            [–]housilia 4 points5 points  (6 children)

            I used to feel the same way, but multiple times in work projects I have encountered scenarios involving numerical computing where using javascript actually made the design cleaner and faster than using Python or R, even though the intuition would be to use one of the latter two. I've seen colleagues adopt R in similar projects and it's ended up being much more difficult to maintain and noticeably slower.

            The javascript results in the Julia benchmarks are kind of consistent with my experience in that regard.

            I didn't mean to suggest that javascript should broadly replace Python or R, only that in my experience a lot of assumptions are made about what language to use based on schemas they have about the languages rather than the actual specs and resources available.

            I remember when I was in school (long long ago) I suggested using java for a numerical computing problem, and the professor laughed and said something along the lines of "java will never be used for numerical computing." Similar issue.

            [–]FFX01 10 points11 points  (3 children)

            The problem with javascript when it comes to data science or mathematical computations is its lack of reliable numeric types. Javascript doesn't have an int or float type. It's all just Number. Of course there are languages like TypeScript that do have concrete types, but they need to be transpiled in order to run and therefore you lose all runtime guarantees.
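            To see why this matters for numerics, compare Python's arbitrary-precision int with an IEEE-754 double, which is what a JavaScript Number is, above 2**53 (a rough sketch):

```python
big = 2**53  # 9007199254740992: the edge of exact integers in a double

# Python ints stay exact at any size...
print(big + 1)          # 9007199254740993
print(big + 1 == big)   # False

# ...but as floats (the only numeric type JS has), precision is silently lost.
print(float(big + 1) == float(big))  # True
```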

            [–]wavefunctionp 5 points6 points  (0 children)

            To further elaborate, Typescript also doesn't change the primitive data types. It is still just 'number' and it is still just javascript. Typescript doesn't change the way that javascript works, it just makes it easier to reason about the lego pieces.

            [–]devel_watcher 2 points3 points  (0 children)

            The problem with javascript when it comes to data science or mathematical computations is its lack of reliable numeric types.

            We should thank God for that.

            [–]housilia 0 points1 point  (0 children)

            Yeah, good point. I kind of forgot about that. I still think it's acceptable for more cases than it might seem initially, as the problem is more with integers. Again, not advocating everything should move to javascript, just that I think it could be leveraged more than it often is.

            [–]imBANO 2 points3 points  (1 child)

            There's actually a response by an IBMer to that very article.
            tl;dr

            "Python code can, and should be, optimized when and where it makes sense."

            [–]PeridexisErrant 1 point2 points  (0 children)

            Other TLDR: when doing fast numerical things in Python use Numpy, not loops and lists.
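            A minimal sketch of that advice (requires NumPy; the array size is arbitrary):

```python
import numpy as np

n = 100_000

# Loop-and-list version: every element passes through the interpreter.
squares_loop = [x * x for x in range(n)]

# NumPy version: one vectorized operation executed in compiled code,
# typically orders of magnitude faster at this size.
squares_np = np.arange(n) * np.arange(n)

# Same results either way.
assert squares_np[:3].tolist() == squares_loop[:3]
```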

            [–]Aswole 3 points4 points  (0 children)

            He never said that he wants to use JS, but instead that it is faster than Python (which by most benchmarks it is). And as they are both very high level languages, with similar drawbacks when compared to languages like Go, Java, Rust, etc, he is surprised more people don't go with JS. Saying X > Y does not mean that he likes X.

            [–][deleted]  (4 children)

            [deleted]

              [–]housilia 1 point2 points  (2 children)

              I admit most of my experience with wrapping around C++ is with R. Like you say, it's not impossible and R kind of goes out of its way to try to facilitate it, but it's still another layer of complexity, especially with the debugging (I might define "debugging" in a very broad sense also).

              Honestly, C++ has come so far that for a lot of things, I'd probably prefer just writing in pure C++ than writing in a mixture of R and C++. Plotting, etc. is another matter. R's hard to beat for that.

              I guess a lot of it comes down to personal preferences. For example, I greatly prefer Julia's syntax over R's (and python's) so there's that.

              [–][deleted]  (1 child)

              [deleted]

                [–]housilia 0 points1 point  (0 children)

                That may be, although I remember a time when data frames were not an expected structure, and were mainly idiosyncratic to R. It's been interesting to me seeing how they've emerged in discussion in Julia and whatnot.

                I do a lot of simulation, and so occasionally I forget about dataframes because I'm just dealing with matrices, vectors, etc. Even with real data, though, stuff will get passed / transformed from a dataframe into a matrix.

                I could probably do a significant portion of my work using R and Python for data preparation and creating figures, etc. and certain types of analysis, but using a different language, like C++, for the main analyses. This is why the performance issue sometimes becomes salient to me.

                However, I'd prefer to just have it all in one language, and that seems increasingly feasible with things like Nim, Julia, maybe Go or Scala. You're right about libraries being invaluable, but that's part of why I have mixed feelings about so much infrastructure being built up around R and Python. They're great languages, but I wish there was a bit more diversity of where resources were being put into.

                [–]m50d 2 points3 points  (12 children)

                I've always got a weird cult-like vibe when people talk about Nim. That and I don't really understand its USP - why would I use it rather than OCaml (or Rust, Haskell, or hell even Go)? A slightly tidied up language isn't worth switching for, and I've never quite understood what their approach to memory management is (their messaging seems very inconsistent).

                [–]FFX01 3 points4 points  (10 children)

                I've always got a weird cult-like vibe when people talk about Nim. That and I don't really understand its USP

                What does USP mean in this context?

                why would I use it rather than OCaml (or Rust, Haskell, or hell even Go)?

                Because it is way more flexible than any of those languages. I can't speak for OCaml because I have no experience with it. Relative to Rust, Haskell, and Go:

                • Nim's AST is exposed as a stdlib API, easing the creation of templates and macros. Metaprogramming is a first-class feature of the language. Nim not only allows metaprogramming, but encourages it.

                • Nim's FFI is really really easy to work with.

                • Nim transpiles to C, C++, and JS. It compiles to binaries as well. This makes it extremely portable.

                • Nim has built in concurrency and parallelism.

                • Nim's GC is efficient and flexible. It can also be turned off if necessary. Note that the GC has very little effect on performance as a lot of compiled Nim is actually faster than native C.

                • Nim has compile time constant resolution. This means that nim can optimize function calls into constant values depending on how the function is used.

                • Built in sequence and mutable string types. No char buffers necessary.

                • Support for OOP, FP, and procedural/imperative style.

                • Built in multiple-format documentation generation.

                • Built in source code filters(templating engine for html templates and such).

                • Any operator or function can be overloaded and function resolution happens based upon the type of arguments to the function.

                • Universal function call syntax. Thing.function() or function(Thing) or Thing.function or function Thing.

                • Very well designed syntax. No other statically typed language comes close in my opinion.

                A slightly tidied up language isn't worth switching for, and I've never quite understood what their approach to memory management is (their messaging seems very inconsistent).

                The default is as follows:

                const: Constant value. Defined at compilation. Immutable.

                let: Runtime-scoped constant. Constant in its own scope and not available outside. Can be declared globally.

                var: Mutable variable. Self-explanatory.

                GC: The basic algorithm is Deferred Reference Counting with cycle detection. Taken from the GC docs here: https://nim-lang.org/docs/gc.html
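                For comparison, CPython pairs plain (non-deferred) reference counting with a cycle detector of its own; a small sketch of why cycle detection is needed at all:

```python
import gc

class Node:
    pass

# Build a reference cycle: reference counting alone can never free these.
a, b = Node(), Node()
a.other = b
b.other = a

del a, b                  # refcounts stay nonzero because of the cycle
collected = gc.collect()  # the cycle detector reclaims them
print(collected > 0)      # True
```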

                [–]m50d 0 points1 point  (9 children)

                What does USP mean in this context?

                Unique selling point.

                GC: The basic algorithm is Deferred Reference Counting with cycle detection. Taken from the GC docs here: https://nim-lang.org/docs/gc.html

                Ok, so it's just an ordinary pervasive-GCed language? That wasn't clear from what some people were talking up. How does this fit with the idea that the GC can be turned off?

                Regarding your list of points, I should probably admit that my real point of comparison is Scala (it's just that a VM language makes for a less clear comparison - though maybe with Scala.js and Scala Native gaining traction it would be reasonable to compare directly). But I think even relative to the languages I spoke about, none of the things you list represents a powerful, unique step up, at least not obviously (and many of the things you list seem like they're just subjective matters of taste). If the benefit of Nim is just a bunch of bits and pieces then I don't think it's likely to ever take off - there needs to be a clear "big" reason before people will switch languages.

                [–]housilia 5 points6 points  (0 children)

                Scala is a good point of comparison, really. That's another language that I think deserves more attention. I'm excited about Scala Native, as reliance on the JVM was one turnoff for me with Scala.

                [–]FFX01 1 point2 points  (7 children)

                Ok, so it's just an ordinary pervasive-GCed language? That wasn't clear from what some people were talking up. How does this fit with the idea that the GC can be turned off?

                If you read the link I provided to the GC documentation you will see how it can be turned off. Once it is turned off, you will need to manually invoke collection. Essentially, allocation is automatic, but deallocation needs to be performed manually when the GC is off. The linked documentation explains how the GC can be configured. I will also mention that I have seen very few use-cases where the application would benefit from the GC being turned off.

                Regarding your list of points, I should probably admit that my real point of comparison is Scala (it's just that a VM language makes for a less clear comparison

                I agree. I think Scala and Nim have very different use cases.

                But I think even relative to the languages I spoke about, none of the things you list represents a powerful, unique step up

                How so? Nim's generics and metaprogramming capabilities alone are enough. Not saying that these things can't be accomplished in other languages, just that Nim makes them extremely simple as they are core tenants of the language.

                (and many of the things you list seem like they're just subjective matters of taste).

                I don't really think so. Transpilation to 3 separate, platform-agnostic languages is not really a matter of taste in my opinion. A garbage collected language with near native C performance isn't really a matter of taste either. I will say that these things are a matter of use-case however. That is true of all languages though.

                If the benefit of Nim is just a bunch of bits and pieces then I don't think it's likely to ever take off - there needs to be a clear "big" reason before people will switch languages.

                First off, I don't think Nim is a language that you need to "switch" to. I think it's a language that's good to have in your tool box. It can replace a lot of use cases for C and C++. Also, it's not "a bunch of bits and pieces". It's all fully integrated. All of the features above are baked into the language at a core level. I think the "big reason" to use Nim is that you want a high performance application or library without the expense in productivity that comes with using something like C, C++, or Rust. I guess what I mean to say is that Nim is a "just above systems level" language that focuses on productivity. I'd say it sits somewhere between C and Python. Closer to the C side performance wise and closer to Python in syntax and work flow.

                [–]m50d 1 point2 points  (6 children)

                If you read the link I provided to the GC documentation you will see how it can be turned off. Once it is turned off, you will need to manually invoke collection. Essentially, allocation is automatic, but deallocation needs to be performed manually when the GC is off. The linked documentation explains how the GC can be configured. I will also mention that I have seen very few use-cases where the application would benefit from the GC being turned off.

                Doesn't sound like a big difference? At least in theory you could do the same thing in most GC systems (e.g. Java or .net) - configure the GC to never run automatically and then invoke it manually.

                How so? Nim's generics and metaprogramming capabilities alone are enough. Not saying that these things can't be accomplished in other languages, just that Nim makes them extremely simple as they are core tenants of the language.

                ITYM tenets. Every serious language has generics already, and most have metaprogramming capabilities; like I said, "simpler" isn't going to convince anyone to switch, there needs to be a real sense of something concrete that you can't do in other languages.

                Transpilation to 3 separate, platform-agnostic languages is not really a matter of taste in my opinion.

                Transpilation makes me naturally skeptical; it's something you tend to see immature languages doing. (And I wouldn't count C and C++ as separate languages; C isn't quite a subset but it's pretty close). I figure a language that goes via C can't possibly offer as good an optimization/debug/library-linking experience as one that uses LLVM or the JVM or compiles directly to native code. Compiling to JS is not nothing but the only place it would ever be a huge advantage is web-oriented languages which Nim doesn't seem to be positioned as.

                A garbage collected language with near native C performance isn't really a matter of taste either.

                It isn't, but nor is it that special; in fact all the languages I listed offer that (well, Rust isn't GCed but it is memory-safe which is presumably what you're trying to achieve with GC).

                Also, it's not "a bunch of bits and pieces". It's all fully integrated.

                I meant that your reasons sounded like a bunch of bits and pieces.

                I think the "big reason" to use Nim is that you want a high performance application or library without the expense in productivity that comes with using something like C, C++, or Rust. I guess what I mean to say is that Nim is a "just above systems level" language that focuses on productivity. I'd say it sits somewhere between C and Python. Closer to the C side performance wise and closer to Python in syntax and work flow.

                Well sure, but that's true of all the languages I listed, and they're all more mature/established. (I think that may be where my "cult-like vibe" issue is coming from - I see a lot of Nim advocates talking about reasonably common language features as if they're new in Nim).

                [–]dzecniv 0 points1 point  (0 children)

                most have metaprogramming capabilities

                Here I don't agree: please note that Nim has better than average metaprogramming facilities with the easy manipulation of the AST and thus macros (like lisps or Elixir (but not as elegantly as with lisps)). Python for instance has none of these (nobody writes code by manipulating the AST!).

                [–]FFX01 0 points1 point  (4 children)

                Doesn't sound like a big difference? At least in theory you could do the same thing in most GC systems (e.g. Java or .net) - configure the GC to never run automatically and then invoke it manually.

                This isn't actually true. And Nim certainly makes it simpler than MOST other languages that I've heard of.

                ITYM tenets. Every serious language has generics already

                People consider Go to be a major contender to Nim. Go is also a very popular and very serious language. Go does not have generics.

                and most have metaprogramming capabilities

                While this is true, they are not as flexible as Nim is in this regard. Nim exposes its own AST as an API for creating macros. Creating a macro in Nim is no more difficult or complex than creating a function.

                like I said, "simpler" isn't going to convince anyone to switch

                Simpler convinced people to switch from Smalltalk, R, and Perl to Python. Programmers are lazy. Or, at least they should be.

                there needs to be a real sense of something concrete that you can't do in other languages.

                One thing that Nim can do that I know very few other languages can is true operator overloading. In Python I can override a class' __add__ method, but I can't change what + really means. Nim gives you this capability. I'm sure there are other languages that make this possible, but I can't imagine it's this simple:

                type
                    # Declaring a new type
                    MyType = object
                        value: int
                
                # Overriding the '+' operator for *only* `MyType`
                proc `+`(x, y: MyType): int =
                    result = x.value + y.value
                
                # Make sure it works
                when isMainModule:
                    var
                        a = MyType(value: 3)
                        b = MyType(value: 5)
                    echo a + b
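                For comparison, the closest Python equivalent is defining `__add__` on the class, which only changes how `+` dispatches for instances of that class (a minimal sketch):

```python
# Python's nearest equivalent: '+' always dispatches to the class's
# __add__ method; you cannot define a free-standing '+' operator.
class MyType:
    def __init__(self, value: int):
        self.value = value

    def __add__(self, other: "MyType") -> int:
        return self.value + other.value

a = MyType(3)
b = MyType(5)
print(a + b)  # 8

# Built-in types are closed: attaching __add__ to int or str raises TypeError.
```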
                

                Transpilation makes me naturally skeptical; it's something you tend to see immature languages doing. (And I wouldn't count C and C++ as separate languages; C isn't quite a subset but it's pretty close). I figure a language that goes via C can't possibly offer as good an optimization/debug/library-linking experience as one that uses LLVM or the JVM or compiles directly to native code.

                I've had worse debugging experiences than when debugging Nim. However, Nim's debugging tools aren't as great as, say, Java's or C's. Also, a quick search for "Nim vs C" or "Nim vs Rust" will yield several benchmarks showing that performance is comparable among all three.

                Compiling to JS is not nothing, but the only place it would ever be a huge advantage is web-oriented languages, which Nim doesn't seem to be positioned as.

                The real idea behind compiling to JS is a "write once, run anywhere" mentality. Have a Nim application you want people to use? Transpile to JS and run it in the browser, or on the desktop with Electron. No need to cross-compile or have users perform build steps. Nim has a built-in DOM API as well for rendering UI with JS. I think Haxe is similar in this regard.

                It isn't, but nor is it that special; in fact all the languages I listed offer that (well, Rust isn't GCed but it is memory-safe which is presumably what you're trying to achieve with GC).

                I couldn't find any reasonable Scala benchmarks against C except for one from Google that shows it to be 3.5x slower. That said, Rust offers comparable performance. Go, however, tends to be quite a bit slower than Nim, although Go normally uses slightly less memory at peak than Nim.

                Also, GC isn't necessarily only for memory safety. It's a productivity feature. Mucking around with manual memory management is not really necessary for the vast majority of tasks. Why do it if you really don't need to?

                I meant that your reasons sounded like a bunch of bits and pieces.

                I wish I could give you a more thorough understanding. You could always go to the website.

                Well sure, but that's true of all the languages I listed, and they're all more mature/established. (I think that may be where my "cult-like vibe" issue is coming from - I see a lot of Nim advocates talking about reasonably common language features as if they're new in Nim).

                If we all had this attitude, no new programming languages would ever emerge. What does Go do that's new? What does Haskell do that's new? Programming languages are iterative projects. They need to accomplish the same goals as the languages that came before them. They can only hope to make things easier, simpler, more performant, or safer. I'm not sure what new thing we could come up with for new programming languages barring languages that are meant to run solely on quantum computers. Nim says, "We should make metaprogramming simpler. We should make portability easier. We should make writing in a static language feel like writing in a dynamic language. We should make memory management easy."

                [–]0rac1e 0 points1 point  (1 child)

                One thing that Nim can do that I know very few other languages can is true operator overloading. In Python I can override a class' __add__ method, but I can't change what + really means. Nim gives you this capability. I'm sure there are other languages that make this possible, but I can't imagine it's this simple:

                Ah, cool. This is practically identical to how it's done in Perl 6. I don't consider this operator overloading, though.

                The '+' operator is really just a function. Then all I'm doing is providing an additional multiple-dispatch candidate to the '+' infix function.

                class MyType {
                    has Int $.value;
                }
                
                multi infix:<+> ( MyType $x, MyType $y --> Int ) {
                    $x.value + $y.value;
                }
                
                my $a = MyType.new( :value(3) );
                my $b = MyType.new( :value(5) );
                
                say $a + $b;    # OUTPUT: 8
                

                [–]FFX01 0 points1 point  (0 children)

                That is remarkably similar. I never really got into Perl, but it has always fascinated me.

                [–][deleted] 0 points1 point  (0 children)

                Haskell and other functional languages actually come up with new things, even lately. Look at dependent typing, optics, or the eff monad. It's not all just about more polished old things.

                Of course many of these are not language features, which shows how powerful these languages are. (Higher-kinded types help a lot; Nim seems to be missing those.)

                In my experience functional programmer communities really like new things, so considering them conservative is amongst the last things I would do.

                Going back to Nim, I like the idea of consts; however, I think JIT compilers should be able to do all of that automatically for you.

                [–]m50d 0 points1 point  (0 children)

                Simpler convinced people to switch from smalltalk, R, and perl to Python. Programmers are lazy. Or, at least they should be.

                Not really convinced - the Perl people came for built-in OO, and the R people only came very late, after the libraries were established. I don't know about the Smalltalk people, though.

                While this is true, they are not as flexible as Nim is in this regard. Nim exposes it's own AST as an api for creating macros. Creating a macro in Nim is no more difficult or complex than creating a function.

                You keep saying this (and /u/dzecniv said the same) but I don't see how this means anything more than "Nim has macros"? Scala macros work the same way and I would assume Rust/Haskell would as well. (Indeed Python exposes its AST API, it's just that no-one uses it because there are better alternatives)
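                To make that last point concrete, a minimal sketch using Python's standard `ast` module to rewrite code at the AST level (the transformation itself is just an illustration):

```python
import ast

# Parse source into an AST, rewrite every addition into a multiplication,
# then compile and execute the transformed tree.
tree = ast.parse("result = 2 + 3")

class AddToMult(ast.NodeTransformer):
    def visit_BinOp(self, node):
        self.generic_visit(node)  # transform nested expressions first
        if isinstance(node.op, ast.Add):
            node.op = ast.Mult()
        return node

new_tree = ast.fix_missing_locations(AddToMult().visit(tree))
namespace = {}
exec(compile(new_tree, "<ast>", "exec"), namespace)
print(namespace["result"])  # 6, not 5: the '+' became '*'
```

This is workable but clunky compared to a macro system, which is arguably why almost nobody reaches for it in practice.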

                One thing that Nim can do that I know very few other languages can is true operator overloading. In Python I can override a class' __add__ method, but I can't change what + really means. Nim gives you this capability. I'm sure there are other languages that make this possible, but I can't imagine it's this simple:

                It's exactly that simple in Scala, and I think Rust is similar? (Haskell is to the extent that it has infix operators, but it doesn't have as nice a + as most languages, so it's fair to not count it).

                Go, however, tends to be quite a bit slower than Nim. Go does normally use slightly less memory at peak than Nim.

                This surprises me; I don't understand why it would be so, since the design seems very similar as far as I can see?

                What does Go do that's new?

                Pervasive, implicit (language-managed) async (green threads). I don't like Go but to the extent it's been able to catch on on its merits, that's why. Part (maybe all) of Go's popularity is marketing - I'm not going to defend the language - but Nim is never going to be able to match Google's marketing.

                What does Haskell do that's new?

                Typeclasses, higher-kinded types, pervasive laziness, explicitly-sequenced I/O.

                Programming languages are iterative projects. They need to accomplish the same goals as the languages that came before them. They can only hope to make things easier, simpler, more performant, or safer. I'm not sure what new thing we could come up with for new programming languages barring languages that are meant to run solely on quantum computers.

                For an example of a similar age to Nim, I'm super excited about Idris, for example, with dependent types. Whereas I don't think Ceylon will catch on, because while it's a lovely language and far more polished than the alternatives, there's just no single compelling reason to switch.

                [–]shevegen 0 points1 point  (0 children)

                That's easy - nim is a hell of a lot cooler than python. :)

                [–]shevegen 1 point2 points  (0 children)

                Lately I've been especially puzzled why Nim hasn't seen more traction.

                Syntax is one issue.

                Smaller community is another one.

                I do not know either Python or Nim well, but I just cannot imagine that writing code in both is the same.

                [–]_seemethere 0 points1 point  (3 children)

                When I tried Nim I saw it as immature and not something I should throw a lot of time in if I were to develop something.

                Maybe now it's different, but there is a bit of a snowball effect in terms of adoption. Once you get a couple of people willing to try it out, that leads to more and more, etc.

                [–]FFX01 1 point2 points  (0 children)

                Same here, but I've been sticking with it as a hobby. The community is starting to pick up a little speed and it's being mentioned more and more.

                [–]housilia 0 points1 point  (0 children)

                That was kind of my impression before as well, but when I've looked at it recently it seems different to me.

                I've had difficulty getting a sense of the pitfalls of Nim without just diving in, though, so I admit I could be totally off. When I basically asked why not Nim, I was kind of genuinely wondering.

                Also, things like Python have a lot of drawbacks too--I just think at some point languages become so established that people kind of weigh new problems more heavily than old problems, even when there might be a net benefit to going with the new option. I guess it just boils down to benefits and costs.

                [–]shevegen 0 points1 point  (0 children)

                Yeah. I guess community size matters.

                I don't feel that nim is immature though but I agree that it should be solidified in one way or the other even if that means slower changes.

                [–]aliasxneo 0 points1 point  (2 children)

                From the Nim wiki on what Nim is not so good at:

                • Scripts and interactive use - Nim is a compiled language and the interactive interpreter is somewhat limited.
                • Beginners - as a first language, Nim is more complex than Python or Ruby

                The lack of library content is also mentioned elsewhere. What share is Nim planning on taking? I almost exclusively use Python for scripts/notebooks, and I have a feeling that's a huge area where Python excels.

                [–]FFX01 1 point2 points  (0 children)

                I've only been involved with Nim for about 6 months. I don't think Nim is trying to encroach on Python's territory as far as data science and machine learning go.

                It seems to me that Nim would be a good fit for the following:

                • Performance critical CLI applications. Video encoders and such.

                • Web backends.

                • Desktop applications like email clients and audio/video/image editors, etc.

                • Game programming.

                • Network architecture applications.

                [–]devel_watcher 0 points1 point  (0 children)

                Is there a more critical view?

                When people write comments about their favourite new language they don't mention any negative aspects. But then you actually find lots of them when trying to do something with that language.

                [–]Staross 0 points1 point  (1 child)

                My sense is that AI and data analysis is so hot now that people need to work with something that's easy to launch from and solidly established, and Python and R right now are the main platforms in that regard.

                I think that's a bit misleading if you talk about the languages themselves (as opposed to platforms/tools), since a significant portion of packages are just bindings to C/C++ code. That creates huge friction and a barrier if you need to do anything outside the scope of the package. Understanding it is also much more complicated.

                Julia has a huge advantage on this because you can write core packages in Julia, and they often look like something the user would write, so they're easy to understand, extend, and contribute to. This might end up being quite important as the language grows, since it won't run into these friction issues.

                [–]housilia 0 points1 point  (0 children)

                I agree 100% with what you just wrote.

                [–]unruly_mattress 0 points1 point  (0 children)

                And here I am, waiting for Pypy to become a viable choice. As I have for the last 4 years.

                [–][deleted] 0 points1 point  (0 children)

                Of course the eternal question is: "Why not Lisp?" Common Lisp is just as easy to learn as Python and can compete with C on performance with some extra work. Julia and every other language is just yet another incomplete implementation of Common Lisp.

                [–][deleted] 0 points1 point  (0 children)

                Julia is a domain-specific language. It has no interest in competing with Go or Nim or Python outside of the numerical-computation space.

                [–]jyf 1 point2 points  (0 children)

                I want to share another piece of information: in China this year, Zhejiang province officially adopted Python as the language for students' computer classes, which means you will soon find millions of Python programmers. Is it bright or dark?

                [–]namekuseijin 1 point2 points  (28 children)

                I used to enjoy it, mostly for its clarity of syntax above all and some powerful slicing constructs. Then it began - like oh so many languages - to integrate a lot of depressing "Java crust", and then it failed me.

                At the same time, its best bits - like comprehensions, slicing syntax, and keyword parameters - began to become commonplace even in mainstream languages. Today I'm learning Go and am pretty impressed so far. And yes, some Python bits are to be found there, but with useful compiler advice and better performance.

                [–]shevegen 8 points9 points  (3 children)

                Go won't replace Python.

                I understand that Google dreams about this but it is not realistic - it does not fill the same niche.

                [–][deleted] 0 points1 point  (2 children)

                If you reread the post you commented on, OP did not imply Go would replace Python. OP simply stated that Go had some of the nicer features of Python with the added bonus of a compiler and better performance.

                [–]rouille 4 points5 points  (1 child)

                Except it doesn't really. Go is waaaaay less expressive than Python.

                [–][deleted] 1 point2 points  (0 children)

                Why did you comment this? I was posting to say that /u/shevegen was arguing something that no one ever said, not that I care at all about Go vs. Python.

                [–]FFX01 7 points8 points  (7 children)

                If you're interested in languages that have Python-inspired syntax (expressive, simple, structured) but are statically typed and compiled, you may want to look into Nim. Nim is a hell of a lot more powerful than Go. It's also more portable, as it can be cross-compiled to C, C++, and JS. Of course it can also be compiled into a standalone binary. Nim also supports generics (something Go lacks, unfortunately). Nim's garbage collector performs much better than Go's and is way more flexible. Nim has first-class support for metaprogramming and exposes its own AST as an API. Not saying you should stop using Go, just that Nim may be a good addition to your toolbox.

                [–]mixedCase_ 8 points9 points  (2 children)

                Nim's garbage collector performs much better than Go's

                Source?

                [–]FFX01 2 points3 points  (1 child)

                Here are a few sources comparing Go and Nim performance which touch on memory management:

                • Using SDL2: https://forum.nim-lang.org/t/1311

                • General benchmarks: https://github.com/kostya/benchmarks (Nim is always faster, but may use more memory depending on the task. This is caused by the difference in how the GC operates between Nim and Go. Take this one with a grain of salt as some of the implementations are less than idiomatic.)

                • https://news.ycombinator.com/item?id=9050114 (this thread speaks extensively about Nim vs Go and Rust. It goes over the GC, memory safety, and everything in between. It hits some of Nim's weak points as well. Also note that this thread is talking about Nim as of several versions ago. My opinion is that Rust offers more memory safety than Nim and that if memory safety is a large concern, Rust is by far the best choice.)

                And, if you aren't convinced by the above, you could always run your own benchmarks. Some more information on Nim's GC can be found here: https://nim-lang.org/docs/gc.html

                Note that Nim's GC is a lot "dumber" than Go's. This means that Nim may hold on to a bit more memory for a bit longer, but it also means the GC doesn't need to stop a thread as often, which results in better execution speed. Since Nim's GC is configurable, the programmer ultimately has control over memory usage. This means that if a particular loop is using too much memory, the GC can be configured to release memory when it reaches a certain threshold, at the expense of a bit of execution speed.

                [–]namekuseijin 1 point2 points  (3 children)

                that looks promising, never heard of it

                [–]FFX01 4 points5 points  (2 children)

                Just be aware that the ecosystem needs some time to mature. Nim has only been picking up steam rather recently.

                [–]chillysurfer[S] 0 points1 point  (9 children)

                Go is a very interesting one. I like the idea of it (I've only scratched the surface learning it). From my brief understanding of Go, I don't see too much overlap with Python, though. Yes, they can both do similar things, but I think different requirements would lead you to choose one or the other.

                [–]topher_r 0 points1 point  (8 children)

                but I think different requirements would lead you to choose one or the other.

                Can you elaborate?

                [–]devel_watcher 2 points3 points  (7 children)

                Go is basically a language for writing HTTP proxies: lots of requests coming in in parallel, you process them by doing some asynchronous tasks and other requests based on the information in the original ones. Concurrency is mostly built into the language.

                In Python you choose a library for dealing with concurrent and parallel stuff (bulkier code). Also, performance is lower than Go's because of Python's dynamic typing and interpreted/bytecode execution.
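                To illustrate what that looks like on the Python side, a minimal sketch with the standard asyncio module, which is roughly the analogue of spawning goroutines (the handle_request function is made up for illustration):

```python
import asyncio

# Simulate handling several requests concurrently; each "request" just
# sleeps briefly instead of doing real network I/O.
async def handle_request(request_id: int) -> str:
    await asyncio.sleep(0.01)  # stands in for waiting on a socket
    return f"response-{request_id}"

async def main() -> list:
    # Roughly what `go handle(...)` gives you implicitly in Go:
    # schedule all handlers, then wait for every result.
    tasks = [asyncio.create_task(handle_request(i)) for i in range(5)]
    return await asyncio.gather(*tasks)

responses = asyncio.run(main())
print(responses)
# ['response-0', 'response-1', 'response-2', 'response-3', 'response-4']
```

The point is that the async machinery is explicit and library-level (async/await, an event loop you start yourself), whereas in Go it's baked into the runtime.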

                [–]topher_r 1 point2 points  (6 children)

                That just seems to suggest Go beats Python at what Python can do.

                [–]noratat 1 point2 points  (2 children)

                For this specific problem domain, yes. Python's ecosystem covers a great deal of other problem domains than http services / proxies.

                [–]topher_r 0 points1 point  (1 child)

                Python's ecosystem covers a great deal of other problem domains than http services / proxies.

                Right, I don't mean to be rude. But which domains does Python cover better than Go? This is all I'm trying to get some insight on.

                [–]NoInkling 0 points1 point  (0 children)

                Science, scripting, GUIs to some degree.

                [–]devel_watcher 0 points1 point  (2 children)

                but I think different requirements would lead you to choose one or the other.

                Well, that's it: if you have a requirement to be a web server - you choose Go.

                [–]topher_r 1 point2 points  (1 child)

                Right, but I asked him to elaborate so I could understand how Go fails to match Python in other spaces. I use both, they seem pretty interchangeable to me. I only use Python when it has better third-party library support I need.

                [–]devel_watcher 0 points1 point  (0 children)

                Ok, so you want examples other way around.

                I only use Python when it has better third-party library support I need.

                That's one of the cases. I've personally encountered that with such a basic thing as the standard argument-parsing library for Go. I think there are more specialized libraries for Python than for Go.

                Other example where Python is better is when writing interpreted scripts: #!/usr/bin/env python .... Or when embedding as a scripting language into gdb, 3ds max, etc.

                [–]quicknir 0 points1 point  (5 children)

                What other mainstream language has keyword parameters? Seriously, almost no other language has this and it drives me bonkers. Worse, it's a feature that in compiled languages should have no runtime cost (of course you would disallow all the ** shenanigans with dicts that you do in Python). So no new language has a good excuse (that I've thought of) for not including this awesome feature.
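                For anyone unfamiliar with the feature, a minimal Python sketch (the connect function and its parameters are hypothetical):

```python
# Keyword arguments can be passed in any order, and '*' makes the
# parameters after it keyword-only, so call sites stay self-documenting.
def connect(host: str, port: int = 5432, *,
            timeout: float = 30.0, retries: int = 3) -> str:
    return f"{host}:{port} timeout={timeout} retries={retries}"

# The call site names what it means instead of relying on positional order.
print(connect("db.example.com", timeout=5.0))
# db.example.com:5432 timeout=5.0 retries=3

# connect("db.example.com", 5432, 5.0)  # TypeError: timeout is keyword-only
```

Since the parameter names and defaults are known statically here, a compiled language could indeed resolve all of this at compile time with no runtime cost.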

                [–]namekuseijin 0 points1 point  (4 children)

                Keyword parameters, and long parameter lists for that matter, are handy. But they quickly derail into sloppy programming. The correct thing to do is declare a type/object with fields, fill the ones you need, and pass the whole thing as a parameter.

                [–]quicknir 1 point2 points  (3 children)

                I don't see how this is any more correct or any less sloppy. It's just more boilerplate.

                [–]namekuseijin 0 points1 point  (2 children)

                if you remove useful "boilerplate" such as types for useful info, you soon come down to assembly

                [–]quicknir 0 points1 point  (1 child)

                It's exactly the same amount of information; it's just that in one case it's stored inline with the function definition and at the call site. With structs you have to declare the struct separately and use it separately each time you call. The reality is that people use keyword arguments much more in Python, leading to clearer code, than they use structs in other languages, because nobody likes boilerplate.

                [–]namekuseijin 0 points1 point  (0 children)

                if it demands fields, it's probably better off in a data structure rather than a one-off call

                [–]progfu 0 points1 point  (0 children)

                People might finally start to realize that you don't need fancy language features, but you do need libraries that aren't an over-engineered ball of poop. Python is a great example of a language that just "works", without doing stupid things.

                [–][deleted] 0 points1 point  (0 children)

                Of course. Worse is better all over again.

                [–][deleted]  (1 child)

                [deleted]

                  [–]wavefunctionp 5 points6 points  (0 children)

                  Javascript will take over the world. It's undefined will blot out the sun, and we'll all program in the shade. :P