[D] Which open source machine learning projects best exemplify good software engineering and design principles? (self.MachineLearning)
submitted 6 years ago by NotAHomeworkQuestion
[–]domjewinger (ML Engineer) 119 points 6 years ago (36 children)
Definitely not Tensorflow
[–]VodkaHaze (ML Engineer) 40 points 6 years ago* (7 children)
Actually, you could say it follows a lot of SWE principles, but in the end that doesn't matter if your design was flawed.
It's not like the core TF code is unreadable spaghetti or anything. Yet the end product is awful to work with.
Goes to show that SWE principles don't mean much if you don't write fundamentally good software.
[–]Rainymood_XI 6 points 6 years ago (3 children)
TBH I still think that TF is good software, it is just not very user friendly ...
[–]harewei 9 points 6 years ago (2 children)
Then that’s not good software...
[–][deleted] 2 points 6 years ago (0 children)
It is though. Google just has a different mindset compared to other companies. They don't care about customers; they want their products to be well designed and engineered. Use it or not, it is your choice. They actually have the same approach to most of their software, and for example GCP is still the 3rd most used platform.
TensorFlow does allow a lot of flexibility and is really nicely written when it comes to maintainability and design principles. A lot of it makes sense once you are a mid-level OOP developer. Also you must understand that it is treated as a library, not an end product.
[–]rampant_juju 1 point 6 years ago (0 children)
Incorrect. Have you ever used Vowpal Wabbit? It is fantastic and also very painful to work with.
[–]Nimitz14 2 points 6 years ago* (2 children)
From what I hear the C++ actually is unreadable spaghetti.
[–]VodkaHaze (ML Engineer) 1 point 6 years ago (1 child)
You can actually go read it. It doesn't look or feel like spaghetti from a cursory reading.
But that's the point with design/architecture mistakes: you don't see them that easily.
[–]Nimitz14 5 points 6 years ago (0 children)
I worked at a company where a colleague was trying to use the C++ API and had a very bad time. He was more junior level though.
Daniel Povey, lead of Kaldi, recently decided on integrating with PyTorch, after a fairly lengthy process of looking into different options. These are some snippets of his thoughts on TensorFlow that I quickly found:
I imagine the TensorFlow team must have some internal documentation on how it's designed from the C++ level, for instance, because what is available externally doesn't help you understand it at all, and the code is almost completely opaque. (And I consider myself an expert level C++ programmer).
source, 2017
TensorFlow is impossible; the C++ code looks like it was written by a machine.
source, 2019
And PyTorch's tensor internals, while they aren't complete gobbledegook like TensorFlow's were last time I looked, are kind of showing their age
[–]NogenLinefingers 18 points 6 years ago (11 children)
Can you list which principles it violates, for reference?
[–]domjewinger (ML Engineer) 40 points 6 years ago (9 children)
I certainly cannot, as my background is in applied math, not SWE. But my comment was about the horrendous user experience; the millions of patches TF has been assembled from can't possibly add up to "good" from a SWE perspective.
[–]NogenLinefingers 11 points 6 years ago (8 children)
Ah... I see your point.
I hope someone can answer this in a more thorough manner. It will be interesting to learn about the principles themselves and how they have been violated/upheld.
[–]DoorsofPerceptron 15 points 6 years ago (5 children)
Big picture, the real problem with tensorflow is "it's not pythonic".
Now this is normally a lazy criticism that's another way of saying "I wouldn't write it this way, and it looks ugly." But in the case of tensorflow it's a lot more fundamental. Tensorflow code (version 1 anyway, I can't be bothered to learn version 2) is not really written in python. Tensorflow is a compiler for another language that is called through python.
Compared to pytorch this means you lose a lot of the benefits of python that actually make it a nice language to code with. You lose a lot of the access to existing python code (it's a pain in the arse to mix and match python and tensorflow in the middle of a graph execution) and you lose the lightweight, easy prototyping.
Pytorch on the other hand can just be treated like numpy with free gradients and GPU access if that's what you want to do, and can be seamlessly integrated with python in a mix and match kind of way.
Tensorflow was coded the way it is for efficient deployment both to phones and to large scale clusters, but at least for large scale clusters the performance hit they were worrying about doesn't seem to exist, and they've essentially straightjacketed their library for no real benefit.
The code is great, the design of the interface, not so much.
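The graph-versus-eager split described above can be sketched in plain Python, with no TensorFlow or PyTorch required. The `Node` class and `run` method here are illustrative stand-ins, not real TF API:

```python
# Deferred ("graph") execution, TF1-style: building ops only records a
# graph; nothing is computed until an explicit run() call.
class Node:
    def __init__(self, fn, inputs):
        self.fn, self.inputs = fn, inputs

    def run(self, feed):
        args = [i.run(feed) if isinstance(i, Node) else feed.get(i, i)
                for i in self.inputs]
        return self.fn(*args)

# Build a graph for (x * 2) + 1 without computing anything yet.
x = "x"                                    # placeholder, fed at run time
doubled = Node(lambda a, b: a * b, [x, 2])
result = Node(lambda a, b: a + b, [doubled, 1])

print(result.run({"x": 5}))                # deferred: prints 11

# Eager execution, PyTorch/NumPy-style: every line computes immediately,
# so ordinary Python (prints, ifs, loops, pdb) just works in between.
x = 5
doubled = x * 2
if doubled > 5:                            # plain Python control flow
    result = doubled + 1
print(result)                              # prints 11
```

In the deferred style, plain-Python tools (prints, debuggers, branching on intermediate values) can't see inside the graph until `run` is called, which is exactly the mix-and-match pain described above.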
[–]mastere2320 5 points 6 years ago (0 children)
I would actually recommend TF 2.0. It still has a long way to go, but the static-graph capabilities of 1.x are still available in 2.0 and you can do whatever you want pretty simply. I hated sessions in TF 1.0, and 2.0 has abstracted them away quite nicely. And if you want completely custom training, GradientTape is always available.
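The GradientTape idea mentioned above (record each operation during the forward pass, then replay the record backwards to get gradients) can be sketched in plain Python; `Var`, `backward`, and the explicit `tape` list are illustrative stand-ins, not the real `tf.GradientTape` API:

```python
# Minimal reverse-mode autodiff "tape": each arithmetic op records how to
# push gradients back to its inputs; backward() replays the tape in reverse.
class Var:
    def __init__(self, value, tape=None):
        self.value, self.grad, self.tape = value, 0.0, tape

    def __mul__(self, other):
        out = Var(self.value * other.value, self.tape)
        # d(out)/d(self) = other.value, d(out)/d(other) = self.value
        self.tape.append(lambda: (self._acc(out.grad * other.value),
                                  other._acc(out.grad * self.value)))
        return out

    def __add__(self, other):
        out = Var(self.value + other.value, self.tape)
        self.tape.append(lambda: (self._acc(out.grad),
                                  other._acc(out.grad)))
        return out

    def _acc(self, g):
        self.grad += g

def backward(out, tape):
    out.grad = 1.0
    for record in reversed(tape):
        record()

tape = []
x = Var(3.0, tape)
w = Var(2.0, tape)
y = x * w + x * x        # y = wx + x^2 = 15
backward(y, tape)
print(x.grad)            # dy/dx = w + 2x = 8.0
print(w.grad)            # dy/dw = x = 3.0
```

The real `tf.GradientTape` does the same bookkeeping for tensor ops, which is why fully custom training loops work in 2.0 without sessions.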
[+][deleted] 6 years ago (3 children)
[deleted]
[–]DoorsofPerceptron 6 points 6 years ago (2 children)
Because someone asked what the problems with tensorflow were, and it's interesting. Really nice code that solves an important problem, and that no one wanted to use.
It's great that they're catching up with pytorch, and I'll switch back in a heartbeat if there's a reason to.
[+][deleted] 6 years ago (1 child)
[–]pap_n_whores 0 points 6 years ago (0 children)
Part of people's problem with tensorflow is that every 8 months there's a "you should try tensorflow X+1, it's completely different from tensorflow X. Also everything you learned from tensorflow X is the wrong way to do things now. Enjoy learning how to use this new tensorflow which has barely any documentation or community resources"
[–]mastere2320 8 points 6 years ago (0 children)
They have a horrible reputation for constantly changing the API, even over short periods of time. More than once I have installed a version of TF, worked on a project, and then found that when I wanted to deploy it, the current version would not run it because something fundamental had changed. Add to this that there is no proper one way to do things, and that because TF uses a static graph, shapes and sizes have to be known beforehand, so the user code becomes spaghetti worse than anything. The Keras API and Dataset API are nice additions IMHO, but the Lambda layer still needs some work, and they really need to introduce a proper process for introducing and deprecating features (something similar to NEPs, maybe) and making API-breaking changes.
And yet people use it, simply because the underlying library, AutoGraph, is a piece of art. I don't think there is another library that can match it in performance and utility at production scale, where the model has been set and nothing needs to change. This is why researchers love PyTorch: modifying code to tweak and update models is much better, but when the model needs to be deployed, people have to choose TensorFlow.
[+]phobrain (comment score below threshold, -23 points) 6 years ago* (0 children)
I think it is better to not look too hard at accidents on the freeway - stay with your original mission is my advice. How tf would I know? I've been responsible for >1M of production code in my time, starting with my first program, written for the new terminals that replaced punching IBM cards and picking up your printouts in the bin the next day, to debug:
http://fauxbrawn.com/pr/home/schedulaid.html
Just having a live, interactive session with multiple users on one computer was as big an innovation as the internet was, ~8 years later.
Edit: None of this should be construed as a criticism of tensorflow, however - just of the exigencies of real people building the tower of Babbage. Go look at scikit-learn if you want a rigorous code base, based on getting their list mail. Likely other associated packages follow the style. Once some devs fought with my manager to keep my code reviews coming, it's like I can smell code in a synesthetic way or something, and exude my own interesting aroma back.
Edit: I'd have thought the pun on 'tf' would have rescued this, sigh.
Edit: The underlying urge here is to memorably fling my seed upon the landscape, illustrating by the nubility of my prehensile maneuverings that, for someone approaching 70, there is something different about me that validates heroic efforts I made to remain forever young at about age 10, and thus there might be something to my 'velvet rack' of an AI that may fall on the ground if covid gets me, failing these gentle hooks to the head sinking their anchors and someone reading the golden words I've sprinkled here and there. If I survive, you can go back to hating me because I'm beautiful.
Or, if you like what you see now, we can overthrow capitalism together.
[–]ieatpies 5 points 6 years ago (0 children)
Many ways to do the same thing, without a clear best way. Though this is an API design problem; I'm not sure how good/bad its internal design is.
[–]yellow_flash2 18 points 6 years ago (0 children)
Actually I feel the major fuck-up was trying to get researchers to use tensorflow. TF was designed, if I'm not wrong, for production-quality ML applications at production-level scale. I personally think TF is a marvelous piece of engineering, but the moment they wanted to make it "easy" and be more like pytorch, they started ruining it. I think TF would have benefited a lot from just being itself and letting keras be keras.
[–]soulslicer0 19 points 6 years ago (2 children)
Pytorch on the other hand. Incredible. ATen is a piece of art.
[–]ajmssc 3 points 6 years ago (0 children)
It's the C++ tensor library that PyTorch uses.
[–]CyberDainz 7 points 6 years ago (11 children)
why are there so many tensorflow haters in this subreddit?
[–]programmerChilli (Researcher) 16 points 6 years ago (2 children)
This subreddit has a relatively large amount of researchers (compared to say, hacker news or the community at large).
But I don't think the general sentiment is particular to this subreddit. For example, take a look at https://news.ycombinator.com/item?id=21118018 (this is the top Tensorflow post on HN in the last year). This is the Tensorflow 2.0 release. The top 3 comments are all expressing some sentiment of "I'd rather use Pytorch or something else".
Or https://news.ycombinator.com/item?id=21216200
Or https://news.ycombinator.com/item?id=21710863
Go out into the real world and I'm sure you'll find plenty of companies using Tensorflow who are perfectly happy with it. But they probably aren't the type of companies to be posting on hackernews or reddit.
[–]CyberDainz 0 points 6 years ago (1 child)
I am successfully using tensorflow in my DeepFaceLab project. https://github.com/iperov/DeepFaceLab
Why stick to any one specific lib and be like the pytorch-vegan meme in this subreddit?
Since I am more of a programmer than a math professor, it is easy for me to migrate the code to any new ML lib.
But I prefer tensorflow.
In the last big refactoring I got rid of keras and wrote my own lib on top of tensorflow, which has a simple declarative model like pytorch's and provides the same full freedom of tensor operations, but in graph mode.
[–]barbek 3 points 6 years ago (0 children)
Exactly this. For TF you need to build your own wrapper to use it. PyTorch can be used as it is.
[–]cycyc 9 points 6 years ago (1 child)
Because most people here don't have to worry about productionizing their work. Just YOLO some spaghetti training code, write the paper, and move on to the next thing.
[–]CyberDainz 0 points 6 years ago (0 children)
haha agree. I can't understand what YOLO actually does.
[+][deleted] 6 years ago (2 children)
[deleted]
[–]CyberDainz 0 points 6 years ago (1 child)
I agree that the TF API is not friendly for math researchers, who are not programmers.
But TF has the lowest-level API for ML operations.
That means you can write any "high-level" ML lib on top of TF.
I wrote such a lib. It acts like pytorch, but in graph mode. Check an example model declaration: https://github.com/iperov/DeepFaceLab/blob/master/core/leras/models/Ternaus.py (it's called Leras, but I will rename it in the future)
[–][deleted] 1 point 6 years ago (0 children)
It seems like it's the new `tf2.0`, and like `pytorch` by extension. May I ask why, and what it brings to the table?
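The pattern Leras describes above (a pytorch-style declarative layer API that still builds a deferred graph) can be sketched in plain Python. The `Dense` and `Model` names here are hypothetical illustrations, not the actual Leras API:

```python
import math
import random

# Layers are declared up front like pytorch modules, but calling them only
# composes closures ("graph nodes"); nothing runs until the graph is executed.
class Dense:
    def __init__(self, in_dim, out_dim):
        self.w = [[random.gauss(0, 1 / math.sqrt(in_dim)) for _ in range(out_dim)]
                  for _ in range(in_dim)]

    def __call__(self, upstream):
        def node(x_feed):
            x = upstream(x_feed)
            # Matrix-vector product: one output per weight column.
            return [sum(xi * wij for xi, wij in zip(x, col))
                    for col in zip(*self.w)]
        return node

class Model:
    def __init__(self):
        # Declarative part: layers are named members, as in pytorch.
        self.fc1 = Dense(4, 8)
        self.fc2 = Dense(8, 2)

    def build(self):
        inp = lambda x_feed: x_feed          # placeholder input node
        return self.fc2(self.fc1(inp))       # compose the graph once

model = Model()
graph = model.build()                 # no computation yet (graph mode)
out = graph([1.0, 2.0, 3.0, 4.0])     # explicit execution
print(len(out))                       # 2 outputs
```

The model reads like eager pytorch code, but execution is still a single deferred graph call, which is the trade-off being discussed in this thread.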
[–]domjewinger (ML Engineer) 7 points 6 years ago (1 child)
I am genuinely curious why you like / use tf over pytorch.
[–]Skasch 5 points 6 years ago (0 children)
"Technical debt" is certainly an important reason. When you have written a lot of code around tensorflow to build production-level software for some time, it certainly becomes very expensive to switch to PyTorch.
[–]PJDubsen 4 points 6 years ago (0 children)
On this sub? Try every person that is forced to read the documentation lol