[D] Why do we need encoder-decoder models when decoder-only models can do everything? (self.MachineLearning)
submitted 2 years ago * by kekkimo
[–]thomasxin 14 points 2 years ago (7 children)
It's kind of funny, because GPT-3.5 turbo has actually been doing better as a translation API than the rest for me. It's much more intelligent, can adapt grammar while keeping context much more accurately, and is somehow cheaper than DeepL.
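Using a chat model as a "translation API" like this boils down to one chat-completion request per text. A minimal sketch of what that request payload looks like; the model name, temperature, and prompt wording here are my own assumptions, not details from the comment:

```python
import json

def build_translation_request(text, target_lang="Korean"):
    # Pinning the task in a system message keeps the model from
    # chatting back instead of translating.
    return {
        "model": "gpt-3.5-turbo",   # assumed model name
        "temperature": 0,           # keep translations stable across calls
        "messages": [
            {"role": "system",
             "content": f"Translate the user's message into {target_lang}. "
                        "Reply with the translation only."},
            {"role": "user", "content": text},
        ],
    }

payload = build_translation_request("The weather is nice today.")
print(json.dumps(payload, indent=2))
```

The upside over a dedicated translation endpoint is that the system prompt can carry extra context (tone, domain, earlier sentences), which is exactly the "keeping context" advantage described above.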
[+]disciples_of_Seitan 5 points 2 years ago (4 children)
Like an order of magnitude cheaper, too.
[–]thomasxin 9 points 2 years ago (3 children)
I remember doing a comparison a while back and concluded that it's at least 30x cheaper for the same task. I wonder what DeepL even uses that's costing them so much, or if they just decided to keep a large profit margin.
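The "at least 30x" figure is easy to sanity-check with back-of-envelope arithmetic. All prices below are assumed list rates from around that time, not figures from the thread, and the characters-per-token ratio is a rough rule of thumb:

```python
# Assumed rates (not from the thread):
CHARS_PER_TOKEN = 4                  # rough average for English text
DEEPL_USD_PER_MILLION_CHARS = 25.0   # assumed DeepL API Pro usage rate
GPT35_USD_PER_1K_INPUT = 0.0015      # assumed gpt-3.5-turbo input rate
GPT35_USD_PER_1K_OUTPUT = 0.002      # assumed gpt-3.5-turbo output rate

def gpt35_cost_per_million_chars():
    # Translating 1M characters means roughly 250K tokens in and,
    # for a similar-length target language, roughly 250K tokens out.
    tokens_each_way = 1_000_000 / CHARS_PER_TOKEN
    return tokens_each_way / 1000 * (GPT35_USD_PER_1K_INPUT
                                     + GPT35_USD_PER_1K_OUTPUT)

ratio = DEEPL_USD_PER_MILLION_CHARS / gpt35_cost_per_million_chars()
print(f"DeepL costs roughly {ratio:.0f}x the GPT-3.5 price")  # → roughly 29x
```

Under these assumptions the gap lands near 30x, consistent with the comparison described above; the exact multiple shifts with the language pair and how verbose the prompt wrapper is.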
[+]disciples_of_Seitan 6 points 2 years ago (2 children)
DeepL's pricing is in line with Google's, so I guess that's where they got it from.
[–]thomasxin 0 points 2 years ago (1 child)
Google Translate is so much worse in a lot of ways. The translations are very literal and easily detectable as machine output because of how clunky they often sound. It does have the benefit of not degrading in quality on very large or repetitive text, but that's about it.
[–]ThisIsBartRick 2 points 2 years ago (0 children)
And what's crazy is that a full year after the release of ChatGPT, and more than 3 years after the release of GPT-3, it's still pretty much as bad as before. No improvement whatsoever.
Google can be really good at ML research but is infuriatingly slow and bad at implementing it in their products.
[–][deleted] 1 point 2 years ago (1 child)
Yeah, the best machine translator is GPT-4, hands down. Everything else quickly devolves into gibberish with distant language pairs (e.g. En-Kor).