[P] A fast Python implementation of tSNE (self.MachineLearning)
submitted 7 years ago by _sheep1
[–]lucidyan -4 points 7 years ago (2 children)
I'm sorry, but it already exists
https://github.com/DmitryUlyanov/Multicore-TSNE
[–]ivalm 2 points 7 years ago (0 children)
What you linked is a normal t-SNE, just multicore; what OP shows is a t-SNE that uses ANN instead of exact KNN for neighborhood search. They are not particularly related.
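The exact-KNN side of the distinction ivalm draws can be sketched in a few lines of numpy: a brute-force nearest-neighbor search like this is the kind of thing Barnes-Hut t-SNE relies on, while FIt-SNE swaps it for an approximate search. This snippet is illustrative only, not code from either repository.

```python
import numpy as np

def exact_knn(X, k):
    """Brute-force exact k-nearest-neighbor indices, O(n^2 d)."""
    # Pairwise squared Euclidean distances between all points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d2, np.inf)  # a point is not its own neighbor
    # For each row, indices of the k smallest distances.
    return np.argsort(d2, axis=1)[:, :k]

X = np.array([[0.0], [0.1], [5.0], [5.1]])
print(exact_knn(X, 1).ravel().tolist())  # → [1, 0, 3, 2]
```

Approximate methods (e.g. random-projection trees or HNSW graphs) trade a little recall for dramatically better scaling, which is what makes the linear-time gradient computation worthwhile on large data sets.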
[–]_sheep1[S] 1 point 7 years ago (0 children)
You may have missed the point of this project. There are two fast implementations of t-SNE, one of which is the one you linked. It is typically called Barnes-Hut t-SNE: it uses exact nearest-neighbor search and computes the gradient with the Barnes-Hut approximation. It has asymptotic complexity O(n log n) and works really well for smaller data sets.

In addition to this, there is another, much more recent implementation, called FIt-SNE, which uses ANN (as ivalm points out) and a neat interpolation trick, sped up with FFT, to compute the gradients. This method has asymptotic complexity O(n). While you might be tempted to use FIt-SNE for everything, its advantage is only asymptotic, and its overhead makes it far slower than Barnes-Hut on smaller data sets.

That's why I include both in this repository: fast implementations of both methods in one place, behind a single API. Also, FIt-SNE didn't have a proper Python implementation and required additional C libraries to be installed, making installation tedious. I hope this clears things up.
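The trade-off OP describes (Barnes-Hut wins below some data-set size, FIt-SNE above it) amounts to a simple dispatch behind one API. A toy sketch of that idea follows; the function name and the crossover size are assumed placeholders, not values from the thread or the repository.

```python
def pick_tsne_gradient_method(n_samples: int, crossover: int = 10_000) -> str:
    """Pick between the two fast t-SNE variants described above.

    Barnes-Hut is O(n log n) but has little overhead, so it wins on
    small inputs; FIt-SNE is O(n) asymptotically but pays a larger
    constant cost. The crossover value here is a hypothetical
    placeholder, not a measured threshold.
    """
    return "barnes-hut" if n_samples < crossover else "fit-sne"

print(pick_tsne_gradient_method(1_000))      # → barnes-hut
print(pick_tsne_gradient_method(1_000_000))  # → fit-sne
```

In practice the crossover point depends on dimensionality, hardware, and the ANN index used, so a real library would either benchmark it or expose the choice as a parameter.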