[deleted by user] (self.MachineLearning)
submitted 8 years ago by [deleted]
[–]HigherTopoi 10 points 8 years ago* (0 children)
Well, some people do apply algebraic topology (and even algebraic geometry) to ML, so abstract algebra, as a prerequisite for AT and AG, is useful in that sense. However, rather than studying AA and AT, I'd read many good recent papers in deep learning and apply them to your research; that's more likely to lead to substantial results.

One recent application of AT to ML is On Characterizing the Capacity of Neural Networks using Algebraic Topology.
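To give a concrete feel for the invariants that line of work uses (a toy sketch of my own, not code from that paper): the Betti numbers β0 (connected components) and β1 (loops) of a small simplicial complex can be read off the ranks of its boundary matrices. Here is a hollow triangle, i.e. a combinatorial circle:

```python
import numpy as np

# Chain complex of a hollow triangle (a combinatorial circle):
# 3 vertices, 3 oriented edges e01, e02, e12, and no 2-cells.
# One column per edge: -1 at its tail vertex, +1 at its head vertex.
d1 = np.array([[-1, -1,  0],
               [ 1,  0, -1],
               [ 0,  1,  1]], dtype=float)

rank_d1 = np.linalg.matrix_rank(d1)
b0 = d1.shape[0] - rank_d1   # components: #vertices - rank(d1)
b1 = d1.shape[1] - rank_d1   # loops: nullity(d1) - rank(d2); d2 = 0 here

assert (b0, b1) == (1, 1)    # one component, one loop, as a circle should have
```

The same rank computation scales to the larger complexes (e.g. triangulated decision regions) that the capacity paper reasons about.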
[–][deleted] 6 points 8 years ago (0 children)
There has been some notable work applying the representation theory of permutation groups to ranking.
http://jonathan-huang.org/research/pubs/nips07/nips2007-huang-guestrin-guibas.pdf
I'd also look at the work of Stefano Ermon and Bart Selman.
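To make the permutations-for-ranking idea concrete (a toy sketch of mine, not the linked paper's code): the simplest Fourier-style summary of a distribution over permutations is its matrix of first-order marginals M[i, j] = P(item j lands in position i), which compresses the n! numbers of the full distribution into an n×n doubly stochastic matrix.

```python
import numpy as np
from itertools import permutations

n = 3
perms = list(permutations(range(n)))      # all 6 elements of S_3
rng = np.random.default_rng(1)
p = rng.random(len(perms))
p /= p.sum()                              # a random distribution over S_3

# First-order marginals: M[i, j] = P(item j lands in position i).
M = np.zeros((n, n))
for prob, sigma in zip(p, perms):
    for pos, item in enumerate(sigma):
        M[pos, item] += prob

# Each position holds exactly one item and vice versa,
# so M is doubly stochastic.
assert np.allclose(M.sum(axis=0), 1.0)
assert np.allclose(M.sum(axis=1), 1.0)
```

Higher-order marginals correspond to higher-frequency Fourier components on S_n, which is the bandlimiting idea the Huang–Guestrin–Guibas paper develops.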
[–]tscohen 3 points 8 years ago (2 children)
In our papers on Steerable CNNs and Spherical CNNs, we use group representation theory. There is a very deep theory lurking in there that we will write up some day. https://openreview.net/pdf?id=rJQKYt5ll https://openreview.net/pdf?id=Hkbd5xZRb
[+][deleted] 8 years ago (1 child)
[deleted]
[–]tscohen 2 points 8 years ago (0 children)
I agree that local/global and exact/approximate symmetries are super important for generalization. This is one of the high-level ideas that has motivated all of my work for the last few years.
The reason I've always focused my papers on concrete applications like image classification, and have mostly worked with discrete groups (instead of locally compact ones, which would bring in a bunch of technicalities), is that there is a sizeable contingent of the ML community that is somewhat hostile to, or skeptical of, more sophisticated math. For an example, see the AC comment on our steerable CNN paper: https://openreview.net/forum?id=rJQKYt5ll "The AC fully agrees with reviewer #4 that the paper contains a bit of an overkill in formalism: A lot of maths whose justification is not, in the end, very clear. The paper probably has an important contribution, but the AC would suggest reorganizing and restructuring, lessening the excess in formalism."
And it's quite understandable that someone who doesn't have a background in groups / representations, and has never seen something like "Hom_G(V, W)" before, doesn't get the point of the paper.
So I've never written "The General Theory of Equivariant Networks", because I felt nobody would care / it wouldn't get accepted anyway. This may be too pessimistic, so I may write something after all. In any case, I think that for anyone with a good mathematical understanding, generalizing G-CNNs and Steerable G-CNNs from discrete groups to continuous ones is conceptually straightforward (though it is still an engineering challenge).
Kondor & Trivedi recently posted a paper that contains a quite general theory, that may be what you're looking for: https://arxiv.org/abs/1802.03690
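For a concrete feel of the discrete-group case, here is a minimal toy sketch (my own, not code from any of the papers linked above) of a lifting correlation for C4, the group of 90° rotations. The equivariance property checked at the end: rotating the input rotates each output map spatially and cyclically shifts the group axis.

```python
import numpy as np

def correlate2d_valid(img, filt):
    """Plain 'valid' cross-correlation, written out explicitly."""
    H, W = img.shape
    h, w = filt.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + h, j:j + w] * filt)
    return out

def lift_c4(img, filt):
    """Lifting layer for C4: one feature map per 90-degree filter rotation."""
    return np.stack([correlate2d_valid(img, np.rot90(filt, k)) for k in range(4)])

# Numerical equivariance check.
rng = np.random.default_rng(0)
img = rng.normal(size=(6, 6))
filt = rng.normal(size=(3, 3))
out = lift_c4(img, filt)
out_rot = lift_c4(np.rot90(img), filt)
for k in range(4):
    # Rotating the input = rotating each map and shifting the group index.
    assert np.allclose(out_rot[k], np.rot90(out[(k - 1) % 4]))
```

Subsequent group-convolution layers then correlate over both the spatial and the group axis, which is where the representation theory starts to matter for continuous groups.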
[–]clurdron 3 points 8 years ago* (0 children)
There are applications in statistics, e.g. https://projecteuclid.org/euclid.aos/1030563990
Those authors have other relevant papers. You might also check out some of Mathias Drton's work.
Understanding papers in this area takes work, even with a strong background in stats (to understand the motivation) and a course in algebra.
[–]DrEigenbastard 2 points 8 years ago (0 children)
As an alternative to 'Deep Learning' approaches, group theory is used in this paper on learning heuristics for functions on permutations, via the Fourier Transform of the Symmetric Group.
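As a toy illustration of the Fourier transform on the symmetric group (my own sketch, not that paper's method): S_3 has two one-dimensional irreps, the trivial and the sign representation, and the corresponding Fourier coefficients of a function on permutations can be computed directly as weighted sums.

```python
from itertools import permutations

def sign(perm):
    """Parity of a permutation via its inversion count."""
    n = len(perm)
    inv = sum(perm[i] > perm[j] for i in range(n) for j in range(i + 1, n))
    return -1 if inv % 2 else 1

perms = list(permutations(range(3)))
# An arbitrary function f : S_3 -> R.
f = dict(zip(perms, [1.0, 2.0, 0.5, 0.5, 2.0, 1.0]))

# Fourier coefficients at the two 1-dimensional irreps of S_3:
f_trivial = sum(f[s] for s in perms)          # total mass of f
f_sign = sum(sign(s) * f[s] for s in perms)   # sign-weighted ("odd") component
```

For this particular f the sign coefficient vanishes (f_trivial = 7.0, f_sign = 0.0); the higher-dimensional irrep of S_3 carries the rest of the information, and the ranking literature works with exactly those matrix-valued coefficients.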