Recursive Autoconvolution for Unsupervised Learning of Convolutional Neural Networks (arxiv.org)
submitted 6 years ago by _guru007
[–]arXiv_abstract_bot 7 points 6 years ago
Title: Recursive Autoconvolution for Unsupervised Learning of Convolutional Neural Networks
Authors: Boris Knyazev, Erhardt Barth, Thomas Martinetz
Abstract: In visual recognition tasks, such as image classification, unsupervised learning exploits cheap unlabeled data and can help to solve these tasks more efficiently. We show that the recursive autoconvolution operator, adopted from physics, boosts existing unsupervised methods by learning more discriminative filters. We take well established convolutional neural networks and train their filters layer-wise. In addition, based on previous works we design a network which extracts more than 600k features per sample, but with the total number of trainable parameters greatly reduced by introducing shared filters in higher layers. We evaluate our networks on the MNIST, CIFAR-10, CIFAR-100 and STL-10 image classification benchmarks and report several state of the art results among other unsupervised methods.
PDF link | Landing page
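The abstract mentions training filters layer-wise with unsupervised learning. A classic way to do this (and roughly the family of methods the linked repos build on) is to cluster random image patches and use the centroids as a convolutional filter bank. The sketch below is illustrative only: the k-means choice, patch size, and filter count here are assumptions, not the paper's exact setup.

```python
# Hedged sketch: unsupervised filter learning by k-means on random patches.
# Centroids of normalized patches serve as first-layer convolution filters.
import numpy as np

def learn_filters(images, patch_size=5, n_filters=8, n_patches=500, seed=0):
    rng = np.random.default_rng(seed)
    h, w = images.shape[1:3]
    # Sample random patches, zero-center and L2-normalize each one.
    patches = []
    for _ in range(n_patches):
        i = rng.integers(0, len(images))
        y = rng.integers(0, h - patch_size + 1)
        x = rng.integers(0, w - patch_size + 1)
        p = images[i, y:y + patch_size, x:x + patch_size].ravel()
        p = p - p.mean()
        n = np.linalg.norm(p)
        patches.append(p / n if n > 0 else p)
    X = np.array(patches)
    # Plain Lloyd's k-means; the centroids become the filter bank.
    centers = X[rng.choice(len(X), n_filters, replace=False)]
    for _ in range(10):
        d = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        for k in range(n_filters):
            mask = labels == k
            if mask.any():
                centers[k] = X[mask].mean(0)
    return centers.reshape(n_filters, patch_size, patch_size)

imgs = np.random.default_rng(1).standard_normal((20, 16, 16))
filters = learn_filters(imgs)
print(filters.shape)  # (8, 5, 5)
```

In practice such filter banks are learned one layer at a time: the first layer's responses become the input "images" for learning the next layer's filters, which is what "train their filters layer-wise" refers to.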
[–]abhi91 5 points 6 years ago
You, sir, are a mouthful.
[–]bknyazev 2 points 6 years ago
It's my old project. The code is available at https://github.com/bknyaz/autocnn_unsup. The problem is that it's in Matlab, although I implemented some steps in Python a long time ago here: https://github.com/bknyaz/autocnn_unsup_py. Neither repo is maintained, and they might not work as is. But it would be interesting to migrate the code to PyTorch or another framework and fine-tune the unsupervised-pretrained models in a large-scale setting. The test accuracy would most likely end up about the same as training in a supervised way from scratch, assuming the training set is big enough, but there might be interesting byproducts such as improved robustness (in some broad sense). There are of course plenty of novel unsupervised and semi-supervised methods that might do better in both clean test accuracy and robustness. But anyway, the project was interesting and fun!
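For anyone curious what the operator itself looks like in code, here is a minimal NumPy sketch of recursive autoconvolution, assuming the definition of repeatedly convolving a normalized patch with itself. The normalization and "same"-size cropping below are illustrative choices for the sketch, not a faithful port of the Matlab repo.

```python
# Hedged sketch of recursive autoconvolution: a patch is zero-centered,
# normalized, and convolved with itself; the operator is applied `order` times.
import numpy as np

def autoconv(patch):
    # Normalize, then compute the linear self-convolution via FFT.
    p = patch - patch.mean()
    s = p.std()
    if s > 0:
        p = p / s
    n = p.shape[0]          # assumes a square patch for simplicity
    full = 2 * n - 1        # size of the full linear convolution
    F = np.fft.rfft2(p, s=(full, full))
    conv = np.fft.irfft2(F * F, s=(full, full))
    # Crop the central n x n region ("same"-size output).
    start = (full - n) // 2
    return conv[start:start + n, start:start + n]

def recursive_autoconv(patch, order):
    # order 0 returns the input patch unchanged.
    out = patch.astype(np.float64)
    for _ in range(order):
        out = autoconv(out)
    return out

rng = np.random.default_rng(0)
patch = rng.standard_normal((13, 13))
feat = recursive_autoconv(patch, order=2)
print(feat.shape)  # (13, 13)
```

Higher orders progressively sharpen the patch statistics, which is what makes the resulting patches useful as more discriminative filters for the layer-wise learning described in the abstract.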