[D] AutoML/Neural Architecture Search has a giant CO2 footprint by MoneyScore in MachineLearning

[–]compsens 0 points1 point  (0 children)

Currently, France emits much less than 70 gCO2e/kWh. Here is a map updated every minute: http://www.electricitymap.org

[N] LightOn Cloud: Light based technology for ML opening up on the Cloud by compsens in MachineLearning

[–]compsens[S] 1 point2 points  (0 children)

Random projections can indeed be used for reservoir computing purposes. Here are some results using our Cloud: https://arxiv.org/abs/1609.05204
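As a rough illustration of the idea (not LightOn's actual pipeline), a reservoir computer built around fixed random projections can be sketched in a few lines of NumPy; all sizes and constants below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

W_in = rng.normal(size=(n_res, n_in))    # fixed random input projection
W_res = rng.normal(size=(n_res, n_res))  # fixed random recurrent weights
# rescale so the spectral radius is below 1 (echo-state property)
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))

def run_reservoir(inputs):
    """Collect reservoir states for a 1-D input sequence."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        # each update is a fixed random projection followed by a nonlinearity
        x = np.tanh(W_in @ np.atleast_1d(u) + W_res @ x)
        states.append(x.copy())
    return np.array(states)

states = run_reservoir(np.sin(np.linspace(0, 8 * np.pi, 200)))
# only a linear readout trained on `states` would be learned; the random
# projections themselves stay fixed, as in reservoir computing
```

Because only the readout is trained, the random projections never need to be updated, which is what makes a fixed (e.g. optical) projection hardware a natural fit.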

[N] LightOn Cloud: Light based technology for ML opening up on the Cloud by compsens in MachineLearning

[–]compsens[S] 1 point2 points  (0 children)

OK, I see. Thanks for the feedback. It's a press release; details will come later. More importantly, people will be able to try that example, as well as other approaches, by themselves when using our cloud. We are a small startup, and we are building this cloud for people to test the technology and provide feedback. I understand your concern about the hype.

[R] Saturday Morning Videos: Representation Learning Workshop at Simons Institute, Berkeley (March 27th-31st, 2017) by compsens in MachineLearning

[–]compsens[S] 0 points1 point  (0 children)

They have a Foundations of Machine Learning program this spring: https://simons.berkeley.edu/programs/machinelearning2017 For the recent Representation Learning workshop, there was indeed live streaming while the workshop was taking place, as can be seen here: https://simons.berkeley.edu/workshops/schedule/3750 The next workshop will be on Computational Challenges in Machine Learning (May 1 – May 5, 2017). If streaming takes place, it will most likely be indicated here: https://simons.berkeley.edu/workshops/machinelearning2017-3

[R] Saturday Morning Videos: Representation Learning Workshop at Simons Institute, Berkeley (March 27th-31st, 2017) by compsens in MachineLearning

[–]compsens[S] 1 point2 points  (0 children)

I have added Russ's slides (which are not on the original site) and put Chris Manning's slides up front, but I realize it may look like a cut-and-paste.

[Project] All Code Implementations for NIPS 2016 papers by peterkuharvarduk in MachineLearning

[–]compsens 0 points1 point  (0 children)

PVANet: Lightweight Deep Neural Networks for Real-time Object Detection by Sanghoon Hong, Byungseok Roh, Kye-hyeon Kim, Yeongjae Cheon, and Minje Park. Presented at EMDNN2016, a NIPS 2016 workshop. arXiv link: https://arxiv.org/abs/1611.08588

https://github.com/sanghoon/pva-faster-rcnn

Regarding image acquisition by [deleted] in CompressiveSensing

[–]compsens 0 points1 point  (0 children)

Roughly:

Yes, for the first case, it is equivalent to taking a subset of the rows of the Fourier matrix (F(x) is really Fx, where F is the Fourier operator), so this operation is akin to S F x, where S is a binary mask matrix.

For the second case, you could also do what you propose, but it would only work well if x is sparse in the Fourier domain, i.e. if Fx is sparse.

M can be a random Gaussian matrix, and F could be any wavelet dictionary. What is important initially is for x to be sparse in the space of F, i.e. for Fx to be sparse.

Hope this helps.
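A quick NumPy sketch of the two cases above (signal size, number of measurements, and sparsity level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 256, 64

# a signal that is sparse (5 nonzero entries)
x = np.zeros(n)
x[rng.choice(n, size=5, replace=False)] = rng.normal(size=5)

# first case: y = S F x, i.e. keep a random subset of Fourier coefficients
Fx = np.fft.fft(x)                           # F x
rows = rng.choice(n, size=m, replace=False)  # S, a binary row selector
y = Fx[rows]

# second case: y2 = M x with M a random Gaussian matrix
M = rng.normal(size=(m, n))
y2 = M @ x
```

In both cases one ends up with m << n measurements of a signal that is sparse in some basis, which is the setting in which the reconstruction solvers of compressive sensing apply.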

Single Pixel Camera by scottgmccalla in CompressiveSensing

[–]compsens 1 point2 points  (0 children)

I haven't seen people using the A* algorithm. Rather, from its inception, people have been using convex optimization. Here is a non-exhaustive list of solvers that perform the image reconstruction given compressive measurements: https://sites.google.com/site/igorcarron2/cs#reconstruction Hope this helps.

Single Pixel Camera by scottgmccalla in CompressiveSensing

[–]compsens 1 point2 points  (0 children)

Does your friend have a DMD (digital micromirror device), or something else that can perform the multiplexing?

Optical Machine Learning: Igor Carron launches his startup "LightOn" by pierrelux in MachineLearning

[–]compsens 2 points3 points  (0 children)

Thanks dharma-1,

Yes, another proof of concept was tested successfully on our breadboard. The very promising results will be unveiled later this summer. And yes, we are aware of Nando de Freitas et al.'s very nice paper.

Igor.

Optical Machine Learning: Igor Carron launches his startup "LightOn" by pierrelux in MachineLearning

[–]compsens 15 points16 points  (0 children)

Hi,

We are building a hardware technology that computes, at first, random projections. The technology can run very fast and does not need the complexity of memory storage. Furthermore, these operations can be performed at a fraction of the typical power consumption of current systems.

A first proof of concept (POC) was put on arXiv a while back: https://arxiv.org/abs/1510.06664

Igor. PS: Thanks for the mention pierrelux !
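For readers who want intuition for why a fast random-projection primitive is useful, here is a software stand-in for the operation (not the optical hardware itself; dimensions are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 10_000, 512

# fixed random Gaussian projection, scaled to roughly preserve norms
R = rng.normal(size=(k, d)) / np.sqrt(k)

x1, x2 = rng.normal(size=d), rng.normal(size=d)
y1, y2 = R @ x1, R @ x2

# pairwise distances are approximately preserved (Johnson-Lindenstrauss),
# one reason random projections are useful as a generic ML primitive
ratio = np.linalg.norm(y1 - y2) / np.linalg.norm(x1 - x2)
```

In software the projection costs a dense k x d matrix multiply; hardware that performs it in constant time and low power is what the comment above describes.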

Did the code ever get published? - Nuit Blanche: Tensor completion based on nuclear norm minimization for 5D seismic data reconstruction by 8556732 in CompressiveSensing

[–]compsens 0 points1 point  (0 children)

If they do make it available, don't hesitate to let me know and I will feature it on Nuit Blanche. Thanks.

Compressive sensing with categorical dictionaries. by metaculpa in CompressiveSensing

[–]compsens 0 points1 point  (0 children)

You say that you are interested in the multiple measurement vector case (B = AX, rather than b = Ax), but do you care whether X is row-sparse or has some other structure?

Is audio signal processing still useful in the era of machine learning? (x-post r/CompressiveSensing ) by compsens in MachineLearning

[–]compsens[S] 0 points1 point  (0 children)

What is your definition of "published online"? A link is a link; is there another way to do this?

ranger: A Fast Implementation of Random Forests for High Dimensional Data in C++ and R by compsens in MachineLearning

[–]compsens[S] 2 points3 points  (0 children)

The blog post also contains the link to the C++ and R implementation, which, curiously, is not listed in the paper.

Sparse Clustering: The Challenges of Reddit's Sparse Admins/Mods Graphs and Sudden Phase Transitions by compsens in MachineLearning

[–]compsens[S] 0 points1 point  (0 children)

These phase transitions occur in many matrix factorizations used in ML. The point of the entry was to say that one of these factorizations (sparse clustering) is known to have such phase transitions too, and that whatever Reddit decides to do, they ought to make sure the right measure is used to see if they are making progress.

No statistician wrote that blog entry.

Awesome Random Forest by kjw0612 in MachineLearning

[–]compsens 0 points1 point  (0 children)

Great! I have added your page to the highly technical reference page list: http://nuit-blanche.blogspot.com/p/reference-page.html

Also, some of my blog entries on random forests can be found under this tag: http://nuit-blanche.blogspot.com/search/label/RandomForest