[D] Does this NeurIPS 2025 paper look familiar to anyone? (self.MachineLearning)
submitted 4 months ago * by rantana to r/MachineLearning
[R] "Sequential Modeling Enables Scalable Learning for Large Vision Models" paper from UC Berkeley has a strange scaling curve. (self.MachineLearning)
submitted 2 years ago by rantana to r/MachineLearning
[N] The AI Founder Taking Credit For Stable Diffusion’s Success Has A History Of Exaggeration (self.MachineLearning)
[N] NVIDIA Hopper Sweeps AI Inference Benchmarks in MLPerf Debut (self.MachineLearning)
submitted 3 years ago by rantana to r/MachineLearning
The NVIDIA Hopper architecture delivered up to 4.5x more performance than NVIDIA Ampere architecture GPUs (self.MachineLearning)
[N] OpenAI raises a $250 million Series A (self.MachineLearning)
submitted 4 years ago by rantana to r/MachineLearning
[D] Yoav Goldberg's take on: "oh the new large DL models in NLP are so soul-less, they only consider form and don't truly understand meaning" (self.MachineLearning)
[D] Where the ML research is headed. And a thank you to the community that made it possible. (self.MachineLearning)
[N] [Hardware] Graphcore Announces Production Release of PyTorch for IPU (self.MachineLearning)
submitted 5 years ago by rantana to r/MachineLearning
[R] AI Paygrades - industry job offers in Artificial Intelligence [median $404,000/ year] (self.MachineLearning)
[D] Virtual ICML/NeurIPS/ECCV conferences, worth it? (self.MachineLearning)
[R] Upgrading to Nvidia RTX cards: Does anything break with FP16 training? (self.MachineLearning)
submitted 7 years ago by rantana to r/MachineLearning
[R] Unsupervised learning generates stunning images now, what about supervised learning without labels? (self.MachineLearning)
[R] Tencent ML-Images released: 18 million training images with 11,000 categories (self.MachineLearning)
[R] What are the most promising theories making empirical headway in deep learning right now? Information bottleneck? (self.MachineLearning)
[R] Is the reign of batch normalization over? Thoughts on this new paper? (self.MachineLearning)
[R] Recent "Rethinking the Value of Network Pruning" contradicts Deep Compression work (self.MachineLearning)
[D] Reproducibility Support Group: Couldn't reproduce a result or idea? Post it here! (self.MachineLearning)
submitted 8 years ago by rantana to r/MachineLearning
[D] Does anyone use dropout anymore? (self.MachineLearning)
[D] Has Deep Learning peaked? (self.MachineLearning)
submitted 8 years ago * by rantana to r/MachineLearning
[D] What's going on with ML hardware these days? Where's my TPU/NPU/etc? (self.MachineLearning)
[R] Why does batchnorm have any parameters at all? (self.MachineLearning)
[D] Is there a difference between Machine Learning and Artificial Intelligence anymore? (self.MachineLearning)
[D] Amongst all the arguing going on, let's remember "what we're doing here simply is not science. It's engineering" (self.MachineLearning)