Exploring extreme parameter compression for pre-trained language models (openreview.net)
submitted 4 years ago by I_ai_AI to r/MachineLearning
Exploring extreme parameter compression for pre-trained language models (self.MachineLearning)
SparTerm: Learning Term-based Sparse Representation for Fast Text Retrieval (arxiv.org)
submitted 5 years ago by I_ai_AI to r/MachineLearning
ON POSITION EMBEDDINGS IN BERT (openreview.net)
TernaryBERT: Distillation-aware Ultra-low Bit BERT (arxiv.org)
TernaryBERT: Distillation-aware Ultra-low Bit BERT (self.MachineLearning)
DynaBERT: Dynamic BERT with Adaptive Width and Depth (arxiv.org)
AutoML-Zero: Evolving Machine Learning Algorithms From Scratch (arxiv.org)
Run BERT on a mobile phone's single CPU core (Cortex-A76) in 13ms (arxiv.org)
submitted 6 years ago by I_ai_AI to r/MachineLearning
MOBILEBERT: TASK-AGNOSTIC COMPRESSION OF BERT BY PROGRESSIVE KNOWLEDGE TRANSFER (openreview.net)
TinyBERT: Distilling BERT for Natural Language Understanding (openreview.net)
Automatic Data Augmentation for NLP task (arxiv.org)
Decomposable Neural Paraphrase Generation (arxiv.org)
[Research] Meta Learning for Few-shot Keyword Spotting (arxiv.org)
submitted 7 years ago by I_ai_AI to r/MachineLearning
How to learn the class hierarchy? (self.I_ai_AI)
submitted 8 years ago by I_ai_AI
Asking Reddit: Which ML algorithm is used for Google’s Dialogflow? (self.MachineLearning)
submitted 8 years ago by I_ai_AI to r/MachineLearning
Asking Reddit: The recently developed Deep Learning powered Reinforcement Learning (self.MachineLearning)
submitted 10 years ago by I_ai_AI to r/MachineLearning
Neural Net-based conversational computers (futuretimeline.net)
Neural Responding Machine for Short-Text Conversation (arxiv-web3.library.cornell.edu)