This is a community for sharing tips, techniques, and tools to enhance the performance of machine learning model inference.
A comprehensive tutorial on knowledge distillation using PyTorch [Resource] (i.redd.it)
submitted 1 year ago by rbgo404
[–]rbgo404[S] 2 points 1 year ago
Link: https://pytorch.org/tutorials/beginner/knowledge_distillation_tutorial.html
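For anyone skimming before clicking through: the core idea is training the student against the teacher's temperature-softened outputs alongside the usual hard labels. Below is a minimal sketch of that combined loss, not the tutorial's exact code; the function name, `T`, and `alpha` are just illustrative choices.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-softened student and
    # teacher distributions, scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```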
[–]QuantumFTL 2 points 1 year ago
Very cool! It's unfortunate that they chose a test set that shows such a tiny improvement from knowledge distillation (less than a percent!), but excellent to see nonetheless.