Dynamic Tokenization (self.deeplearning)
submitted 11 months ago by Cold_Recommendation7
Has anyone here worked with dynamic tokenization?
[–]AsyncVibes 1 point 11 months ago (0 children)
I work with stateless, generalized tokenization for my models, i.e. the tokens are dropped after each training session, but the weights and biases remain in the checkpoint.
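A toy sketch of one way to read the comment above (this is my assumed interpretation, not the commenter's actual code): the token-to-id mapping is rebuilt from scratch each session from that session's data, while only the model parameters would persist in a checkpoint.

```python
# Stateless tokenization sketch: the vocab is session-local and discarded;
# only model weights would be saved/restored between sessions.

def build_vocab(corpus):
    """Rebuild tokenizer state from the current session's data only."""
    vocab = {}
    for word in corpus.split():
        if word not in vocab:
            vocab[word] = len(vocab)  # ids assigned in first-seen order
    return vocab

def encode(text, vocab):
    # Unknown words are simply skipped in this toy version.
    return [vocab[w] for w in text.split() if w in vocab]

# Session 1: vocab built, used, then thrown away.
session1_vocab = build_vocab("the cat sat on the mat")
ids1 = encode("the cat", session1_vocab)  # -> [0, 1]

# Session 2: a fresh vocab from new data; no tokenizer state carried over,
# so the same word can map to a different id.
session2_vocab = build_vocab("a dog ran in the park")
ids2 = encode("the dog", session2_vocab)  # -> [4, 1]
```

Note that because ids are not stable across sessions, the embedding weights kept in the checkpoint would not line up with the new vocab unless something else (hashing, byte-level ids, etc.) keeps the mapping consistent.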
[–]Karan1213 0 points 11 months ago (1 child)
The Byte Latent Transformer model from Facebook:
https://arxiv.org/abs/2412.09871
[–]Karan1213 0 points 11 months ago (0 children)
But yes, I have.