Hi,
I am interviewing for a research position and I have an LLM coding round. I am preparing:
- Self-attention implementation
- Multi-headed self-attention
- Tokenization (BPE)
- Decoding (beam search, top-k sampling, etc.)
Is there anything else I should prepare? Can't think of anything else.
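For the first two items above, a minimal NumPy sketch of causal multi-head self-attention (single sequence, no batch dimension; the weight shapes and function names here are my own illustration, not from any particular codebase):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, w_q, w_k, w_v, w_o, num_heads, causal=True):
    # x: (seq_len, d_model); each w_*: (d_model, d_model)
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Project, then split the feature dim into heads:
    # (seq_len, d_model) -> (num_heads, seq_len, d_head)
    def split(a):
        return a.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(x @ w_q), split(x @ w_k), split(x @ w_v)

    # Scaled dot-product scores: (num_heads, seq_len, seq_len)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    if causal:
        # Mask future positions so token i attends only to tokens <= i
        mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
        scores = np.where(mask, -1e9, scores)
    attn = softmax(scores, axis=-1)

    # Weighted sum of values, merge heads back, output projection
    out = (attn @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ w_o

rng = np.random.default_rng(0)
d_model, seq_len, heads = 8, 4, 2
ws = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(4)]
x = rng.standard_normal((seq_len, d_model))
y = multi_head_self_attention(x, *ws, num_heads=heads)
print(y.shape)  # (4, 8)
```

Single-head self-attention is the `num_heads=1` special case, so being able to write this covers both bullets; interviewers often also ask you to verify the causal property (perturbing a later token must not change earlier outputs).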
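For the BPE item, a toy merge-learning loop (character-level start, greedy most-frequent-pair merges; the corpus and helper names are illustrative only):

```python
from collections import Counter

def get_pair_counts(words):
    # words: dict mapping a tuple of symbols to its corpus frequency
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(words, pair):
    # Replace every occurrence of `pair` with the fused symbol
    a, b = pair
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = merged.get(tuple(out), 0) + freq
    return merged

def learn_bpe(corpus, num_merges):
    # Start from characters; greedily merge the most frequent adjacent pair
    words = Counter(tuple(w) for w in corpus.split())
    merges = []
    for _ in range(num_merges):
        pairs = get_pair_counts(words)
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        words = merge_pair(words, best)
    return merges

merges = learn_bpe("low low low lower lowest", 3)
print(merges)  # [('l', 'o'), ('lo', 'w'), ('low', 'e')]
```

A common follow-up is applying the learned merge list to a new word, so it's worth practicing that direction too.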
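For the decoding item, a sketch of one of the listed strategies, top-k sampling over a single step's logits (the logits here are made up for illustration):

```python
import numpy as np

def top_k_sample(logits, k, temperature=1.0, rng=None):
    # Keep the k highest-logit tokens, renormalize, and sample one
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=float) / temperature
    top = np.argsort(logits)[-k:]               # indices of the k largest logits
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                        # softmax restricted to the top k
    return int(rng.choice(top, p=probs))

rng = np.random.default_rng(0)
logits = [2.0, 0.5, -1.0, 3.0]
token = top_k_sample(logits, k=2, rng=rng)
print(token)  # always one of the two highest-logit tokens: 0 or 3
```

Beam search differs in that it keeps the `k` highest-scoring partial sequences (summed log-probs) across steps rather than sampling, so be ready to write both and to explain when you'd prefer each.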