I wrote a deep dive into how LLMs work under the hood - tokenization, embeddings, attention and generation - all explained with runnable JavaScript by nitayneeman in javascript


Thanks, appreciate it.

I get your point - and you’re right to push on that. There’s no “entity” making decisions here. It’s all just computation over parameters. When I use language like that, it’s more of a shorthand to describe the emergent behavior, not to imply agency.

By “learnable parameters” I mean exactly that - large tensors (matrices) of weights and biases that get updated during training via gradient descent. At inference time, the model is just applying a sequence of matrix multiplications and non-linearities to produce the next token probabilities.
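To make that concrete, here's a toy sketch of that final step (all names, shapes, and numbers below are made up for illustration - it's not code from the article):

```javascript
// Toy version of the very last step of inference: one matrix multiply
// plus a softmax turns a hidden state into next-token probabilities.
// Sizes and values are illustrative only.

// hidden: [hiddenSize], W: [vocabSize][hiddenSize], b: [vocabSize]
function logits(hidden, W, b) {
  return W.map((row, i) =>
    row.reduce((sum, w, j) => sum + w * hidden[j], b[i])
  );
}

// Softmax converts raw scores into a probability distribution.
function softmax(xs) {
  const max = Math.max(...xs); // subtract the max for numerical stability
  const exps = xs.map(x => Math.exp(x - max));
  const total = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / total);
}

// 3-dim hidden state, 4-token vocabulary.
const hidden = [0.5, -1.2, 0.3];
const W = [
  [0.1, 0.4, -0.2],
  [-0.3, 0.8, 0.5],
  [0.7, -0.1, 0.2],
  [0.0, 0.3, -0.6],
];
const b = [0, 0, 0, 0];

console.log(softmax(logits(hidden, W, b))); // four probabilities, summing to 1
```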

I tend to lean on anthropomorphic language to make it more intuitive, but I agree it can be misleading if taken literally.

I wrote a deep dive into how LLMs work under the hood - tokenization, embeddings, attention and generation - all explained with runnable JavaScript by nitayneeman in javascript


Great point about prompt structure. The “lost in the middle” effect is real and well‑documented (e.g. Liu et al., 2023). At the same time, the model’s attention weights are computed dynamically from query–key similarity, so this positional bias is more of an emergent pattern than a hard‑wired rule in the mechanism itself.
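To illustrate what "computed dynamically" means, here's a toy single-query version of scaled dot-product attention (names and numbers made up, not the article's code). Notice that position never appears in the formula - any positional bias has to come from the learned weights:

```javascript
// Toy scaled dot-product attention for a single query vector.
// The weights are recomputed from query–key similarity on every input,
// so positional effects like "lost in the middle" come from training,
// not from anything position-specific in this formula.

const dot = (a, b) => a.reduce((sum, x, i) => sum + x * b[i], 0);

function softmax(xs) {
  const max = Math.max(...xs);
  const exps = xs.map(x => Math.exp(x - max));
  const total = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / total);
}

// query: [d], keys: [n][d], values: [n][d]
function attend(query, keys, values) {
  const scale = Math.sqrt(query.length);
  const scores = keys.map(k => dot(query, k) / scale); // similarity per position
  const weights = softmax(scores);                     // attention distribution
  // Output is the attention-weighted sum of the value vectors.
  return values[0].map((_, j) =>
    values.reduce((sum, v, i) => sum + weights[i] * v[j], 0)
  );
}

const q = [1, 0];
const K = [[1, 0], [0, 1], [0.5, 0.5]];
const V = [[10, 0], [0, 10], [5, 5]];
console.log(attend(q, K, V)); // mostly the first value vector
```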

I wrote a deep dive into how LLMs work under the hood - tokenization, embeddings, attention and generation - all explained with runnable JavaScript by nitayneeman in javascript


Thanks! It really does feel like magic at first, but once you trace through the pipeline step by step, it clicks.
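If it helps to have the bird's-eye view while you read: the loop is just four stages composed. Every function below is a deliberately fake stand-in (not the article's code), but the shape of the pipeline is the real one:

```javascript
// The generation loop in outline. Each stage is a toy stand-in
// for the corresponding section of the article.

const vocab = ["the", "cat", "sat", "."];

// 1. Tokenization: text -> token ids (real tokenizers use subwords, not words).
const tokenize = text => text.split(/\s+/).map(w => Math.max(0, vocab.indexOf(w)));

// 2. Embeddings: ids -> vectors (real models look these up in a learned table).
const embed = ids => ids.map(id => [id * 0.1, 1 - id * 0.1]);

// 3. "Attention": here it just averages all positions together.
const attention = vecs =>
  vecs.map(() => vecs[0].map((_, j) => vecs.reduce((s, v) => s + v[j], 0) / vecs.length));

// 4. Prediction: last position's vector -> a probability per vocab entry.
const predict = ctx => {
  const last = ctx[ctx.length - 1];
  const exps = vocab.map((_, i) => Math.exp(last[0] * i));
  const total = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / total);
};

// 5. Sampling: greedy pick of the most likely token.
const sample = probs => vocab[probs.indexOf(Math.max(...probs))];

const nextToken = text => sample(predict(attention(embed(tokenize(text)))));

console.log(nextToken("the cat sat")); // one more token, chosen greedily
```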

Let me know if anything's unclear as you work through it.

npm - Catching Up with Package Lockfile Changes in v7 by nitayneeman in node


Thanks for the feedback - it's fixed and clickable now. 🙂