Writing an LLM from scratch, part 14 -- the complexity of self-attention at scale :: Giles' blog
https://www.gilesthomas.com/2025/05/llm-from-scratch-14-taking-stock-part-2-the-complexity-of-self-attention-at-scale
Tagged with: python, llm from scratch, til deep dives
A pause to take stock: starting to build intuition on how self-attention scales (and why the simple version doesn't)
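The subtitle's claim that the simple version of self-attention doesn't scale can be made concrete with a small sketch. This is an illustration written for this note, not code from the post: a naive self-attention step materializes an n × n score matrix, so both compute and memory grow quadratically with context length n.

```python
import numpy as np

def naive_attention(Q, K, V):
    """Naive scaled dot-product self-attention.

    Q, K, V have shape (n, d). The intermediate `scores` matrix has
    shape (n, n) -- this is the quadratic cost in context length n.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # (n, n)
    # Numerically stable softmax over each row.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                               # (n, d)

# Score-matrix entries grow as n**2: doubling the context
# quadruples the intermediate matrix.
for n in (128, 1024, 8192):
    print(f"context {n:>5}: {n * n:>12,} score entries")
```

Going from a 1,024-token context to an 8,192-token one multiplies the score matrix by 64×, which is why longer contexts motivate more efficient attention variants.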