Writing an LLM from scratch, part 9 -- causal attention :: Giles' blog
https://www.gilesthomas.com/2025/03/llm-from-scratch-9-causal-attention
Tagged with:
python
ai
llm from scratch
til deep dives
Causal (or masked) self-attention: when computing attention for a token, we don't attend to later tokens. Following Sebastian Raschka's book 'Build a Large Language Model (from Scratch)'. Part 9/??
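The idea can be sketched in a few lines; this is not the book's code, just a minimal single-head illustration assuming NumPy, where future positions are set to negative infinity before the softmax so they receive zero weight:

```python
import numpy as np

def causal_attention(q, k, v):
    """Scaled dot-product attention with a causal mask:
    token i may only attend to tokens 0..i."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)  # (seq_len, seq_len) raw scores
    seq_len = scores.shape[0]
    # True above the diagonal = future positions to mask out
    mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores = np.where(mask, -np.inf, scores)
    # Softmax over each row; exp(-inf) = 0, so future tokens get no weight
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights
```

Each row of the resulting weight matrix is a probability distribution over only the current and earlier tokens, which is what lets the model be trained to predict the next token without "cheating".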