Writing an LLM from scratch, part 20 -- starting training, and cross entropy loss :: Giles' blog
https://www.gilesthomas.com/2025/10/llm-from-scratch-20-starting-training-cross-entropy-loss
Tagged with: ai, llm from scratch, til deep dives
Starting to train our LLM requires a loss function, and the one we use is cross entropy loss. What is it, and why does it work?
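As a quick preview of what the post digs into, here is a minimal sketch (my own illustration, not code from the post) of cross entropy loss in PyTorch: it compares the built-in `F.cross_entropy` against the same value computed by hand from softmax probabilities. The vocabulary size, batch shape, and random logits are purely illustrative.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

vocab_size = 8
batch, seq_len = 2, 4

# Hypothetical model outputs: one logit vector per token position,
# plus the target token ids the model should have predicted.
logits = torch.randn(batch, seq_len, vocab_size)
targets = torch.randint(0, vocab_size, (batch, seq_len))

# Built-in: flatten (batch, seq) so each row is one prediction.
loss_builtin = F.cross_entropy(logits.view(-1, vocab_size), targets.view(-1))

# By hand: softmax the logits, pick out the probability assigned to each
# correct token, and average the negative log of those probabilities.
probs = torch.softmax(logits.view(-1, vocab_size), dim=-1)
target_probs = probs[torch.arange(targets.numel()), targets.view(-1)]
loss_manual = -torch.log(target_probs).mean()

print(loss_builtin.item(), loss_manual.item())  # the two values match
```

The by-hand version makes the intuition concrete: the loss is just the average negative log probability the model assigns to the tokens that actually came next, so it shrinks toward zero as those probabilities approach one.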