Understanding Multilingual Token Compression in GPT-o Family Models | njkumarr
https://www.njkumar.com/gpt-o-multilingual-token-compression/
GPT-o introduces a new tokenizer, which both doubles the model's vocabulary size to 200k (previously 100k with GPT-4) and significantly improves token ...