Why Does Linear Attention Need a Short Conv? - 科学空间|Scientific Spaces
https://spaces.ac.cn/archives/11320
If you have been following developments in model architecture, you will have noticed that the more recent linear attention models (see 《线性注意力简史:从模仿、创新到反哺》) all give $\boldsymbol{Q},\boldsymbol{K},...$
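For context, the "short conv" named in the title usually means a depthwise causal 1D convolution with a small kernel (often around 4) applied to the $\boldsymbol{Q},\boldsymbol{K},\boldsymbol{V}$ projections before the linear attention itself. Below is a minimal PyTorch sketch of that pattern, assuming a simple non-gated linear attention with an $\mathrm{elu}(\cdot)+1$ feature map; the `ShortConv` and `LinearAttention` names, the kernel size, and the exact placement of the convs are illustrative assumptions, not the article's precise formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ShortConv(nn.Module):
    """Depthwise causal 1D convolution with a small kernel (the "short conv")."""
    def __init__(self, dim: int, kernel_size: int = 4):
        super().__init__()
        self.kernel_size = kernel_size
        # groups=dim makes the conv depthwise: one small filter per channel.
        self.conv = nn.Conv1d(dim, dim, kernel_size, groups=dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim) -> Conv1d expects (batch, dim, seq_len).
        x = x.transpose(1, 2)
        # Left-pad by kernel_size - 1 so position t only sees positions <= t.
        x = F.pad(x, (self.kernel_size - 1, 0))
        return self.conv(x).transpose(1, 2)

class LinearAttention(nn.Module):
    """Simplified causal linear attention with short convs on Q/K/V (a sketch)."""
    def __init__(self, dim: int, kernel_size: int = 4):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.q_conv = ShortConv(dim, kernel_size)
        self.k_conv = ShortConv(dim, kernel_size)
        self.v_conv = ShortConv(dim, kernel_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The short conv mixes a few neighbouring tokens into each of Q, K, V,
        # giving the otherwise purely recurrent mixing some local context.
        q = F.elu(self.q_conv(self.q_proj(x))) + 1  # positive feature map
        k = F.elu(self.k_conv(self.k_proj(x))) + 1
        v = self.v_conv(self.v_proj(x))
        # Causal linear attention via prefix sums of k v^T and k.
        kv = torch.einsum('btd,bte->btde', k, v).cumsum(dim=1)
        z = k.cumsum(dim=1)
        num = torch.einsum('btd,btde->bte', q, kv)
        den = torch.einsum('btd,btd->bt', q, z).unsqueeze(-1) + 1e-6
        return num / den

if __name__ == "__main__":
    attn = LinearAttention(dim=64)
    x = torch.randn(2, 16, 64)
    print(attn(x).shape)  # torch.Size([2, 16, 64])
```

The explicit prefix-sum (`cumsum`) form keeps the sketch short but materialises a (batch, seq, dim, dim) tensor; real implementations use a chunked or recurrent scan instead.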