AMD is excited to announce Instella, a family of fully open, state-of-the-art 3-billion-parameter language models (LMs). In this blog we explain how the Instella models were trained and how to access them. (ROCm Blogs)
Our next generation of fully open base and instruct models sits at the Pareto frontier of performance and training efficiency. Check out our [paper](https://arxiv.org/abs/2501.00656) to learn more, or keep reading for a summary. (allenai.org)
Blog post by Mengzhou Xia and Tianyu Gao from Princeton University (Princeton NLP)