For a roundtable on how AI hardware, software, and infrastructure will evolve in the years to come, we brought together two current SPC members – Bill Chang, who built Tesla’s Dojo supercomputer, and Ravi Jain, who led strategy & business for Krutrim – and Max Ryabinin, a researcher at Together AI.| South Park Commons
Training large language models (LLMs) costs less than you think. Using the MosaicML platform, we show how fast, cheap, and easy it is to train these models at scale (1B to 70B parameters). With new training recipes and infrastructure designed for large workloads, we enable you to train LLMs while retaining full customizability over your model and dataset.| Databricks
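The post's actual recipes live in MosaicML's tooling, but for a flavor of what a run looks like, here is a minimal sketch using Composer, MosaicML's open-source training library. The toy classifier, synthetic data, and one-epoch duration are placeholder assumptions, not the recipe from the post.

```python
# Minimal Composer training sketch (assumptions: toy model, synthetic data,
# one epoch). Real LLM runs on the MosaicML platform use purpose-built
# recipes; this only illustrates the Trainer entry point.
import torch
from torch.utils.data import DataLoader, TensorDataset
from composer import Trainer
from composer.models import ComposerClassifier

# Placeholder model: any torch.nn.Module wrapped for Composer.
module = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))
model = ComposerClassifier(module, num_classes=10)

# Synthetic stand-in for a real dataset.
dataset = TensorDataset(torch.randn(256, 1, 28, 28), torch.randint(0, 10, (256,)))
train_loader = DataLoader(dataset, batch_size=32)

trainer = Trainer(
    model=model,
    train_dataloader=train_loader,
    max_duration="1ep",  # train for one epoch
)
trainer.fit()
```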
Large language models recognize, summarize, translate, predict, and generate text and other content.| NVIDIA Blog
This step-by-step tutorial will teach you how to automatically transcribe voice notes with Whisper, summarize them with ChatGPT, and send them to Notion.| Thomas Frank
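For readers who prefer an API-level view of the same three-step pipeline, here is a hedged Python sketch. The file name, model choice, Notion database schema, and environment variables are illustrative assumptions, not necessarily the tutorial's setup.

```python
# Hedged sketch: transcribe a voice note with Whisper, summarize it with a
# chat model, and save the summary to a Notion database. All names below
# (file path, env vars, database schema) are assumptions for illustration.
import os

from notion_client import Client  # pip install notion-client
from openai import OpenAI  # pip install openai

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1. Transcribe the voice note with Whisper.
with open("voice_note.m4a", "rb") as audio:
    transcript = openai_client.audio.transcriptions.create(
        model="whisper-1", file=audio
    )

# 2. Summarize the transcript with a chat model.
completion = openai_client.chat.completions.create(
    model="gpt-4o-mini",  # any chat model works; this choice is an assumption
    messages=[
        {"role": "system", "content": "Summarize this voice note as bullet points."},
        {"role": "user", "content": transcript.text},
    ],
)
summary = completion.choices[0].message.content or ""

# 3. Create a page in a Notion database with the summary.
notion = Client(auth=os.environ["NOTION_TOKEN"])
notion.pages.create(
    parent={"database_id": os.environ["NOTION_DATABASE_ID"]},
    properties={  # assumes the database has a "Name" title property
        "Name": {"title": [{"text": {"content": "Voice note summary"}}]}
    },
    children=[
        {
            "object": "block",
            "type": "paragraph",
            "paragraph": {
                "rich_text": [{"type": "text", "text": {"content": summary}}]
            },
        }
    ],
)
```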
What does the future market structure look like for AI foundation and API companies? How does OSS play a role in a world of ever-scaling models?| blog.eladgil.com