When it comes to using LLMs, it’s not only a question of which model to use: it’s also a matter of choosing who provides the LLM and where it is deployed. Today, we announce the release of any-llm, a Python library that provides a simple unified interface to access the most popular LLM providers.
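As a rough sketch of what a unified interface like this buys you, the call below targets one provider and could point at another by changing only the model string. The exact any-llm call shape and the provider/model naming format shown here are assumptions for illustration, not taken from this post:

```python
# Minimal sketch of a unified completion call; the function name and the
# "provider/model" string format are assumed here for illustration.
from any_llm import completion

response = completion(
    model="mistral/mistral-small-latest",  # assumed provider/model identifier
    messages=[{"role": "user", "content": "What does a unified LLM interface buy you?"}],
)
print(response.choices[0].message.content)
```

The point of such an interface is that switching from a hosted provider to a locally deployed model becomes a one-line change rather than a rewrite of your client code.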
New state-of-the-art models emerge every few weeks, making it hard to keep up, especially when testing and integrating them. In reality, many available models may already meet our needs. The key question isn’t “Which model is the best?” but rather, “What’s the smallest model that gets the job done?”
Previously, we explored how LLMs like Meta’s Llama reshaped AI, offering transparency and control. We discussed open-weight models like DeepSeek and deployment options. Now, we show how to deploy DeepSeek V3, a powerful open-weight model, on a Kubernetes cluster using vLLM.
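Once the model is running in the cluster, vLLM exposes an OpenAI-compatible HTTP API, so it can be queried with the standard `openai` client. A minimal sketch, assuming a hypothetical in-cluster Service address and that the model is served under the name `deepseek-ai/DeepSeek-V3`:

```python
# Sketch of querying a vLLM server through its OpenAI-compatible API.
# The base_url (Kubernetes Service DNS name) and model name are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://vllm-deepseek.default.svc.cluster.local:8000/v1",  # hypothetical Service address
    api_key="not-needed",  # vLLM accepts any key unless an API key is configured
)

response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-V3",
    messages=[{"role": "user", "content": "Hello from inside the cluster!"}],
)
print(response.choices[0].message.content)
```

Because the endpoint speaks the OpenAI protocol, the same client code works whether the model runs in your cluster or behind a hosted provider.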