 Mistral AI has announced the release of two new state-of-the-art models for on-device computing and at-the-edge use cases, called les Ministraux. These models, Ministral 3B and Ministral 8B, set a new frontier in knowledge, commonsense, reasoning, function-calling, and efficiency in the sub-10B category. Les Ministraux are designed to provide a compute-efficient and low-latency solution for critical applications such as on-device translation, internet-less smart assistants, local analytics, and autonomous robotics.

Source: https://mistral.ai/news/ministraux/
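For anyone who wants to try them right away, here is a minimal sketch of calling one of the models through Mistral's hosted chat completions API. It assumes (not stated in the post above) that the 8B model is exposed on la Plateforme under the name "ministral-8b-latest" and that you have an API key; true on-device use would instead mean running the released weights locally.

```python
# Minimal sketch: query Ministral 8B via Mistral's chat completions endpoint.
# Assumptions: the model name "ministral-8b-latest" and an API key in
# the MISTRAL_API_KEY environment variable.
import os
import requests

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "ministral-8b-latest",  # assumed API name for Ministral 8B
        "messages": [
            {"role": "user", "content": "Translate 'edge computing' into French."}
        ],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```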