Mistral has unveiled the next generation of its flagship model, which boasts improvements in reasoning, mathematics, and programming, along with an impressive context window of 128,000 tokens.
The new model, known as Large 2, features a staggering 123 billion parameters, a size chosen so that the model can still run efficiently on a single node while handling large-scale applications. It also offers a 128,000-token context window.
One of the standout enhancements in this second generation is its multilingual support. It can now understand “dozens of languages,” including Spanish, Portuguese, Arabic, and Japanese. It also supports over 80 programming languages, as highlighted on Mistral’s official blog.
Mistral has focused on enhancing reasoning capabilities to reduce the occurrence of "hallucinations," plausible-sounding but incorrect outputs. The model is designed to generate reliable and precise results, and to acknowledge when it lacks sufficient information to provide an answer.
In addition to reasoning, Large 2 excels at following precise instructions and managing long, multi-turn conversations more effectively. The model also shows marked improvements in mathematics and programming tasks.
Currently, users can try Large 2, released under version 24.07, through the chat feature on Mistral's platform. For more insights into Mistral's advancements, you can visit their official website or follow related technology news outlets.
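For developers, the model is also reachable programmatically. The sketch below shows, under stated assumptions, how a chat request to Large 2 might be constructed with only the Python standard library; the endpoint URL and the model identifier `mistral-large-2407` (matching the 24.07 version) are assumptions based on Mistral's published API conventions, so verify them against the official documentation before use.

```python
# Hedged sketch: building a chat-completions request for Mistral Large 2.
# The endpoint and model id are assumptions; check Mistral's API docs.
import json
import urllib.request

API_URL = "https://api.mistral.ai/v1/chat/completions"  # assumed endpoint


def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) a chat request targeting Large 2."""
    payload = {
        "model": "mistral-large-2407",  # assumed id for the 24.07 release
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )


# Inspect the prepared request locally; no network call is made here.
req = build_request("Summarize the key features of Large 2.", "YOUR_API_KEY")
print(req.full_url)
```

Sending the request (for example with `urllib.request.urlopen(req)`) would return a JSON body containing the model's reply, following the usual chat-completions response shape.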
Image and News Source: https://www.infobae.com/america/agencias/2024/07/25/mistral-presenta-large-2-con-mejoras-en-razonamiento-programacion-y-soporte-de-idiomas/