Top Stories

MiniMax Launches M2.7 AI Model Free, Surpassing Gemini 3.1 Pro with 229 Billion Parameters

MiniMax launches the free M2.7 AI model with 229 billion parameters, outperforming Gemini 3.1 Pro in key benchmarks and enhancing multi-agent capabilities.

MiniMax, a prominent AI development company based in China, has unveiled its latest AI model, MiniMax M2.7, with 229 billion parameters. First announced on March 18, 2026, the model is now publicly available for free on platforms such as Hugging Face and ModelScope, drawing attention for its benchmark performance.

On April 12, 2026, MiniMax officially released M2.7 as an open-source model, highlighting its state-of-the-art (SOTA) performance on benchmarks such as SWE-Pro and Terminal Bench 2, where it scored 56.22% and 57.0%, respectively. The company announced on X (formerly Twitter), “We’re delighted to announce that MiniMax M2.7 is now officially open source. You can find it on Hugging Face now. Enjoy!” The model is hosted on Hugging Face at https://huggingface.co/MiniMaxAI/MiniMax-M2.7.

MiniMax M2.7 is designed for multi-agent systems, allowing several agents to operate and coordinate simultaneously. Its development relied on a self-evolution mechanism: a continuous cycle of ‘problem analysis → correction plan → code modification → test execution → result comparison → application or discarding of changes.’ This process is intended to improve the model’s adaptability and performance over time.
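The cycle described above is essentially a propose-and-test improvement loop. MiniMax has not published its implementation, so the sketch below is purely illustrative: all function names and signatures are hypothetical placeholders showing how such a loop could be structured, with changes kept only when they score better on tests.

```python
# Hypothetical sketch of a self-evolution cycle:
# problem analysis -> correction plan -> code modification ->
# test execution -> result comparison -> apply or discard.
# None of these names come from MiniMax; they are assumptions for illustration.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Candidate:
    code: str
    score: float

def self_evolve(code: str,
                analyze: Callable[[str], str],
                plan: Callable[[str], str],
                modify: Callable[[str, str], str],
                evaluate: Callable[[str], float],
                steps: int = 3) -> Candidate:
    """Iteratively propose and test code changes, keeping only improvements."""
    best = Candidate(code, evaluate(code))
    for _ in range(steps):
        problem = analyze(best.code)             # problem analysis
        fix_plan = plan(problem)                 # correction plan
        new_code = modify(best.code, fix_plan)   # code modification
        new_score = evaluate(new_code)           # test execution
        if new_score > best.score:               # result comparison
            best = Candidate(new_code, new_score)  # apply changes
        # otherwise the candidate is discarded and the loop continues
    return best
```

In a real system, `analyze`, `plan`, and `modify` would be model calls and `evaluate` would run an actual test suite; the key design point in the article's description is that only changes which improve measured results are retained.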

Initial benchmark results indicate that MiniMax M2.7 surpasses other leading models, including Gemini 3.1 Pro and Claude Opus 4.6, in various tests. A third-party organization, Artificial Analysis, confirmed these findings, asserting that MiniMax M2.7 delivers superior agent performance compared to the Gemini 3.1 Pro Preview. This performance boost is expected to strengthen MiniMax’s competitive edge in the rapidly evolving AI landscape.

The release of MiniMax M2.7 comes less than a month after its initial announcement, reflecting the company’s rapid development cycle. It is available for download free of charge, but it operates under a non-commercial license, requiring commercial users to seek permission from MiniMax for any business applications.

For those interested in exploring the technical specifications and performance metrics, the model’s code can be accessed on GitHub at https://github.com/MiniMax-AI/MiniMax-M2.7?tab=readme-ov-file. Additional information is also available on ModelScope at https://modelscope.cn/models/MiniMax/MiniMax-M2.7/summary.

The release of MiniMax M2.7 marks a significant milestone in AI development, particularly in multi-agent systems, and it underscores the growing trend of open-source AI models that are accessible to researchers and developers worldwide. As competition intensifies in the AI sector, MiniMax’s innovative approach may inspire further advancements in self-evolving models and their applications across various industries.

Written by AiPressa Staff

The AiPressa Staff team brings you comprehensive coverage of the artificial intelligence industry, including breaking news, research developments, business trends, and policy updates. Our mission is to keep you informed about the rapidly evolving world of AI technology.


© 2025 AIPressa · Part of Buzzora Media · All rights reserved.