MiniMax-M2.5 is a state-of-the-art large language model built for real-world productivity. Trained across a wide variety of complex digital work environments, M2.5 extends the coding strengths of M2.1 into broader office work: it is fluent in creating and manipulating Word, Excel, and PowerPoint files, switches context seamlessly between software tools, and collaborates in mixed teams of agents and humans. It scores 80.2% on SWE-Bench Verified, 51.3% on Multi-SWE-Bench, and 76.3% on BrowseComp. M2.5 is also more token-efficient than earlier generations, having been trained to plan effectively and so optimize both its actions and its outputs.