Moonshot AI Releases Kimi K2.5: Multimodal Open-Source Model with Coding Agent
Chinese AI company releases a natively multimodal model trained on 15 trillion tokens, with agent swarm orchestration capabilities.
Moonshot AI, backed by Alibaba and HongShan (formerly Sequoia China), has released Kimi K2.5, a new open-source multimodal model that understands text, images, and video.
Model Training
The model was trained on 15 trillion mixed visual and text tokens, making it natively multimodal. It performs well on coding tasks and on agent swarm orchestration, in which multiple agents work together on a shared task.
Benchmark Performance
Kimi K2.5 matches its proprietary peers on released benchmarks and even beats them on certain tasks. Its coding benchmark results are particularly strong, positioning the model as a competitive option for developers.
Open Source Availability
The release continues Moonshot AI's pattern of open-sourcing competitive models, challenging the assumption that cutting-edge AI must come exclusively from U.S. companies.
Related Articles
NVIDIA GTC 2026 Keynote: Jensen Huang Unveils Vera Rubin Platform and Six New Chips
NVIDIA CEO Jensen Huang opened GTC 2026 in San Jose with the formal unveiling of the complete Vera Rubin GPU platform — six new chips featuring 288 GB of HBM4 memory, 336 billion transistors, and 50 PetaFLOPS of FP4 performance. Over 30,000 attendees from 190 countries gathered for the AI industry's most anticipated annual event.
OpenAI Acquires Promptfoo to Strengthen AI Agent Security and Red-Teaming
OpenAI has agreed to acquire Promptfoo, the open-source AI security and red-teaming platform used by over 25% of the Fortune 500, in a deal that will integrate the tool directly into OpenAI's enterprise agent platform. The acquisition signals OpenAI's growing focus on safety infrastructure as it pushes deeper into autonomous AI agent deployment.
NVIDIA Releases Nemotron 3 Super: Open 120B-Parameter Model Targets Enterprise Agentic AI
NVIDIA has released Nemotron 3 Super, a 120-billion-parameter open-weights model built on a hybrid Mamba-Transformer architecture with a one-million-token context window. The model delivers 5x throughput improvements over its predecessor and is designed specifically for enterprise agentic AI workflows.