šŸš€ Dec 9, 2025

Devstral 2 & Mistral Vibe CLI

State-of-the-art, open-source agentic coding models and CLI agent

72.2%
SWE-bench Verified
123B
Parameters
7x
More Cost-Efficient
256K
Context Window

Devstral: the next generation of SOTA coding models

Devstral 2 is a 123B-parameter dense transformer supporting a 256K context window.

It reaches 72.2% on SWE-bench Verified, establishing it as one of the best open-weight models while remaining highly cost efficient.

Released under a modified MIT license, Devstral sets the open state-of-the-art for code agents. Devstral Small 2 scores 68.0% on SWE-bench Verified, placing it firmly among models up to five times its size.

āœ“
Compact & Efficient
5x and 28x smaller than DeepSeek V3.2, and 8x and 41x smaller than Kimi K2 (Devstral 2 and Small 2, respectively)
āœ“
Cost Efficient
Up to 7x more cost-efficient than Claude Sonnet on real-world tasks
→ Devstral 2
123B Parameters • Modified MIT License
→ Devstral Small 2
24B Parameters • Apache 2.0 License
→ Performance
72.2% / 68.0% SWE-bench Verified

Performance Benchmarks

Devstral 2 (123B) and Devstral Small 2 (24B) are 5x and 28x smaller than DeepSeek V3.2, respectively

72.2%
Devstral 2
SWE-bench Verified
123B params • Modified MIT
68.0%
Devstral Small 2
SWE-bench Verified
24B params • Apache 2.0
7x
Cost Efficient
vs Claude Sonnet
Real-world tasks
Model               SWE-bench Verified   Parameters   Context Window   License
Devstral 2          72.2%                123B         256K             Modified MIT
Devstral Small 2    68.0%                24B          256K             Apache 2.0

Compact Yet Powerful

āœ“
5x & 28x smaller than DeepSeek V3.2
Devstral 2 and Small 2 respectively
āœ“
8x & 41x smaller than Kimi K2
Practical deployment on limited hardware
šŸ’”

Human Evaluation Results

Devstral 2 shows a clear advantage over DeepSeek V3.2, with a 42.8% win rate versus 28.6% loss rate. Compact models can match or exceed the performance of much larger competitors, making deployment practical on limited hardware.

Built for production-grade workflows

Powerful capabilities for modern software engineering

šŸ¤–

Code Agent Excellence

Explores codebases and orchestrates changes across multiple files while maintaining architecture-level context

⚔

Smart Error Handling

Tracks framework dependencies, detects failures, and retries with corrections automatically

šŸ“

Multi-File Orchestration

Solves challenges like bug fixing and modernizing legacy systems across entire codebases

šŸ’°

Cost Effective

Currently free via API, with future pricing at $0.40/$2.00 per million tokens

šŸ”§

Mistral Vibe CLI

Native, open-source agent in your terminal solving software engineering tasks autonomously

šŸŽÆ

Customizable

Can be fine-tuned to prioritize specific languages or optimize for large enterprise codebases

šŸš€Apache 2.0 License

Mistral Vibe CLI

Open-source command-line coding assistant powered by Devstral. Explores, modifies, and executes changes across your codebase using natural language.

INSTALLATION
curl -LsSf https://mistral.ai/vibe/install.sh | bash
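
Once installed, start a session from the root of the project you want to work on. A minimal sketch, assuming the installer places a vibe command on your PATH (check the installer output for the exact command name):

cd my-project      # open the repository you want the agent to work on
vibe               # start an interactive session; the CLI picks up file structure and Git status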
šŸ“

Project-Aware Context

Automatically scans your file structure and Git status to provide relevant context

@

Smart References

Reference files with @ autocomplete, execute shell commands with !, and use slash commands for configuration (see the example session after these feature highlights)

šŸ”„

Multi-File Orchestration

Understands your entire codebase—not just the file you're editing—enabling architecture-level reasoning

šŸ“

Persistent History

Session history, autocompletion, and customizable themes for enhanced productivity

šŸ”Œ

IDE Integration

Vibe CLI is available as an extension in Zed, so you can use it directly inside your IDE. It can also be integrated into your preferred IDE via the Agent Communication Protocol.
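
An illustrative session using the reference syntax described above. The file path and the slash command shown are made-up examples for demonstration, not part of any real project or a documented command list:

!git status                                   # ! runs a shell command without leaving the session
Fix the failing test in @tests/test_auth.py   # @ references a file with autocomplete (hypothetical path)
/help                                         # slash commands handle configuration and session settings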

What Partners Say

Trusted by leading AI coding tools

šŸ’¬
Cline
"Devstral 2 is at the frontier of open-source coding models. In Cline, it delivers a tool-calling success rate on par with the best closed models; it's a remarkably smooth driver. This is a massive contribution to the open-source ecosystem."
šŸ’¬
Kilo Code
"Devstral 2 was one of our most successful stealth launches yet, surpassing 17B tokens in the first 24 hours. Mistral AI is moving at Kilo Speed with a cost-efficient model that truly works at scale."

Recommended Deployment

Flexible deployment options for different use cases

Devstral 2

Data Center Deployment

āœ“
GPU Requirements
Minimum 4 H100-class GPUs
āœ“
NVIDIA Platform
Available on build.nvidia.com
āœ“
Recommended Settings
Temperature: 0.2 for optimal performance
āœ“
NVIDIA NIM Support
Coming soon
Devstral Small 2

Local & Edge Deployment

āœ“
Single-GPU Operation
Runs on consumer-grade GPUs
āœ“
CPU Support
No dedicated GPU required
āœ“
NVIDIA Systems
DGX Spark and GeForce RTX supported
āœ“
Fast Inference
Tight feedback loops and easy customization
šŸ’”

Image Input Support

Devstral Small 2 supports image inputs and can power multimodal agents, making it versatile for a wide range of applications beyond pure code generation.

Get Started

Devstral 2 is currently offered free via API

1

Mistral Vibe CLI

Open-source command-line coding assistant powered by Devstral

curl -LsSf https://mistral.ai/vibe/install.sh | bash
2

API Access

Use Devstral 2 via API with leading coding tools

āœ“ Currently free via API
āœ“ Integrated with Kilo Code and Cline
āœ“ Future pricing: $0.40/$2.00 per million tokens
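
A sketch of what a raw API request could look like once you have a key. The endpoint and headers are Mistral's standard chat-completions API, but the model identifier below is a placeholder, so check the documentation for the exact Devstral 2 name. Temperature 0.2 follows the recommended setting above.

# model identifier is a placeholder; use the exact name from the Mistral docs
curl https://api.mistral.ai/v1/chat/completions \
  -H "Authorization: Bearer $MISTRAL_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "devstral-2",
        "messages": [{"role": "user", "content": "Add input validation to the signup handler."}],
        "temperature": 0.2
      }'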
3

Self-Hosted Deployment

Deploy on your own infrastructure with NVIDIA support

āœ“ Requires 4 H100-class GPUs minimum (Devstral 2)
āœ“ Single-GPU operation (Devstral Small 2)
āœ“ Available on build.nvidia.com
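
For self-hosted deployment, one common route is an OpenAI-compatible inference server such as vLLM. A minimal sketch, assuming the weights are published on Hugging Face (the repository name below is a placeholder; use the official model card) and that four H100-class GPUs are available, per the recommendation above:

# repository name is a placeholder; substitute the official Devstral 2 weights repo
vllm serve mistralai/Devstral-2 \
  --tensor-parallel-size 4    # shard the 123B model across four H100-class GPUs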

Integrated with Leading Tools

Kilo Code
Cline
Zed
NVIDIA

Frequently Asked Questions

Find answers to common questions about Devstral 2