Ministral 8B 2410

ministral-8b-2410
by Mistral AI | Created Oct 3, 2025
Chat Completions

Ministral 8B is an 8B-parameter model with a unique interleaved sliding-window attention pattern for faster and more memory-efficient inference. Optimized for edge use cases, it supports a context length of up to 128k tokens and delivers strong performance on knowledge and reasoning tasks. Outperforming other models in the sub-10B category, it is ideal for low-latency and privacy-focused applications.
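
As a sketch of how the model might be called through a Chat Completions interface (assuming an OpenAI-compatible endpoint; the base URL and the API-key environment variable below are placeholders, not confirmed values for any particular provider):

    import os
    import requests

    # Placeholder endpoint and API-key variable (assumptions).
    BASE_URL = "https://api.example.com/v1"
    API_KEY = os.environ["PROVIDER_API_KEY"]

    # Minimal Chat Completions request to ministral-8b-2410.
    resp = requests.post(
        f"{BASE_URL}/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "ministral-8b-2410",
            "messages": [
                {"role": "user", "content": "Give one benefit of sliding-window attention."}
            ],
        },
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.json()["choices"][0]["message"]["content"])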

Pricing

Pay-as-you-go rates for this model. More details can be found here.

Input Tokens (1M): $0.05
Output Tokens (1M): $0.05
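
As a rough illustration of what these rates mean per request (the token counts here are hypothetical):

    # Back-of-the-envelope cost for a hypothetical request:
    # 4,000 input tokens and 500 output tokens at $0.05 per 1M tokens each.
    input_rate = 0.05 / 1_000_000   # USD per input token
    output_rate = 0.05 / 1_000_000  # USD per output token

    cost = 4_000 * input_rate + 500 * output_rate
    print(f"${cost:.6f}")  # -> $0.000225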

Capabilities

Input Modalities: Text

Output Modalities: Text

Supported Parameters

Available parameters for API requests; a request sketch using several of them follows the list.

Frequency Penalty
Max Completion Tokens
Parallel Tool Calls
Prediction
Presence Penalty
Response Format
Stop
Temperature
Tool Choice
Tools
Top P
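
A sketch of a request that exercises several of these parameters, including tool calling. The endpoint URL, the API-key environment variable, and the get_weather function are illustrative assumptions, and exact parameter names can vary by provider:

    import os
    import requests

    payload = {
        "model": "ministral-8b-2410",
        "messages": [
            {"role": "user", "content": "What's the weather in Paris right now?"}
        ],
        # Sampling and length controls from the list above.
        "temperature": 0.2,
        "top_p": 0.9,
        "max_completion_tokens": 256,
        "frequency_penalty": 0.0,
        "presence_penalty": 0.0,
        "stop": ["###"],
        # Tool calling: declare a hypothetical function the model may request.
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",
                    "description": "Look up the current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
        "tool_choice": "auto",
        "parallel_tool_calls": False,
    }

    resp = requests.post(
        "https://api.example.com/v1/chat/completions",  # placeholder endpoint
        headers={"Authorization": f"Bearer {os.environ['PROVIDER_API_KEY']}"},
        json=payload,
        timeout=60,
    )
    resp.raise_for_status()
    # With tools enabled, the assistant message may contain tool_calls instead of content.
    print(resp.json()["choices"][0]["message"])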