Ministral 8B 2410
ministral-8b-2410
by mistralai | Created Oct 3, 2025
Ministral 8B is an 8B-parameter model with an interleaved sliding-window attention pattern for faster, more memory-efficient inference. Optimized for edge use cases, it supports up to 128k context length and delivers strong performance on knowledge and reasoning tasks. It outperforms other models in the sub-10B category, making it well suited to low-latency and privacy-focused applications.
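To make the attention pattern concrete, here is a minimal sketch of an interleaved sliding-window causal mask. The window size (4), sequence length (8), and even/odd layer alternation are illustrative assumptions, not Ministral's actual configuration:

```python
# Hypothetical sketch: alternate layers restrict each query to a local
# window of recent tokens, while the other layers keep full causal
# attention. Window size and layer pattern are illustrative only.

def attention_mask(seq_len, window):
    """mask[i][j] is True where query position i may attend to key j.
    window=None means full causal attention."""
    mask = []
    for i in range(seq_len):
        row = []
        for j in range(seq_len):
            causal = j <= i                              # never attend to the future
            in_window = window is None or i - j < window  # stay within the sliding window
            row.append(causal and in_window)
        mask.append(row)
    return mask

# Interleave: even layers use a sliding window, odd layers full causal context.
masks = [attention_mask(8, 4 if layer % 2 == 0 else None) for layer in range(2)]
```

Because only every other layer keeps the full-context mask, the key/value cache for the windowed layers stays bounded by the window size rather than the sequence length, which is where the memory savings come from.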
Pricing
Pay-as-you-go rates for this model.
Input Tokens (per 1M): $0.05
Output Tokens (per 1M): $0.05
Capabilities
Input Modalities: Text
Output Modalities: Text
Usage Analytics
Token usage across the last 30 active days