Ministral 8B 2410

ministral-8b-2410
by Mistral AI | Created Oct 3, 2025
Chat Completions
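The model is served through a Chat Completions endpoint. As a minimal sketch, the request body for this model might look like the following; the field names follow the common OpenAI-style Chat Completions schema, which is an assumption here rather than something this page specifies.

```python
# Hypothetical Chat Completions request body for ministral-8b-2410.
# Endpoint URL and authentication are omitted; only the payload shape
# (an assumption based on OpenAI-compatible APIs) is shown.
import json

payload = {
    "model": "ministral-8b-2410",
    "messages": [
        {"role": "user", "content": "Summarize sliding-window attention in one sentence."}
    ],
    "max_tokens": 128,
}

# Serialize for an HTTP POST to the provider's chat/completions route.
body = json.dumps(payload)
```

In practice this body would be sent with an `Authorization` header to whatever base URL your provider exposes.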

Ministral 8B is an 8-billion-parameter model with an interleaved sliding-window attention pattern that makes inference faster and more memory-efficient. Optimized for edge use cases, it supports a context length of up to 128k tokens and delivers strong performance on knowledge and reasoning tasks. Outperforming other models in the sub-10B category, it is well suited to low-latency, privacy-focused applications.
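The memory savings come from restricting each token's attention to a recent window. A minimal sketch of such a windowed causal mask is below; the window size and the interleaving pattern (e.g. alternating windowed and full-attention layers) are assumptions for illustration, since this page does not specify them.

```python
# Sketch of a sliding-window causal attention mask: token i may attend
# only to tokens j with i - window < j <= i. The exact window size and
# how windowed layers interleave with others are assumptions here.
import numpy as np

def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
    """Boolean (seq_len, seq_len) mask: True where attention is allowed."""
    i = np.arange(seq_len)[:, None]  # query positions (rows)
    j = np.arange(seq_len)[None, :]  # key positions (columns)
    return (j <= i) & (j > i - window)

mask = sliding_window_mask(8, window=3)
# Each row allows at most `window` keys, so per-token attention cost and
# KV-cache reads scale with the window rather than the full sequence.
```

This is why a 128k context remains tractable on edge hardware: windowed layers keep attention cost per token bounded by the window size rather than the context length.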
