Review Ministral 8B 2410 on key metrics including price, context length, throughput, and model features.
Author: Mistral AI
Context Length: 131.1k
Supports Tools: Yes
Ministral 8B is an 8B-parameter model with an interleaved sliding-window attention pattern for faster, more memory-efficient inference. Optimized for edge use cases, it supports up to 128k context length and delivers strong performance on knowledge and reasoning tasks. It outperforms other models in the sub-10B category, making it well suited to low-latency and privacy-focused applications.
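To see why sliding-window attention saves memory, here is a minimal sketch of a causal sliding-window mask in NumPy. The window size and sequence length are illustrative only, not Ministral's actual configuration, and this shows the basic windowed pattern rather than the model's specific interleaving scheme:

```python
import numpy as np

def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
    """Boolean mask: True where query position i may attend to key position j.

    Each query attends causally (j <= i) but only within the last
    `window` positions (i - j < window), so each row has at most
    `window` valid keys instead of up to `seq_len`.
    """
    i = np.arange(seq_len)[:, None]  # query positions (column vector)
    j = np.arange(seq_len)[None, :]  # key positions (row vector)
    return (j <= i) & (i - j < window)

mask = sliding_window_mask(seq_len=6, window=3)
# Per-row attention cost is bounded by the window, so KV-cache and
# attention memory scale as O(seq_len * window) rather than O(seq_len**2).
```

With full causal attention a 6-token sequence would allow up to 21 query-key pairs; with a window of 3, each query sees at most 3 keys, which is what makes long contexts cheaper at inference time.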