Llama 4 Scout 17B 16E Instruct (Free)

llama-4-scout-17b-16e-instruct:free
by meta-llama | Created May 25, 2025

Llama 4 Scout 17B Instruct (16E) is a mixture-of-experts (MoE) language model from Meta, activating 17 billion parameters out of a total of 109B. It supports native multimodal input (text and image) and multilingual text and code output across 12 supported languages. Designed for assistant-style interaction and visual reasoning, Scout uses a 16-expert MoE architecture, routing each token to a single expert alongside a shared expert, and features a context length of 10 million tokens with a training corpus of ~40 trillion tokens. Built for high efficiency and local or commercial deployment, it is instruction-tuned for multilingual chat, captioning, and image understanding.
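To make the routing concrete, the sketch below shows top-1 MoE routing with a shared expert in toy form. The dimensions, weights, and top-1 gating rule are illustrative assumptions, not Meta's actual implementation; it only demonstrates why a fraction of the total parameters (17B of 109B) is active for any given token.

```python
# Toy sketch of the MoE routing described above: every token passes through
# a shared expert, and a router picks exactly one of 16 routed experts for it.
# Sizes, weights, and the top-1 rule are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
HIDDEN = 64        # toy hidden size; the real model is far wider
NUM_EXPERTS = 16   # Scout's routed-expert count

experts = [rng.standard_normal((HIDDEN, HIDDEN)) * 0.02 for _ in range(NUM_EXPERTS)]
shared_expert = rng.standard_normal((HIDDEN, HIDDEN)) * 0.02
router = rng.standard_normal((HIDDEN, NUM_EXPERTS)) * 0.02

def moe_layer(tokens: np.ndarray) -> np.ndarray:
    logits = tokens @ router               # (n_tokens, NUM_EXPERTS) router scores
    chosen = logits.argmax(axis=-1)        # top-1: one routed expert per token
    out = tokens @ shared_expert           # the shared expert sees every token
    for i, e in enumerate(chosen):
        out[i] += tokens[i] @ experts[e]   # only the chosen expert's weights run
    return out

tokens = rng.standard_normal((8, HIDDEN))
print(moe_layer(tokens).shape)             # (8, 64): same shape, sparse compute
```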

Pricing

Pay-as-you-go rates for this model.

Free

Capabilities

Input Modalities

Text
Image

Output Modalities

Text
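As a quick way to exercise the modalities listed above (text and image in, text out), here is a minimal request sketch against an OpenAI-compatible chat completions endpoint. The base URL, API key environment variable, fully qualified model slug, and image URL are assumptions for illustration; consult the provider's API documentation for the authoritative values.

```python
# Minimal sketch: send text + image input, receive text output.
# The endpoint URL, API key env var, model slug, and image URL are assumptions.
import os
import requests

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",  # assumed endpoint
    headers={"Authorization": f"Bearer {os.environ['API_KEY']}"},
    json={
        "model": "meta-llama/llama-4-scout-17b-16e-instruct:free",  # slug per this page
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image in one sentence."},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])  # text-only output
```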