Llama 4 Scout

What is Llama 4 Scout?

Llama 4 Scout 17B Instruct (16E) is a mixture-of-experts (MoE) language model developed by Meta. It activates 17 billion parameters per token out of 109 billion total, spread across 16 experts. It supports native multimodal input...

Specifications

  • Developer: Meta
  • Context window: 327.7K tokens
  • Max output: 16.4K tokens
  • Input modalities: text, image
  • Output modalities: text
  • Input price: $0.08 per 1M tokens
  • Output price: $0.30 per 1M tokens
  • Knowledge cutoff: 2024-08-31
  • Supported parameters: frequency_penalty, logit_bias, max_tokens, min_p, presence_penalty, repetition_penalty, response_format, seed, stop, structured_outputs, temperature, tool_choice, tools, top_k, top_p
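As a rough sketch of how the listed prices and sampling parameters come together in a request, the snippet below builds a payload using a subset of the supported parameters and estimates per-request cost from the per-million-token prices above. The model identifier and payload field names are illustrative assumptions, not a specific provider's API.

```python
# Pricing from the specs above: $0.08 per 1M input tokens, $0.30 per 1M output tokens.
INPUT_PRICE_PER_M = 0.08
OUTPUT_PRICE_PER_M = 0.30

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request at the listed prices."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Illustrative request payload using parameters from the supported list.
# "llama-4-scout" is a placeholder model ID; check your provider's catalog.
payload = {
    "model": "llama-4-scout",
    "max_tokens": 2_000,      # must stay within the 16.4K output cap
    "temperature": 0.7,
    "top_p": 0.9,
    "seed": 42,
}

# Example: a 10,000-token prompt that yields a 2,000-token completion.
cost = estimate_cost(10_000, 2_000)
print(f"${cost:.4f}")  # (10,000 * 0.08 + 2,000 * 0.30) / 1M = $0.0014
```

At these prices, even a prompt that fills most of the 327.7K-token context window costs only a few cents, which is the main appeal of a small-activation MoE model like Scout.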

Use Llama 4 Scout with PromptLayer

PromptLayer lets teams manage, evaluate, and observe prompts that run on Llama 4 Scout alongside every other model in their stack. Version prompts, run evals across models, and ship safe rollouts from the same dashboard.

Ready to try it yourself? Sign up for PromptLayer and start managing your prompts in minutes.
