Mixtral 8x7B

Mistral

An 8x7B sparse Mixture-of-Experts (SMoE) model: each token is routed to 2 of 8 expert feed-forward networks, using 12.9B active parameters out of 46.7B total.
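Mixtral routes each token to 2 of its 8 experts, which is why the active parameter count is far below the total. The routing idea can be sketched as follows (a minimal NumPy sketch with toy shapes and illustrative names, not Mixtral's actual implementation):

```python
import numpy as np

def top2_moe_layer(x, gate_w, experts):
    """Sketch of sparse top-2 MoE routing (hypothetical shapes).

    x       : (d,) token hidden state
    gate_w  : (d, n_experts) router weights
    experts : list of callables, one per expert FFN

    Only the two highest-scoring experts run per token, so the
    parameters actually used stay well below the total count.
    """
    logits = x @ gate_w                     # router scores, shape (n_experts,)
    top2 = np.argsort(logits)[-2:]          # indices of the two best experts
    weights = np.exp(logits[top2])
    weights /= weights.sum()                # softmax over the selected pair
    return sum(w * experts[i](x) for w, i in zip(weights, top2))

# Toy usage: 8 stand-in "experts" that just scale their input
rng = np.random.default_rng(0)
experts = [lambda v, s=s: v * s for s in range(1, 9)]
x = rng.standard_normal(4)
gate_w = rng.standard_normal((4, 8))
y = top2_moe_layer(x, gate_w, experts)
```

Only the selected experts' weights participate in the forward pass; the router itself is a dense, cheap projection.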

Capabilities

Tool Use

Image Input

PDF Input

Technical Specifications

Context Window: 32,768 tokens

Max Output: 32,768 tokens

Pricing

Token Costs (per 1M tokens)

Cache Miss Input: $0.70

Non-Reasoning Output: $0.70

Tool Costs (per 1K calls)

Web Search: $15

Code Execution: $0.19
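Per-request cost under the rates above is straightforward arithmetic; a small sketch (the function name and the example call counts are illustrative):

```python
def request_cost_usd(input_tokens, output_tokens,
                     web_searches=0, code_executions=0):
    """Estimate request cost from the listed rates."""
    INPUT_PER_M = 0.70    # cache-miss input, per 1M tokens
    OUTPUT_PER_M = 0.70   # non-reasoning output, per 1M tokens
    SEARCH_PER_K = 15.0   # web search, per 1K calls
    EXEC_PER_K = 0.19     # code execution, per 1K calls
    return (input_tokens / 1e6 * INPUT_PER_M
            + output_tokens / 1e6 * OUTPUT_PER_M
            + web_searches / 1e3 * SEARCH_PER_K
            + code_executions / 1e3 * EXEC_PER_K)

# e.g. 100K input tokens + 20K output tokens + one web search:
cost = request_cost_usd(100_000, 20_000, web_searches=1)  # ≈ $0.099
```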

Legacy

Made legacy on

Reason: Old small MoE model; superseded by Mixtral 8x22B

Recommended Replacement: Mistral Large