Mixtral MoE 8x7B Instruct

Mixtral MoE 8x7B Instruct is the instruction-tuned version of Mixtral MoE 8x7B and supports the chat completions API.
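A minimal sketch of calling the model through a chat-completions-style endpoint. The endpoint URL and model identifier below are assumptions based on Fireworks' OpenAI-compatible API; verify both against the current documentation before use:

```python
import json

# Assumed endpoint and model ID -- confirm against Fireworks' docs.
API_URL = "https://api.fireworks.ai/inference/v1/chat/completions"
MODEL_ID = "accounts/fireworks/models/mixtral-8x7b-instruct"

def build_chat_request(api_key: str, user_message: str) -> tuple[dict, dict]:
    """Build the headers and JSON payload for a chat completions call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": 256,
    }
    return headers, payload

# Sending the request (requires a valid API key and network access):
# import urllib.request
# headers, payload = build_chat_request("YOUR_API_KEY", "Hello!")
# req = urllib.request.Request(API_URL, data=json.dumps(payload).encode(),
#                              headers=headers)
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The request body follows the standard chat completions shape (a `model` field plus a list of role-tagged `messages`), so OpenAI-compatible client libraries can be pointed at the same endpoint by overriding the base URL.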

Fireworks Features

On-demand Deployment

On-demand deployments give you dedicated GPUs for Mixtral MoE 8x7B Instruct using Fireworks' reliable, high-performance system with no rate limits.

Info & Pricing

Provider: Mistral

Model Type: LLM

Context Length: 32,768 tokens

Pricing: $0.50 per 1M tokens
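At $0.50 per 1M tokens, cost scales linearly with token count. A quick sketch of the arithmetic (the helper name is illustrative, not part of any API):

```python
PRICE_PER_MILLION_TOKENS = 0.50  # USD, from the pricing above

def estimate_cost(total_tokens: int) -> float:
    """Estimate USD cost for a given number of tokens processed."""
    return total_tokens / 1_000_000 * PRICE_PER_MILLION_TOKENS

# e.g. a workload of 4M tokens costs $2.00
print(f"${estimate_cost(4_000_000):.2f}")
```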