

Phi-3 Mini 128k Instruct

Phi-3-Mini-128K-Instruct is a 3.8 billion-parameter, lightweight, state-of-the-art open model trained on the Phi-3 datasets, which include both synthetic data and filtered publicly available website data, with an emphasis on high-quality, reasoning-dense properties. The model belongs to the Phi-3 family; the Mini version comes in two variants, 4K and 128K, which refer to the context length (in tokens) each can support.


Fireworks Features

On-demand Deployment

On-demand deployments give you dedicated GPUs for Phi-3 Mini 128k Instruct using Fireworks' reliable, high-performance system with no rate limits.
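Whether served on-demand or serverless, the model is queried through Fireworks' OpenAI-compatible chat completions endpoint. The sketch below builds such a request payload in Python; the model slug shown is an assumption based on Fireworks' usual naming convention, so check your account's model list before using it.

```python
import json

# Hypothetical model slug, following Fireworks' "accounts/fireworks/models/..."
# naming convention; verify the exact slug in your Fireworks dashboard.
MODEL = "accounts/fireworks/models/phi-3-mini-128k-instruct"

def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build the JSON payload for a POST to
    https://api.fireworks.ai/inference/v1/chat/completions
    (sent with an Authorization: Bearer <API_KEY> header)."""
    return {
        "model": MODEL,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("Summarize the Phi-3 family in one sentence.")
print(json.dumps(payload, indent=2))
```

Because the endpoint is OpenAI-compatible, the same payload also works with the official `openai` client pointed at Fireworks' base URL.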


Info

Provider

Microsoft

Model Type

LLM

Context Length

131072

Pricing Per 1M Tokens

$0.10