

Phi-3 Mini 128k Instruct

Phi-3-Mini-128K-Instruct is a 3.8-billion-parameter, lightweight, state-of-the-art open model trained on the Phi-3 datasets, which combine synthetic data with filtered publicly available web data, emphasizing high-quality, reasoning-dense content. The model belongs to the Phi-3 family; the Mini version comes in two variants, 4K and 128K, which refer to the context length (in tokens) the model can support.
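The model can be queried over Fireworks' OpenAI-compatible chat completions API. The sketch below is a minimal example; the endpoint URL and the model identifier are assumptions and should be confirmed against the Fireworks documentation.

```python
# Minimal sketch of querying Phi-3 Mini 128k Instruct on Fireworks.
# The endpoint URL and model identifier below are assumptions; verify
# both in the Fireworks docs before relying on them.
import os
import requests

API_URL = "https://api.fireworks.ai/inference/v1/chat/completions"  # assumed endpoint
MODEL_ID = "accounts/fireworks/models/phi-3-mini-128k-instruct"     # assumed identifier

response = requests.post(
    API_URL,
    headers={
        "Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}",
        "Content-Type": "application/json",
    },
    json={
        "model": MODEL_ID,
        "messages": [
            {"role": "user", "content": "Summarize the Phi-3 Mini model in one sentence."}
        ],
        "max_tokens": 256,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```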


Fireworks Features

On-demand Deployment

On-demand deployments give you dedicated GPUs for Phi-3 Mini 128k Instruct using Fireworks' reliable, high-performance system with no rate limits.


Info

Provider: Microsoft
Model Type: LLM
Context Length: 131,072 tokens
Pricing: $0.10 per 1M tokens
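As a rough illustration of the flat per-token rate, the sketch below estimates the cost of a request from its token counts. It assumes the $0.10 per 1M tokens rate applies uniformly to prompt and completion tokens.

```python
# Rough cost estimate at $0.10 per 1M tokens. Assumes the same rate
# applies to both prompt and completion tokens.
PRICE_PER_MILLION_TOKENS = 0.10

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    total_tokens = prompt_tokens + completion_tokens
    return total_tokens / 1_000_000 * PRICE_PER_MILLION_TOKENS

# Filling the full 131,072-token context window costs roughly $0.013.
print(f"${estimate_cost(100_000, 31_072):.4f}")
```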