
Phi-2 3B

Phi-2 is a Transformer-based model with 2.7 billion parameters. It was trained on the same data sources as Phi-1.5, augmented with a new data source consisting of various synthetic NLP texts and websites filtered for safety and educational value. On benchmarks testing common sense, language understanding, and logical reasoning, Phi-2 demonstrates nearly state-of-the-art performance among models with fewer than 13 billion parameters.
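As a quick way to try the model, the sketch below sends a request to Fireworks' OpenAI-compatible serverless completions endpoint. The model identifier and the Instruct/Output prompt format are assumptions based on common usage, so confirm the exact values on the model page.

```python
# Minimal sketch: query Phi-2 3B via Fireworks' OpenAI-compatible
# completions endpoint. The model identifier below is an assumption;
# check the model page for the exact name.
import os
import requests

API_KEY = os.environ["FIREWORKS_API_KEY"]  # your Fireworks API key

resp = requests.post(
    "https://api.fireworks.ai/inference/v1/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "model": "accounts/fireworks/models/phi-2-3b",  # assumed identifier
        "prompt": "Instruct: Explain self-attention in one sentence.\nOutput:",
        "max_tokens": 128,
        "temperature": 0.2,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```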


Fireworks Features

On-demand Deployment

On-demand deployments give you dedicated GPUs for Phi-2 3B using Fireworks' reliable, high-performance system with no rate limits.


Info

Provider: Microsoft
Model Type: LLM
Context Length: 2,048 tokens
Pricing: $0.10 per 1M tokens
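To make the serverless pricing concrete, here is a small cost estimator based on the $0.10 per 1M token rate above; the token counts in the usage example are hypothetical.

```python
# Illustrates the pricing above: at $0.10 per 1M tokens, cost scales
# linearly with total (prompt + completion) tokens.
PRICE_PER_MILLION_TOKENS = 0.10  # USD, from the pricing table above

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Return the estimated USD cost for one request."""
    total_tokens = prompt_tokens + completion_tokens
    return total_tokens / 1_000_000 * PRICE_PER_MILLION_TOKENS

# Example: a 1,500-token prompt with a 500-token completion costs $0.0002.
print(f"${estimate_cost(1_500, 500):.4f}")
```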