

Phi-2 3B

Ready
fireworks/phi-2-3b

    Phi-2 is a Transformer with 2.7 billion parameters. It was trained on the same data sources as Phi-1.5, augmented with a new source consisting of various synthetic NLP texts and websites filtered for safety and educational value. On benchmarks testing common sense, language understanding, and logical reasoning, Phi-2 demonstrated nearly state-of-the-art performance among models with fewer than 13 billion parameters.

    Phi-2 3B API Features

    On-demand Deployment

    Docs

    On-demand deployments give you dedicated GPUs for Phi-2 3B using Fireworks' reliable, high-performance system with no rate limits.
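    As a minimal sketch of what a request looks like, the Python snippet below calls Fireworks' OpenAI-compatible completions endpoint (Phi-2 is a base model, so it uses plain text completion rather than chat). The `FIREWORKS_API_KEY` environment variable and the prompt are illustrative; with an on-demand deployment, you would substitute your own deployment-qualified model identifier per the linked docs. The `Instruct:`/`Output:` prompt format follows Microsoft's Phi-2 model card.

```python
# Sketch: text completion against Fireworks' OpenAI-compatible API.
# Assumes an API key in FIREWORKS_API_KEY; the model path follows the
# accounts/<account>/models/<model> convention used by the API.
import os
import requests

API_URL = "https://api.fireworks.ai/inference/v1/completions"

payload = {
    # Base model: use the completions endpoint, not chat completions.
    "model": "accounts/fireworks/models/phi-2-3b",
    "prompt": "Instruct: Explain what a transformer is.\nOutput:",
    "max_tokens": 128,
    "temperature": 0.7,
}

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```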

    Metadata

    State
    Ready
    Created on
    2/27/2024
    Kind
    Base model
    Provider
    Microsoft
    Hugging Face
    phi-2

    Specification

    Calibrated
    No
    Mixture-of-Experts
    No
    Parameters
    2.7B

    Supported Functionality

    Fine-tuning
    Not supported
    Serverless
    Not supported
    Serverless LoRA
    Supported
    Context Length
    2k tokens (see the token-budget sketch after this list)
    Function Calling
    Not supported
    Embeddings
    Not supported
    Rerankers
    Not supported
    Image input
    Not supported
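
    Because the context window is 2k tokens, a long prompt must leave room for the completion. The sketch below trims a prompt so that prompt plus generation stays within 2,048 tokens, using the microsoft/phi-2 tokenizer from Hugging Face; the `fit_prompt` helper and the 256-token generation budget are illustrative choices, not part of the Fireworks API.

```python
# Sketch: keep prompt + generated tokens within Phi-2's 2k context window.
# Uses the microsoft/phi-2 tokenizer from Hugging Face; 2048 corresponds
# to the "2k tokens" context length listed above.
from transformers import AutoTokenizer

CONTEXT_LIMIT = 2048
MAX_NEW_TOKENS = 256  # illustrative generation budget

tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2")

def fit_prompt(prompt: str) -> str:
    """Truncate the prompt (keeping its tail) so generation fits the window."""
    budget = CONTEXT_LIMIT - MAX_NEW_TOKENS
    ids = tokenizer.encode(prompt)
    if len(ids) <= budget:
        return prompt
    # Keep the most recent tokens; earlier context is dropped.
    return tokenizer.decode(ids[-budget:])
```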