
Phi-3 Mini 128k Instruct

Ready
fireworks/phi-3-mini-128k-instruct

    Phi-3-Mini-128K-Instruct is a 3.8 billion-parameter, lightweight, state-of-the-art open model trained on the Phi-3 datasets, which combine synthetic data and filtered publicly available website data, with an emphasis on high-quality, reasoning-dense content. The model belongs to the Phi-3 family; the Mini version comes in two variants, 4K and 128K, denoting the context length (in tokens) each can support.

    Fireworks Features

    On-demand Deployment

    Docs

    On-demand deployments give you dedicated GPUs for Phi-3 Mini 128k Instruct using Fireworks' reliable, high-performance system with no rate limits.
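    Once a deployment is live, it can be queried like any Fireworks model. The sketch below is a minimal, hedged example of assembling a request for Fireworks' OpenAI-compatible chat completions endpoint; the endpoint URL and the `accounts/fireworks/models/...` model identifier are assumptions based on Fireworks' published conventions, so check the Docs link above for the exact values for your deployment.

    ```python
    import json

    # Assumed values (verify against the Fireworks docs for your deployment):
    API_URL = "https://api.fireworks.ai/inference/v1/chat/completions"
    MODEL_ID = "accounts/fireworks/models/phi-3-mini-128k-instruct"

    def build_request(prompt: str, max_tokens: int = 256) -> dict:
        """Assemble the JSON body for a chat completions call."""
        return {
            "model": MODEL_ID,
            "max_tokens": max_tokens,
            "messages": [{"role": "user", "content": prompt}],
        }

    payload = build_request("Summarize the Phi-3 model family in one sentence.")
    print(json.dumps(payload, indent=2))

    # Sending it requires an API key, e.g. with the standard library:
    #   import urllib.request
    #   req = urllib.request.Request(
    #       API_URL,
    #       data=json.dumps(payload).encode(),
    #       headers={"Authorization": "Bearer <FIREWORKS_API_KEY>",
    #                "Content-Type": "application/json"},
    #   )
    #   with urllib.request.urlopen(req) as resp:
    #       print(json.load(resp)["choices"][0]["message"]["content"])
    ```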

    Metadata

    State
    Ready
    Created on
    5/24/2024
    Kind
    Base model
    Provider
    Microsoft
    Hugging Face
    Phi-3-mini-128k-instruct

    Specification

    Calibrated
    No
    Mixture-of-Experts
    No
    Parameters
    3.8B

    Supported Functionality

    Fine-tuning
    Not supported
    Serverless
    Not supported
    Serverless LoRA
    Supported
    Context Length
    131.1k tokens
    Function Calling
    Not supported
    Embeddings
    Not supported
    Rerankers
    Not supported
    Support image input
    Not supported
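    The 131.1k figure above is simply the model's 128K context window expressed in decimal thousands, assuming the conventional binary "K" of 1,024 tokens:

    ```python
    # 128K context window counted in 1K = 1024-token units,
    # then expressed in decimal thousands as shown in the spec above.
    context_tokens = 128 * 1024
    print(context_tokens)                  # 131072
    print(f"{context_tokens / 1000:.1f}k") # 131.1k
    ```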