

Dolphin 2.6 Mixtral 8x7b

fireworks/dolphin-2p6-mixtral-8x7b

    Dolphin 2.6 Mixtral 8x7b is a fine-tuned version of the Mixtral 8x7B large language model, specialized for coding.
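
    As a quick orientation, the sketch below sends a coding prompt to this model through Fireworks' OpenAI-compatible chat completions API. The base URL and the fully qualified model name (accounts/fireworks/models/dolphin-2p6-mixtral-8x7b) are assumptions inferred from the model ID shown above; confirm both against the Fireworks docs before use.

```python
# Minimal sketch: querying Dolphin 2.6 Mixtral 8x7b via Fireworks'
# OpenAI-compatible endpoint. The base URL and the fully qualified
# model name are assumptions -- verify them in the Fireworks docs.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",
    api_key="YOUR_FIREWORKS_API_KEY",
)

response = client.chat.completions.create(
    model="accounts/fireworks/models/dolphin-2p6-mixtral-8x7b",
    messages=[
        {"role": "user", "content": "Write a Python function that merges two sorted lists."}
    ],
    max_tokens=512,
    temperature=0.2,
)
print(response.choices[0].message.content)
```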

    Dolphin 2.6 Mixtral 8x7b API Features

    On-demand Deployment

    On-demand deployments give you dedicated GPUs for Dolphin 2.6 Mixtral 8x7b using Fireworks' reliable, high-performance system with no rate limits.
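
    Once a dedicated deployment is live, requests go to the same chat completions endpoint; only the model string changes to point at your deployment. The "<model>#<deployment>" addressing used below, along with the placeholder account and deployment IDs, is an assumption; check the on-demand deployment docs for the exact format.

```python
# Minimal sketch: calling an on-demand (dedicated) deployment over the
# OpenAI-compatible REST endpoint with the requests library. The
# "<model>#<deployment>" addressing and the placeholder IDs are assumptions.
import requests

FIREWORKS_API_KEY = "YOUR_FIREWORKS_API_KEY"
MODEL = (
    "accounts/fireworks/models/dolphin-2p6-mixtral-8x7b"
    "#accounts/your-account/deployments/your-deployment-id"
)

resp = requests.post(
    "https://api.fireworks.ai/inference/v1/chat/completions",
    headers={"Authorization": f"Bearer {FIREWORKS_API_KEY}"},
    json={
        "model": MODEL,
        "messages": [
            {"role": "user", "content": "Refactor a nested for-loop into a list comprehension."}
        ],
        "max_tokens": 256,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```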

    Metadata

    State: Ready
    Created on: 2/29/2024
    Kind: Base model
    Provider: Cognitive Computations
    Hugging Face: dolphin-2.6-mixtral-8x7b

    Specification

    Calibrated: No
    Mixture-of-Experts: Yes
    Parameters: 46B

    Supported Functionality

    Fine-tuning: Not supported
    Serverless: Not supported
    Serverless LoRA: Supported
    Context Length: 32.8k tokens (see the prompt-budget sketch after this list)
    Function Calling: Not supported
    Embeddings: Not supported
    Rerankers: Not supported
    Image input: Not supported
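
    Because the context window is roughly 32.8k tokens, a simple client-side guard can verify that a prompt plus the reserved completion budget fits before a request is sent. The sketch below assumes the model's tokenizer is available on Hugging Face under cognitivecomputations/dolphin-2.6-mixtral-8x7b (the organization prefix is an assumption) and treats 32,768 tokens as the exact limit.

```python
# Minimal sketch: client-side prompt-budget check against the ~32.8k-token
# context window. The Hugging Face repo name and the exact 32,768 limit
# are assumptions; adjust to match the deployed model.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("cognitivecomputations/dolphin-2.6-mixtral-8x7b")

CONTEXT_LENGTH = 32_768   # "32.8k tokens" listed above, assumed to mean 2**15
MAX_NEW_TOKENS = 1_024    # room reserved for the model's completion

def fits_in_context(prompt: str) -> bool:
    """Return True if prompt tokens plus the completion budget fit in the window."""
    return len(tokenizer.encode(prompt)) + MAX_NEW_TOKENS <= CONTEXT_LENGTH

print(fits_in_context("Write a quicksort implementation in Python."))
```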