

Dolphin 2.6 Mixtral 8x7b

accounts/fireworks/models/dolphin-2p6-mixtral-8x7b

    Dolphin 2.6 Mixtral 8x7b is a fine-tuned version of the Mixtral 8x7b large language model, specialized for coding tasks.

    Dolphin 2.6 Mixtral 8x7b API Features

    On-demand Deployment


    On-demand deployments let you run Dolphin 2.6 Mixtral 8x7b on dedicated GPUs using Fireworks' high-performance serving stack, with high reliability and no rate limits.
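    Once a deployment is running, the model can be queried through Fireworks' OpenAI-compatible chat completions endpoint using the model ID shown above. The sketch below uses only the Python standard library; `FIREWORKS_API_KEY` is a placeholder environment variable, and the endpoint URL reflects Fireworks' documented inference API.

    ```python
    # Minimal sketch of a single-turn chat completion against Fireworks'
    # OpenAI-compatible endpoint, assuming an on-demand deployment of
    # Dolphin 2.6 Mixtral 8x7b is live on your account.
    import json
    import os
    import urllib.request

    API_URL = "https://api.fireworks.ai/inference/v1/chat/completions"
    MODEL_ID = "accounts/fireworks/models/dolphin-2p6-mixtral-8x7b"

    def build_request(prompt: str, max_tokens: int = 256) -> urllib.request.Request:
        """Assemble the HTTP request for one user message."""
        payload = {
            "model": MODEL_ID,
            "max_tokens": max_tokens,
            "messages": [{"role": "user", "content": prompt}],
        }
        headers = {
            "Content-Type": "application/json",
            # FIREWORKS_API_KEY is a placeholder; set it in your environment.
            "Authorization": f"Bearer {os.environ.get('FIREWORKS_API_KEY', '')}",
        }
        return urllib.request.Request(
            API_URL, data=json.dumps(payload).encode("utf-8"), headers=headers
        )

    if __name__ == "__main__":
        # Sends a real request; requires a valid key and a running deployment.
        req = build_request("Write a Python function that reverses a string.")
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
            print(body["choices"][0]["message"]["content"])
    ```

    Because the model does not support function calling or image input (see the functionality table below), requests should contain plain text messages only.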

    Metadata

    State: Ready
    Created on: 2/29/2024
    Kind: Base model
    Provider: Cognitive Computations

    Specification

    Calibrated: No
    Mixture-of-Experts: Yes
    Parameters: 46B

    Supported Functionality

    Fine-tuning: Not supported
    Serverless: Not supported
    Context length: 32.7k tokens
    Function calling: Not supported
    Embeddings: Not supported
    Rerankers: Not supported
    Image input: Not supported