

StarCoder 7B

fireworks/starcoder-7b

    StarCoderBase-7B is a 7B-parameter base model trained on 1 trillion tokens spanning 80+ programming languages from The Stack (v1.2), with opt-out requests excluded. The model uses Multi-Query Attention, an 8,192-token context window, and was trained with the Fill-in-the-Middle objective.
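As a sketch of how the Fill-in-the-Middle training objective can be exercised at inference time, the snippet below posts a FIM-formatted prompt to Fireworks' OpenAI-compatible completions endpoint. The `<fim_prefix>`/`<fim_suffix>`/`<fim_middle>` sentinel tokens follow the StarCoder family's convention, and the full model path `accounts/fireworks/models/starcoder-7b` is an assumption inferred from the `fireworks/starcoder-7b` identifier above.

```python
import os
import requests

# Fireworks' OpenAI-compatible completions endpoint.
URL = "https://api.fireworks.ai/inference/v1/completions"

# FIM sentinel tokens used by the StarCoder model family: the model
# fills in the code between the prefix and the suffix.
prefix = "def fibonacci(n):\n    "
suffix = "\n    return a\n"
prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

resp = requests.post(
    URL,
    headers={"Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}"},
    json={
        # Assumed full model path for the fireworks/starcoder-7b identifier.
        "model": "accounts/fireworks/models/starcoder-7b",
        "prompt": prompt,
        "max_tokens": 64,
        "temperature": 0.2,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```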

    StarCoder 7B API Features

    On-demand Deployment


    On-demand deployments give you dedicated GPUs for StarCoder 7B using Fireworks' reliable, high-performance system with no rate limits.
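Once a dedicated deployment is live, it can be queried like any serverless Fireworks model. Below is a minimal sketch using the OpenAI Python SDK pointed at Fireworks' API surface; the model path is the same assumption as above, and a dedicated deployment may expose a deployment-specific identifier instead, so take the exact value from your deployment's details.

```python
from openai import OpenAI

# The OpenAI SDK works against Fireworks' OpenAI-compatible API.
client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",
    api_key="YOUR_FIREWORKS_API_KEY",  # a Fireworks key, not an OpenAI key
)

completion = client.completions.create(
    # Assumed model path; substitute your deployment's identifier if it differs.
    model="accounts/fireworks/models/starcoder-7b",
    prompt="# Python function to parse an ISO-8601 date\n",
    max_tokens=128,
)
print(completion.choices[0].text)
```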

    Metadata

    State: Ready
    Created on: 2/17/2024
    Kind: Base model
    Provider: BigCode
    Hugging Face: starcoderbase-7b

    Specification

    Calibrated: No
    Mixture-of-Experts: No
    Parameters: 7B

    Supported Functionality

    Fine-tuning: Not supported
    Serverless: Not supported
    Serverless LoRA: Supported
    Context length: 8,192 tokens
    Function calling: Not supported
    Embeddings: Not supported
    Rerankers: Not supported
    Image input: Not supported
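Because prompts that exceed the 8,192-token context window will be truncated or rejected, it can be useful to measure prompt length client-side before sending a request. The sketch below uses the Hugging Face tokenizer; the repo id `bigcode/starcoderbase-7b` is inferred from the Provider and Hugging Face fields above, and access to the repo may require accepting the model license.

```python
from transformers import AutoTokenizer

CONTEXT_LENGTH = 8192  # StarCoder 7B's context window

# Repo id inferred from the metadata above (Provider: BigCode,
# Hugging Face: starcoderbase-7b); may require a Hugging Face token.
tokenizer = AutoTokenizer.from_pretrained("bigcode/starcoderbase-7b")

def fits_in_context(prompt: str, max_new_tokens: int) -> bool:
    """Check that prompt tokens plus requested output fit in the window."""
    n_prompt = len(tokenizer(prompt)["input_ids"])
    return n_prompt + max_new_tokens <= CONTEXT_LENGTH

print(fits_in_context("def hello():\n    print('hi')\n", max_new_tokens=256))
```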