

StarCoder2 7B

fireworks/starcoder2-7b

    StarCoder2-7B is a 7-billion-parameter model trained on 17 programming languages from The Stack v2, with opt-out requests excluded. The model uses Grouped Query Attention and a 16,384-token context window with a 4,096-token sliding attention window, and it was trained on 3.5+ trillion tokens using the Fill-in-the-Middle objective.
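
    Because the model was trained with the Fill-in-the-Middle objective, it can complete a span of code given both the surrounding prefix and suffix. Below is a minimal sketch using the Hugging Face transformers library; the <fim_prefix>/<fim_suffix>/<fim_middle> special tokens follow the published StarCoder2 tokenizer, and the bigcode/starcoder2-7b checkpoint id comes from the model's Hugging Face page.

    # Fill-in-the-Middle sketch for StarCoder2-7B.
    # Assumes the bigcode/starcoder2-7b checkpoint and its FIM special tokens.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    checkpoint = "bigcode/starcoder2-7b"
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto")

    # The model generates the code that belongs between prefix and suffix.
    prompt = (
        "<fim_prefix>def average(numbers):\n    "
        "<fim_suffix>\n    return total / len(numbers)<fim_middle>"
    )

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(inputs.input_ids, max_new_tokens=32)
    print(tokenizer.decode(outputs[0], skip_special_tokens=False))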

    StarCoder2 7B API Features

    On-demand Deployment

    On-demand deployments give you dedicated GPUs for StarCoder2 7B using Fireworks' reliable, high-performance system with no rate limits.
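
    As a sketch of how such a deployment might be queried, the snippet below calls Fireworks' OpenAI-compatible completions endpoint. The accounts/fireworks/models/starcoder2-7b model path and the endpoint URL are assumptions based on Fireworks' usual conventions; substitute the identifier of your own deployment and your API key.

    # Hedged sketch: querying a dedicated StarCoder2 7B deployment via
    # Fireworks' OpenAI-compatible completions API. Endpoint URL and model
    # path are assumptions; use your deployment's actual identifier.
    import os
    import requests

    response = requests.post(
        "https://api.fireworks.ai/inference/v1/completions",
        headers={"Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}"},
        json={
            "model": "accounts/fireworks/models/starcoder2-7b",
            "prompt": "def fibonacci(n):\n",
            "max_tokens": 64,
            "temperature": 0.2,
        },
    )
    response.raise_for_status()
    print(response.json()["choices"][0]["text"])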

    Metadata

    State: Ready
    Created on: 3/12/2024
    Kind: Base model
    Provider: BigCode
    Hugging Face: starcoder2-7b

    Specification

    Calibrated: No
    Mixture-of-Experts: No
    Parameters: 7.4B

    Supported Functionality

    Fine-tuning: Not supported
    Serverless: Not supported
    Serverless LoRA: Supported
    Context Length: 16.4k tokens
    Function Calling: Not supported
    Embeddings: Not supported
    Rerankers: Not supported
    Image input: Not supported