

DeepSeek Coder 1.3B Base

fireworks/deepseek-coder-1b-base

    DeepSeek Coder is a series of code language models, each trained from scratch on up to 2T tokens with a composition of 87% code and 13% natural language in both English and Chinese. deepseek-coder-1.3b-base is a 1.3B-parameter model with Multi-Head Attention, trained on 1 trillion tokens.
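    As a base model, it is queried through raw text completion rather than chat. Note that serverless inference is listed as not supported below, so a completion request assumes the model is already being served (via a fine-tuned Serverless LoRA adapter or an on-demand deployment). A minimal sketch against Fireworks' OpenAI-compatible inference API follows; the full model path "accounts/fireworks/models/deepseek-coder-1b-base" is inferred from the short ID above and should be checked against your model library.

```python
import os

import requests

# Sketch: raw code completion against DeepSeek Coder 1.3B Base.
# Assumes FIREWORKS_API_KEY is set and the inferred model path is correct.
resp = requests.post(
    "https://api.fireworks.ai/inference/v1/completions",
    headers={"Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}"},
    json={
        "model": "accounts/fireworks/models/deepseek-coder-1b-base",
        # Base models continue text; prompt with the code to be completed.
        "prompt": "def quicksort(arr):\n",
        "max_tokens": 128,
        "temperature": 0.2,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```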

    Fireworks Features

    Fine-tuning

    Docs

    DeepSeek Coder 1.3B Base can be customized with your data to improve responses. Fireworks uses low-rank adaptation (LoRA) to efficiently train and deploy your personalized model.
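    Once a fine-tuning job produces a LoRA adapter, Serverless LoRA (see Supported Functionality below) lets you query the adapter by its own model resource name. A hypothetical sketch; "accounts/my-account/models/my-coder-lora" is a placeholder for the name of your fine-tuned model:

```python
import os

import requests

# Hypothetical: query a fine-tuned LoRA adapter of DeepSeek Coder 1.3B Base.
# The model resource name below is a placeholder; substitute your own.
resp = requests.post(
    "https://api.fireworks.ai/inference/v1/completions",
    headers={"Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}"},
    json={
        "model": "accounts/my-account/models/my-coder-lora",  # placeholder
        "prompt": "# Parse a CSV row into a list of fields\n",
        "max_tokens": 128,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```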

    On-demand Deployment

    Docs

    On-demand deployments give you dedicated GPUs for DeepSeek Coder 1.3B Base using Fireworks' reliable, high-performance system with no rate limits.
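    A sketch of streaming tokens once a dedicated deployment is serving the model. It assumes the deployment has already been created (for example with Fireworks' firectl CLI, per the docs linked above) and that requests for the model path are routed to your dedicated GPUs:

```python
import json
import os

import requests

# Sketch: stream a completion as server-sent events (OpenAI-compatible API).
# Assumes an on-demand deployment is already serving the model path below.
with requests.post(
    "https://api.fireworks.ai/inference/v1/completions",
    headers={"Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}"},
    json={
        "model": "accounts/fireworks/models/deepseek-coder-1b-base",
        "prompt": "fn main() {\n",
        "max_tokens": 64,
        "stream": True,
    },
    stream=True,
    timeout=60,
) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        # SSE frames look like b"data: {...}", with a final b"data: [DONE]".
        if not line or not line.startswith(b"data: "):
            continue
        payload = line[len(b"data: "):]
        if payload == b"[DONE]":
            break
        chunk = json.loads(payload)
        print(chunk["choices"][0]["text"], end="", flush=True)
```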

    Metadata

    State: Ready
    Created on: 6/18/2024
    Kind: Base model
    Provider: Deepseek
    Hugging Face: deepseek-coder-1.3b-base

    Specification

    Calibrated: No
    Mixture-of-Experts: No
    Parameters: 1.3B

    Supported Functionality

    Fine-tuning: Supported
    Serverless: Not supported
    Serverless LoRA: Supported
    Context Length: 16.4k tokens (16,384)
    Function Calling: Not supported
    Embeddings: Not supported
    Rerankers: Not supported
    Image input: Not supported