DeepSeek Coder 1.3B Base

DeepSeek Coder is a series of code language models, each trained from scratch on 2T tokens with a composition of 87% code and 13% natural language in both English and Chinese. deepseek-coder-1.3b-base is the 1.3B-parameter base model in the series; it uses Multi-Head Attention and was trained on 1 trillion tokens.

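As a minimal sketch, the base model can be queried through Fireworks' OpenAI-compatible completions endpoint (base models use text completions rather than chat). The model identifier and the FIREWORKS_API_KEY environment variable below are assumptions; check the Fireworks model library for the exact slug.

```python
import os
import requests

# Minimal sketch of a completion request against Fireworks'
# OpenAI-compatible REST API. The model slug is an assumption;
# confirm the exact identifier on the model page.
API_URL = "https://api.fireworks.ai/inference/v1/completions"
MODEL = "accounts/fireworks/models/deepseek-coder-1-3b-base"  # assumed slug

response = requests.post(
    API_URL,
    headers={
        "Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}",
        "Content-Type": "application/json",
    },
    json={
        "model": MODEL,
        "prompt": "# Python function that checks whether a number is prime\n",
        "max_tokens": 96,
        "temperature": 0.2,
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["text"])
```

Since deepseek-coder-1.3b-base is a base (non-instruct) model, plain code-continuation prompts like the one above tend to work better than conversational instructions.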

Fireworks Features

On-demand Deployment

On-demand deployments give you dedicated GPUs for DeepSeek Coder 1.3B Base using Fireworks' reliable, high-performance system with no rate limits.
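Once a dedicated deployment is live, requests go through the same completions API; the only change is that the model field targets your deployment. The sketch below assumes Fireworks' deployment-qualified "<model>#<deployment>" addressing, and the account and deployment IDs are placeholders; verify the exact form against the current Fireworks deployment docs.

```python
import os
import requests

# Hypothetical sketch: routing a request to a dedicated on-demand
# deployment rather than the shared serverless pool. The
# "<model>#<deployment>" identifier form is an assumption; the
# account and deployment IDs below are placeholders.
DEPLOYED_MODEL = (
    "accounts/fireworks/models/deepseek-coder-1-3b-base"
    "#accounts/your-account/deployments/your-deployment-id"
)

resp = requests.post(
    "https://api.fireworks.ai/inference/v1/completions",
    headers={"Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}"},
    json={"model": DEPLOYED_MODEL, "prompt": "def quicksort(arr):\n", "max_tokens": 64},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```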


Info

Provider: DeepSeek
Model Type: LLM
Context Length: 16,384 tokens
Pricing: $0.10 per 1M tokens