

DeepSeek Coder 7B Base

DeepSeek Coder is a series of code language models, each trained from scratch on 2T tokens with a composition of 87% code and 13% natural language in English and Chinese. DeepSeek Coder 6.7B Base is a 6.7B-parameter model with multi-head attention, trained on 2 trillion tokens using a 16K window size and an additional fill-in-the-blank (fill-in-the-middle) task.
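As a base (non-instruct) model, it is typically queried with plain completion or fill-in-the-middle prompts. The sketch below assumes Fireworks' OpenAI-compatible completions endpoint and a model ID of accounts/fireworks/models/deepseek-coder-7b-base; the FIM special tokens follow DeepSeek Coder's published prompt format. Verify both against the current docs before use.

```python
# Minimal sketch of querying DeepSeek Coder 7B Base through Fireworks'
# OpenAI-compatible completions endpoint. The model ID and the FIM
# special tokens below are assumptions drawn from public documentation.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",
    api_key=os.environ["FIREWORKS_API_KEY"],
)

# Fill-in-the-middle prompt: the model generates the code that belongs
# at the <fim_hole> position, conditioned on the prefix and suffix.
prompt = (
    "<｜fim▁begin｜>def quicksort(arr):\n"
    "    if len(arr) <= 1:\n"
    "        return arr\n"
    "<｜fim▁hole｜>"
    "    return quicksort(left) + mid + quicksort(right)<｜fim▁end｜>"
)

resp = client.completions.create(
    model="accounts/fireworks/models/deepseek-coder-7b-base",  # assumed ID
    prompt=prompt,
    max_tokens=128,
    temperature=0.2,
)
print(resp.choices[0].text)
```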


Fireworks Features

Fine-tuning

DeepSeek Coder 7B Base can be fine-tuned on your own data to improve responses. Fireworks uses LoRA to train and deploy your personalized model efficiently.
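Fireworks runs LoRA training as a managed service, but the sketch below illustrates the underlying technique with the open-source Hugging Face peft library: small low-rank adapter matrices are attached to the attention projections, so only a tiny fraction of the weights is trained. The rank, target modules, and hyperparameters here are illustrative assumptions, not Fireworks' actual configuration.

```python
# Illustrative LoRA setup with Hugging Face peft, not Fireworks' internal
# pipeline. Hyperparameters are placeholder assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "deepseek-ai/deepseek-coder-6.7b-base"  # public Hugging Face repo
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype=torch.bfloat16)

lora = LoraConfig(
    r=16,                  # rank of the low-rank update matrices
    lora_alpha=32,         # scaling factor applied to the update
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # only the adapter weights are trainable
```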


On-demand Deployment

On-demand deployments give you dedicated GPUs for DeepSeek Coder 7B Base using Fireworks' reliable, high-performance system with no rate limits.


Info & Pricing

Provider: DeepSeek
Model Type: LLM
Context Length: 4,096 tokens
Fine-Tuning: Available
Pricing: $0.20 per 1M tokens
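For a quick sense of what that rate means in practice, the snippet below estimates serverless cost for a workload; the token count is hypothetical.

```python
# Cost estimate at $0.20 per 1M tokens. The workload size is an
# illustrative assumption, not a measured figure.
PRICE_PER_M_TOKENS = 0.20
tokens = 250_000  # hypothetical prompt + completion tokens
cost = tokens / 1_000_000 * PRICE_PER_M_TOKENS
print(f"${cost:.2f}")  # -> $0.05
```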