
DeepSeek Coder 7B Base v1.5

The DeepSeek Coder 7B Base v1.5 LLM is pre-trained from DeepSeek 7B on 2T tokens, using a 4K window size and a next-token-prediction objective.

Try Model
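As a base (non-chat) model, it is queried with plain text completion rather than a chat template. Below is a minimal sketch of building a request body for an OpenAI-compatible completions endpoint such as Fireworks'; the model identifier shown is an assumption, so check the model page for the exact id before use.

```python
import json

# Assumed model id -- verify against the Fireworks model page.
MODEL_ID = "accounts/fireworks/models/deepseek-coder-7b-base-v1p5"

def build_completion_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build a JSON body for an OpenAI-compatible /v1/completions call.

    Base models continue the prompt via next-token prediction, so the
    prompt is sent as-is with no chat formatting.
    """
    return {
        "model": MODEL_ID,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.2,
    }

body = build_completion_request("def fibonacci(n):")
print(json.dumps(body, indent=2))
```

The body would then be POSTed to the completions endpoint with an API key in the `Authorization` header.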

Fireworks Features

Fine-tuning

DeepSeek Coder 7B Base v1.5 can be customized with your data to improve responses. Fireworks uses LoRA to efficiently train and deploy your personalized model.

Learn More
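LoRA keeps the pre-trained weights frozen and trains only a pair of small low-rank matrices per layer; at serve time their product is added to the frozen weight. A toy pure-Python sketch of the idea (illustrative only, not Fireworks' implementation -- all dimensions and values here are made up):

```python
# Toy dimensions: a d x k weight matrix with LoRA rank r (in practice r << d, k).
d, k, r = 4, 4, 1
alpha = 16  # LoRA scaling hyperparameter

def matmul(M, N):
    """Multiply two matrices given as lists of rows."""
    return [[sum(M[i][t] * N[t][j] for t in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

def madd(M, N, scale=1.0):
    """Elementwise M + scale * N."""
    return [[M[i][j] + scale * N[i][j] for j in range(len(M[0]))]
            for i in range(len(M))]

# Frozen pretrained weight (identity here just for illustration).
W = [[1.0 if i == j else 0.0 for j in range(k)] for i in range(d)]
A = [[0.1] * k for _ in range(r)]  # trainable down-projection, r x k
B = [[0.0] for _ in range(d)]      # trainable up-projection, d x r; zeros => no initial change

def adapted_weight():
    # Effective weight: W + (alpha / r) * B @ A -- only A and B are trained.
    return madd(W, matmul(B, A), scale=alpha / r)

# With B all zeros the adapted model is exactly the base model.
assert adapted_weight() == W
```

Since only A and B are updated, the trainable parameter count drops from d*k to r*(d+k), which is what makes per-customer fine-tuning and adapter deployment cheap.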

On-demand Deployment

On-demand deployments give you dedicated GPUs for DeepSeek Coder 7B Base v1.5, served on Fireworks' reliable, high-performance infrastructure with no rate limits.

Learn More

Info

Provider

DeepSeek

Model Type

LLM

Context Length

4096

Fine-Tuning

Available

Pricing Per 1M Tokens

$0.20