DeepSeek Coder 33B Instruct

DeepSeek Coder is a family of code language models, each trained from scratch on 2T tokens with a composition of 87% code and 13% natural language in both English and Chinese. deepseek-coder-33b-instruct is a 33B-parameter model initialized from deepseek-coder-33b-base and fine-tuned on 2B tokens of instruction data.

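For example, the model can be queried through Fireworks' OpenAI-compatible chat completions API. The sketch below assumes the model identifier accounts/fireworks/models/deepseek-coder-33b-instruct and the openai Python client pointed at the Fireworks base URL; check the model page for the exact identifier before use.

```python
# Minimal sketch: querying DeepSeek Coder 33B Instruct through Fireworks'
# OpenAI-compatible chat completions API. The base URL and model identifier
# below follow Fireworks' usual naming but should be verified against the
# model page.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",
    api_key=os.environ["FIREWORKS_API_KEY"],  # your Fireworks API key
)

response = client.chat.completions.create(
    model="accounts/fireworks/models/deepseek-coder-33b-instruct",
    messages=[
        {
            "role": "user",
            "content": "Write a Python function that checks whether a string is a palindrome.",
        }
    ],
    max_tokens=512,
    temperature=0.2,  # low temperature suits deterministic code generation
)

print(response.choices[0].message.content)
```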

Fireworks Features

Fine-tuning

DeepSeek Coder 33B Instruct can be customized with your own data to improve its responses. Fireworks uses low-rank adaptation (LoRA) to train and deploy your personalized model efficiently.

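As a starting point, instruction-tuning data is usually uploaded as JSONL with one conversation per line. The sketch below assumes a messages-style schema per line; confirm the exact fields against the Fireworks fine-tuning documentation.

```python
# Sketch: writing an instruction-tuning dataset as JSONL, one conversation per
# line. The {"messages": [...]} schema is an assumption about the expected
# upload format; verify it against the Fireworks fine-tuning documentation.
import json

examples = [
    {
        "messages": [
            {"role": "user", "content": "Reverse a linked list in Python."},
            {
                "role": "assistant",
                "content": (
                    "def reverse(head):\n"
                    "    prev = None\n"
                    "    while head:\n"
                    "        head.next, prev, head = prev, head, head.next\n"
                    "    return prev"
                ),
            },
        ]
    },
    # ...more prompt/response pairs drawn from your own domain
]

with open("finetune_dataset.jsonl", "w") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```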

On-demand Deployment

On-demand deployments give you dedicated GPUs for DeepSeek Coder 33B Instruct using Fireworks' reliable, high-performance system with no rate limits.

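Calling a dedicated deployment looks much like the serverless example above, except that the model field points at your own deployment. The identifier in the sketch below is a hypothetical placeholder; a real deployment exposes its own identifier on its details page. Streaming is shown because dedicated capacity is often used for latency-sensitive workloads.

```python
# Sketch: streaming completions from a dedicated (on-demand) deployment.
# The model identifier below is a hypothetical placeholder; copy the real
# identifier from your deployment's details page.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",
    api_key=os.environ["FIREWORKS_API_KEY"],
)

stream = client.chat.completions.create(
    model="accounts/your-account/deployedModels/deepseek-coder-33b-instruct-example",  # placeholder
    messages=[
        {"role": "user", "content": "Explain what this regex matches: ^\\d{4}-\\d{2}-\\d{2}$"}
    ],
    stream=True,  # yield tokens as they are generated
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```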

Info & Pricing

Provider: DeepSeek
Model Type: LLM
Context Length: 16,384 tokens
Fine-Tuning: Available
Pricing: $0.90 per 1M tokens
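For a rough sense of serverless cost at this rate (assuming prompt and completion tokens are billed at the same price), a quick calculation:

```python
# Sketch: estimating serverless cost at the listed $0.90 per 1M tokens,
# assuming prompt and completion tokens are billed at the same rate.
PRICE_PER_MILLION_TOKENS = 0.90  # USD

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Estimated USD cost of a single request."""
    return (prompt_tokens + completion_tokens) / 1_000_000 * PRICE_PER_MILLION_TOKENS

# e.g. 1,000 requests of ~1,500 prompt + ~500 completion tokens each:
print(f"${1000 * estimate_cost(1500, 500):.2f}")  # -> $1.80
```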