

DeepSeek Coder 33B Instruct

DeepSeek Coder is a series of code language models, each trained from scratch on 2T tokens composed of 87% code and 13% natural language in English and Chinese. deepseek-coder-33b-instruct is a 33B-parameter model initialized from deepseek-coder-33b-base and fine-tuned on 2B tokens of instruction data.


Fireworks Features

On-demand Deployment

On-demand deployments give you dedicated GPUs for DeepSeek Coder 33B Instruct using Fireworks' reliable, high-performance system with no rate limits.
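Once deployed, the model can be queried through Fireworks' OpenAI-compatible chat completions API. The sketch below builds a request body for that endpoint; the URL and the `accounts/fireworks/models/...` model path follow Fireworks' usual naming conventions but are assumptions here, so check them against the current Fireworks documentation:

```python
import json

# Assumed endpoint and model path (Fireworks' OpenAI-compatible API);
# verify both against the current Fireworks docs before use.
API_URL = "https://api.fireworks.ai/inference/v1/chat/completions"
MODEL = "accounts/fireworks/models/deepseek-coder-33b-instruct"


def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build the JSON body for a chat completion request."""
    return {
        "model": MODEL,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }


payload = build_request("Write a Python function that reverses a string.")
print(json.dumps(payload, indent=2))
```

The body would then be POSTed to the endpoint with an `Authorization: Bearer <API key>` header; any OpenAI-compatible client library can be pointed at the same base URL.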


Info

Provider: DeepSeek
Model Type: LLM
Context Length: 16,384 tokens
Pricing: $0.90 per 1M tokens