StarCoder2 7B

StarCoder2-7B is a 7B-parameter model trained on 17 programming languages from The Stack v2, with opt-out requests excluded. The model uses Grouped Query Attention and a 16,384-token context window with sliding-window attention of 4,096 tokens, and was trained on 3.5+ trillion tokens using the Fill-in-the-Middle objective.
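
Because the model was trained with the Fill-in-the-Middle objective, you can prompt it to complete code between a given prefix and suffix. Below is a minimal sketch against Fireworks' OpenAI-compatible completions endpoint; the `starcoder2-7b` model path and the `<fim_prefix>`/`<fim_suffix>`/`<fim_middle>` sentinel tokens are assumptions based on the StarCoder family convention, so verify them against the current documentation.

```python
from openai import OpenAI

# Minimal sketch: Fill-in-the-Middle completion via Fireworks'
# OpenAI-compatible API. The model path and FIM sentinel tokens are
# assumptions based on the StarCoder family convention.
client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",
    api_key="YOUR_FIREWORKS_API_KEY",  # placeholder
)

prefix = "def median(values):\n    values = sorted(values)\n    "
suffix = "\n    return result\n"

# StarCoder-style FIM prompt: the model generates the span that
# belongs between the prefix and the suffix.
prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

resp = client.completions.create(
    model="accounts/fireworks/models/starcoder2-7b",  # assumed model path
    prompt=prompt,
    max_tokens=64,
    temperature=0.2,
)
print(resp.choices[0].text)
```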

Fireworks Features

On-demand Deployment

On-demand deployments give you dedicated GPUs for StarCoder2 7B on Fireworks' reliable, high-performance serving stack, with no rate limits.
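
As a hedged illustration, querying a dedicated deployment looks much like querying the serverless endpoint, except the request targets the deployment's own account-scoped identifier. The account and deployment identifiers below are hypothetical placeholders.

```python
from openai import OpenAI

# Sketch of calling a dedicated (on-demand) StarCoder2 7B deployment.
# The deployed-model identifier below is a hypothetical placeholder;
# an actual deployment exposes its own account-scoped id.
client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",
    api_key="YOUR_FIREWORKS_API_KEY",  # placeholder
)

resp = client.completions.create(
    model="accounts/your-account/deployedModels/starcoder2-7b-0000",  # hypothetical
    prompt="# Python function that checks whether a string is a palindrome\n",
    max_tokens=96,
)
print(resp.choices[0].text)
```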

Info

Provider: BigCode
Model Type: LLM
Context Length: 16,384 tokens
Pricing Per 1M Tokens: $0.20
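
At the listed rate, a quick back-of-the-envelope estimate is straightforward (a sketch, assuming prompt and completion tokens are both billed at the same $0.20 per 1M rate):

```python
# Rough cost estimate at $0.20 per 1M tokens, assuming prompt and
# completion tokens are billed at the same rate.
PRICE_PER_MILLION_USD = 0.20

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    total_tokens = prompt_tokens + completion_tokens
    return total_tokens / 1_000_000 * PRICE_PER_MILLION_USD

# Example: an 800-token prompt with a 200-token completion costs $0.0002.
print(f"${estimate_cost(800, 200):.4f}")
```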