StarCoder2 15B

StarCoder2-15B is a 15-billion-parameter model trained on 4+ trillion tokens spanning 600+ programming languages from The Stack v2, with opt-out requests excluded. It uses Grouped Query Attention, a 16,384-token context window with 4,096-token sliding window attention, and was trained with the Fill-in-the-Middle objective.
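
Because the model was trained with Fill-in-the-Middle, it can complete code between a given prefix and suffix rather than only continuing left-to-right. Below is a minimal sketch using Hugging Face transformers and the FIM sentinel tokens published with the StarCoder2 tokenizer; the checkpoint name and generation settings are illustrative.

```python
# Minimal Fill-in-the-Middle sketch for StarCoder2-15B via Hugging Face
# transformers. Assumes the public bigcode/starcoder2-15b checkpoint and its
# <fim_prefix>/<fim_suffix>/<fim_middle> sentinel tokens; adjust device and
# dtype for your hardware.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/starcoder2-15b"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto")

# FIM prompt: the model generates the code that belongs between
# the prefix and the suffix.
prefix = "def fibonacci(n):\n    "
suffix = "\n    return result\n"
prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(inputs.input_ids, max_new_tokens=64)

# Keep only the newly generated middle, dropping the prompt tokens.
middle = tokenizer.decode(
    outputs[0][inputs.input_ids.shape[1]:], skip_special_tokens=True
)
print(prefix + middle + suffix)
```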

Fireworks Features

On-demand Deployment

On-demand deployments give you dedicated GPUs for StarCoder2 15B on Fireworks' reliable, high-performance serving infrastructure, with no rate limits.
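
As a sketch of what querying the model looks like, Fireworks serves an OpenAI-compatible REST API, so a completion request can be sent with a plain HTTP call. The model identifier accounts/fireworks/models/starcoder2-15b and the parameter values below are assumptions; confirm the exact ID and endpoint for your deployment in the Fireworks console.

```python
# Hedged sketch of a completion request against Fireworks' OpenAI-compatible
# API. The model ID below is an assumption; verify it in your console.
import os

import requests

API_KEY = os.environ["FIREWORKS_API_KEY"]  # your Fireworks API key

response = requests.post(
    "https://api.fireworks.ai/inference/v1/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "accounts/fireworks/models/starcoder2-15b",  # assumed ID
        "prompt": "def quicksort(arr):",
        "max_tokens": 128,
        "temperature": 0.2,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["text"])
```

Since StarCoder2 is a base code model rather than a chat model, the completions endpoint with a raw code prompt is the natural fit here.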

Info & Pricing

Provider: BigCode
Model Type: LLM
Context Length: 16,384 tokens
Pricing: $0.20 per 1M tokens