StarCoder2-3B is a 3B-parameter model trained on 17 programming languages from The Stack v2, with opt-out requests excluded. The model uses Grouped Query Attention and a 16,384-token context window with a 4,096-token sliding attention window, and was trained on 3+ trillion tokens using the Fill-in-the-Middle objective.
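Because the model was trained with the Fill-in-the-Middle objective, it can infill code between a given prefix and suffix rather than only continue text left to right. Below is a minimal sketch of FIM prompting, assuming the public Hugging Face checkpoint `bigcode/starcoder2-3b` and its documented `<fim_prefix>`/`<fim_suffix>`/`<fim_middle>` special tokens; the function being infilled is just an illustration.

```python
# Minimal sketch: Fill-in-the-Middle completion with StarCoder2-3B via
# Hugging Face transformers. Assumes the public bigcode/starcoder2-3b
# checkpoint and its <fim_prefix>/<fim_suffix>/<fim_middle> tokens.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/starcoder2-3b"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# FIM prompt: the model generates the code that belongs between the
# prefix and the suffix (here, the missing base case of the function).
prompt = (
    "<fim_prefix>def fibonacci(n: int) -> int:\n"
    "<fim_suffix>\n    return fibonacci(n - 1) + fibonacci(n - 2)\n"
    "<fim_middle>"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs, max_new_tokens=32, pad_token_id=tokenizer.eos_token_id
)
# Strip the prompt tokens so only the infilled middle is printed.
print(
    tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
)
```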
On-demand deployments give you dedicated GPUs for StarCoder2-3B using Fireworks' reliable, high-performance serving infrastructure with no rate limits.
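A deployment can be queried through Fireworks' OpenAI-compatible completions endpoint. The sketch below is illustrative: the model id shown is an assumption, and an on-demand deployment would use the deployment name from your own account.

```python
# Minimal sketch: querying a StarCoder2 3B deployment through Fireworks'
# OpenAI-compatible completions API. The model id is an assumption; an
# on-demand deployment uses the name assigned in your account.
import os

import requests

response = requests.post(
    "https://api.fireworks.ai/inference/v1/completions",
    headers={
        "Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}",
        "Content-Type": "application/json",
    },
    json={
        "model": "accounts/fireworks/models/starcoder2-3b",  # assumed id
        "prompt": "def quicksort(arr):",
        "max_tokens": 64,
        "temperature": 0.2,
    },
    timeout=30,
)
response.raise_for_status()
# Completions responses return the generated text in choices[0]["text"].
print(response.json()["choices"][0]["text"])
```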
Creator: BigCode
Context length: 16,384 tokens
Pricing: $0.1