Customer Stories

Cursor

Cursor builds lightning-fast code edits with Fireworks

Cursor’s Fast Apply feature lets developers instantly accept high-quality code suggestions with a single click. Powered by Fireworks’ speculative decoding, it delivers faster, more accurate edits—outperforming GPT-4 in both speed and usability.

Upwork

Upwork delivers faster, smarter proposals for freelancers

Upwork, the world’s largest freelance marketplace, built Uma to help freelancers craft better proposals faster. With Fireworks, Uma delivers real-time, personalized proposal generation—tailored to each freelancer’s skills and the job at hand—boosting match quality, efficiency, and success rates across the platform.

Notion

Notion reduces latency from 2 seconds to 350 milliseconds with Fireworks AI

Notion, the all-in-one workspace platform, partnered with Fireworks AI to fine-tune models, reducing latency from 2 seconds to 350 milliseconds. This improvement enabled Notion to deliver faster, scalable AI features, supporting over 100 million users and aligning with its "vibe working" vision.

Cresta

How Cresta cut the cost of agent guidance by up to 100×

Cresta, the AI platform for contact centers, uses Fireworks to power Knowledge Assist, which gives agents real-time, context-aware guidance by unifying information from multiple sources. With Fireworks’ scalable infrastructure and Multi-LoRA technology, Cresta cut costs by up to 100× versus GPT-4, boosting agent efficiency and customer satisfaction at scale.

What our customers are saying

Sourcegraph

"Fireworks has been a fantastic partner in building AI dev tools at Sourcegraph. Their fast, reliable model inference lets us focus on fine-tuning, AI-powered code search, and deep code context, making Cody the best AI coding assistant. They are responsive and ship at an amazing pace."

Beyang Liu | CTO at Sourcegraph
Notion

"By partnering with Fireworks to fine-tune models, we reduced latency from about 2 seconds to 350 milliseconds, significantly improving performance and enabling us to launch AI features at scale. That improvement is a game changer for delivering reliable, enterprise-scale AI."

Sarah Sachs | AI Lead at Notion
Cursor

“Fireworks has been an amazing partner getting our Fast Apply and Copilot++ models running performantly. They exceeded other competitors we reviewed on performance. After testing their quantized model quality for our use cases, we have found minimal degradation. Fireworks helps implement task specific speed ups and new architectures, allowing us to achieve bleeding edge performance!”

Sualeh Asif | CPO at Cursor
Quora

"Fireworks is the best platform out there to serve open source LLMs. We are glad to be partnering up to serve our domain foundation model series Ocean and thanks to its leading infrastructure we are able to serve thousands of LoRA adapters at scale in the most cost effective way."

Spencer Chan | Product Lead at Quora