
Three years ago, before AI had taken over the world, my co-founders and I made a bet. We believed the future of AI wouldn’t be controlled by a handful of powerful foundation model labs, but distributed across thousands of enterprises that want to own and customize their own AI products.
That founding thesis has paid off.
Today, we’re announcing a $250 million Series C at a $4 billion valuation, co-led by Lightspeed Venture Partners, Index Ventures, and Evantic, with continued support from Sequoia Capital. This round, which includes primary and secondary funding, brings our total funding to over $327 million, with prior rounds led by Benchmark and Sequoia, and strategic participation from NVIDIA, AMD, MongoDB and Databricks.
We raised this capital to meet surging enterprise demand for our production AI infrastructure and to cement our position as the market leader in AI inference.
When we left PyTorch to build Fireworks in 2022, companies were just beginning to experiment with large language models. Enterprise AI initiatives often lingered in the pilot phase for months, if not years, tethered to closed APIs from a few foundation model providers.
We believed a different future was possible — one where open-source models would close the gap with their proprietary counterparts and where companies could own their AI, end-to-end. Drawing from our experience scaling AI infrastructure at Meta, we built Fireworks as an AI Cloud for enterprise developer teams — giving them the speed, cost efficiency, and control they need to build production-grade AI.
That vision is now a reality.
Our customers, including Samsung, Uber, DoorDash, Notion, Shopify, and Upwork, are moving AI from pilot projects into the core of their business operations. They’re scaling to millions of users worldwide without compromising quality or cost.
Fireworks now powers over 10,000 companies (a 10× increase from our Series B) and serves hundreds of thousands of developers building customized AI applications. Our platform processes more than 10 trillion tokens per day, and our annualized revenue has surpassed $280 million.
The Fireworks platform offers access to hundreds of state-of-the-art open-source models across text, image, audio, and multimodal formats. We enable fine-tuning, reinforcement learning, and model evaluation — all built on our ultra-fast inference engine, delivering up to 40× faster performance and an 8× reduction in cost compared to other providers.
We attribute this growth to our belief in one-size-fits-one AI, not one-size-fits-all. Generic foundation models solve generic problems, because frontier labs can only train models on publicly available internet data. But the majority of valuable data lives inside enterprises and their applications: user interactions, domain-specific workflows, proprietary knowledge bases, and behavioral patterns.
Our platform helps developers tune their models for specific use cases using this enterprise- and application-specific data, making inference faster, cheaper, and higher quality. As customers interact with these customized applications, new data continuously feeds back into the model. When a user corrects an output, ignores a suggestion, or discovers a new way to solve a problem, that data improves the model, which in turn improves the application. The product and model evolve together in a perpetual loop.
This is product-model co-design: a tight data feedback loop with continuous evaluation and reinforcement learning that allows enterprises to improve their AI applications over time, optimizing for cost, speed and quality simultaneously. This is how enterprises build a competitive moat with AI, and it’s the foundation of artificial autonomous intelligence.
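To make the co-design loop above concrete, here is a purely illustrative Python sketch of the data flywheel it describes: user signals (corrections, acceptances, ignored suggestions) are captured as weighted training examples and periodically batched for the next fine-tuning or reinforcement-learning run. All names (`FeedbackStore`, `build_preference_batch`, the signal weights) are hypothetical and not part of any Fireworks API.

```python
# Hypothetical sketch of a product-model co-design loop: serve an output,
# collect user feedback, and accumulate it into batches for retraining.
from dataclasses import dataclass, field

@dataclass
class FeedbackStore:
    """Collects implicit and explicit user signals as training examples."""
    examples: list = field(default_factory=list)

    def record(self, prompt: str, output: str, signal: str) -> None:
        # Illustrative weighting: corrections are the strongest signal,
        # acceptances weaker, ignored suggestions mildly negative.
        weight = {"corrected": 1.0, "accepted": 0.5, "ignored": -0.2}[signal]
        self.examples.append({"prompt": prompt, "output": output, "weight": weight})

def build_preference_batch(store: FeedbackStore, min_examples: int = 2):
    """Return a batch for fine-tuning/RL once enough feedback has accumulated."""
    if len(store.examples) < min_examples:
        return None
    batch, store.examples = store.examples, []
    return batch

# One turn of the loop: serve -> collect signals -> batch for retraining.
store = FeedbackStore()
store.record("summarize ticket A", "draft summary", "corrected")
store.record("summarize ticket B", "draft summary", "accepted")
batch = build_preference_batch(store)
print(len(batch))  # 2 examples ready for the next fine-tuning run
```

In a real system, the batch would feed continuous evaluation and a reinforcement-learning pipeline rather than a simple list, but the shape of the loop — serve, observe, retrain, redeploy — is the same.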
With this new round, we’re accelerating toward this vision. We’ll invest in three key areas: expanding our global operations, growing our engineering and GTM teams, and deepening our partnerships across the cloud, model, and hardware ecosystems.
AI is the greatest business opportunity in history — but enterprises shouldn’t have to depend on a handful of tech giants to participate in it. They should be able to build, own, and control their AI infrastructure from the ground up. This funding will help us fulfill that mission.
At Fireworks, we believe that behind every magical AI experience, there’s Fireworks AI — where AI runs faster and enterprises stay in control.