TL;DR: Fireworks now supports an OpenAI-compatible Responses API endpoint that lets you connect our library of leading open models to your own tools and data using the open Model Context Protocol (MCP).
Large Language Models are incredibly powerful, but out of the box, they exist in a vacuum. They can't check your inventory, update a customer's order, or query your internal database. To make them truly useful for your business, they need to securely interact with your proprietary APIs, tools, and data sources.
Historically, this required developers to build complex, brittle "glue code." You'd have to orchestrate a multi-step dance: prompt the model, parse its output to see if it wants to use a tool, make the API call yourself, and then feed the result back to the model. This process is slow, error-prone, and a significant engineering bottleneck that stifles rapid product development.
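That multi-step dance can be made concrete. Here is a schematic sketch of the hand-rolled orchestration loop described above; `call_model`, `execute_tool`, and the message shapes are placeholders standing in for your model client and tool layer, not a real SDK:

```python
import json

def run_tool_loop(call_model, tools, execute_tool, user_message, max_steps=5):
    """The glue code you had to write yourself: prompt the model, parse its
    output to see if it wants a tool, make the tool call on its behalf, feed
    the result back, and repeat until the model produces a final answer."""
    messages = [{"role": "user", "content": user_message}]
    for _ in range(max_steps):
        reply = call_model(messages, tools)  # one model round-trip per step
        if reply.get("tool_call") is None:
            return reply["content"]          # no tool requested: final answer
        call = reply["tool_call"]
        # You are responsible for actually executing the tool.
        result = execute_tool(call["name"], call["arguments"])
        messages.append({"role": "assistant", "tool_call": call})
        messages.append({"role": "tool", "name": call["name"],
                         "content": json.dumps(result)})
    raise RuntimeError("model did not finish within max_steps")
```

Every step in this loop is a place for latency, parsing bugs, and retry logic to creep in, which is exactly the engineering burden the new endpoint absorbs.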
We're thrilled to announce a powerful new capability to solve this challenge: Fireworks now supports an OpenAI-compatible Responses API, with first-class support for the Model Context Protocol (MCP).
MCP is an open protocol that standardizes how applications provide context and expose tools to LLMs. Think of it as a universal adapter, creating a seamless and secure bridge between a language model and any external system. Instead of being locked into a proprietary set of tools, you can now connect any model on Fireworks to any tool you build, as long as it speaks MCP.
This new endpoint handles the entire agentic loop—reasoning, tool selection, and execution—server-side, allowing you to build sophisticated applications with a single, elegant API call.
Bringing an open protocol like MCP to our platform means developers on Fireworks can pair any model in our library with any MCP-compatible tool, without writing bespoke integration code for each combination.
Integrating your tools is now remarkably simple: point the model at your MCP server's URL, and it can discover and call the tools that server provides.
Here’s a quick example that gets qwen3-235b-a22b to answer up-to-date questions about reward-kit, our open-source project released just last week and therefore absent from the model’s training data. All you need to do is add the gitmcp server to the request:
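A request along these lines would do it. This is a hedged sketch, not copied from official docs: the endpoint path, the `tools` entry shape (modeled on the OpenAI Responses API's MCP tool type), and the gitmcp server URL are all assumptions you should check against the Fireworks documentation:

```python
import json
import os

# Hypothetical request body for the OpenAI-compatible Responses API.
# Field names follow the OpenAI Responses API convention; verify against
# the Fireworks docs before relying on them.
payload = {
    "model": "accounts/fireworks/models/qwen3-235b-a22b",
    "input": "What is reward-kit and how do I define a reward function with it?",
    "tools": [
        {
            "type": "mcp",              # declare an MCP server as a tool source
            "server_label": "gitmcp",   # label that shows up in tool-call traces
            # Assumed gitmcp URL for the reward-kit repository:
            "server_url": "https://gitmcp.io/fireworks-ai/reward-kit",
        }
    ],
}

def build_request(api_key: str) -> dict:
    """Bundle the URL, headers, and JSON body for a single POST to the
    Responses endpoint. The agentic loop then runs entirely server-side."""
    return {
        "url": "https://api.fireworks.ai/inference/v1/responses",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps(payload),
    }

if __name__ == "__main__":
    req = build_request(os.environ.get("FIREWORKS_API_KEY", "<your-key>"))
    print(req["url"])
```

Note that nothing in this request spells out *when* to call the tool; the model decides that on its own during the single server-side pass.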
Behind the scenes, the model identified the user's intent, discovered the documentation-fetching tool exposed by the MCP server, called it with the correct parameters, and used the result to formulate its final response, all in a single API call.
This unlocks a new class of applications built on open models.
The future of AI is not just about better models; it's about better-connected models. By embracing open standards like MCP, we're giving you the power to integrate the best open-source AI deeply into your own products and workflows.
We can't wait to see what you build. Note that this is a preview feature from Fireworks; we would love to hear your feedback.
Start building today!