Yi-Large is among the top LLMs, with performance on the LMSYS benchmark leaderboard closely trailing GPT-4, Gemini 1.5 Pro, and Claude 3 Opus. It excels at multilingual tasks, especially in Spanish, Chinese, Japanese, German, and French. Yi-Large is also easy to integrate, since it shares the same API definition as OpenAI.
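Because the API follows OpenAI's definition, the standard `openai` Python client can be pointed at Fireworks' inference endpoint. The sketch below assumes the model identifier `accounts/fireworks/models/yi-large` and a `FIREWORKS_API_KEY` environment variable; check the Fireworks model page for the exact values.

```python
# Minimal sketch: calling Yi-Large through Fireworks' OpenAI-compatible API.
# The model identifier and environment variable name are assumptions; consult
# the Fireworks model page for the exact values.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",  # Fireworks OpenAI-compatible endpoint
    api_key=os.environ["FIREWORKS_API_KEY"],
)

response = client.chat.completions.create(
    model="accounts/fireworks/models/yi-large",  # assumed model ID
    messages=[
        {"role": "user", "content": "Summarize this sentence in French: The weather is lovely today."}
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```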
| Feature | Description |
| --- | --- |
| Fine-tuning (Docs) | Yi-Large can be customized with your data to improve responses. Fireworks uses LoRA to efficiently train and deploy your personalized model (see the dataset sketch below). |
| On-demand Deployment (Docs) | On-demand deployments give you dedicated GPUs for Yi-Large using Fireworks' reliable, high-performance system with no rate limits. |
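As a starting point for LoRA fine-tuning, training data is typically supplied as a JSONL file of chat-formatted examples. The exact dataset schema Fireworks expects is an assumption here; the fine-tuning docs are the authoritative reference.

```python
# Minimal sketch: writing a chat-style JSONL training file for LoRA fine-tuning.
# The "messages" record format is an assumption; check the Fireworks
# fine-tuning docs for the exact schema.
import json

examples = [
    {
        "messages": [
            {"role": "user", "content": "Translate to German: good morning"},
            {"role": "assistant", "content": "Guten Morgen"},
        ]
    },
    {
        "messages": [
            {"role": "user", "content": "Translate to Japanese: thank you"},
            {"role": "assistant", "content": "ありがとうございます"},
        ]
    },
]

with open("train.jsonl", "w", encoding="utf-8") as f:
    for example in examples:
        f.write(json.dumps(example, ensure_ascii=False) + "\n")
```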
Yi-Large is a 70B-parameter dense language model developed by 01.AI. It ranks among the top-performing models on the LMSYS leaderboard, closely trailing GPT-4, Claude 3 Opus, and Gemini 1.5 Pro.
Yi-Large is well-suited for multilingual applications, particularly in Spanish, Chinese, Japanese, German, and French.
Yi-Large supports a maximum context length of 32,800 tokens on Fireworks AI, as defined by the platform configuration.
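One practical consequence is that prompt and completion tokens share the 32,800-token budget. The sketch below uses a crude character-based estimate rather than the model's actual tokenizer, so it is only a rough guide.

```python
# Minimal sketch: budgeting prompt + completion tokens within Yi-Large's
# 32,800-token window on Fireworks. The ~4 characters-per-token heuristic is
# an approximation, not the model's tokenizer.
CONTEXT_WINDOW = 32_800

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)  # crude heuristic

def max_completion_tokens(prompt: str, reserve: int = 0) -> int:
    budget = CONTEXT_WINDOW - estimate_tokens(prompt) - reserve
    if budget <= 0:
        raise ValueError("Prompt alone exceeds the context window")
    return budget

print(max_completion_tokens("Summarize the following report: ..."))
```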
Yi-Large is a dense model with 70 billion parameters.
Fireworks supports LoRA-based fine-tuning for this model.