ERNIE-4.5-21B-A3B is a post-trained text Mixture-of-Experts (MoE) model with 21B total parameters, of which 3B are activated for each token.
Fine-tuning: ERNIE-4.5-21B-A3B-PT can be customized with your data to improve responses. Fireworks uses LoRA to efficiently train and deploy your personalized model.
On-demand Deployment: On-demand deployments give you dedicated GPUs for ERNIE-4.5-21B-A3B-PT using Fireworks' reliable, high-performance system with no rate limits.
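Once deployed, the model is reachable through Fireworks' OpenAI-compatible chat-completions endpoint. The sketch below builds such a request using only the Python standard library; the model slug `accounts/fireworks/models/ernie-4p5-21b-a3b` is an assumption for illustration, so check the exact model id for your account or deployment before sending.

```python
import json
import urllib.request

# Assumed model slug for illustration -- confirm the exact id in your Fireworks dashboard.
MODEL = "accounts/fireworks/models/ernie-4p5-21b-a3b"
API_URL = "https://api.fireworks.ai/inference/v1/chat/completions"

def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-compatible chat-completions request."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("Explain MoE routing in one sentence.", api_key="YOUR_KEY")
# To send: resp = urllib.request.urlopen(req)
#          json.load(resp)["choices"][0]["message"]["content"]
print(req.full_url)
```

Because the endpoint is OpenAI-compatible, the same call also works through the `openai` client library by pointing its `base_url` at `https://api.fireworks.ai/inference/v1`.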