Company
OpenPipe
Date Published
Author
Kyle Corbitt and Saumya Gandhi
Word count
1301
Language
English
Hacker News points
13

Summary

OpenPipe has developed and released a Mixture of Agents (MoA) model that achieves state-of-the-art results across a range of benchmarks relative to GPT-4, a leading large language model. Designed as a drop-in replacement for GPT-4, the MoA model can be used to generate synthetic training data, fine-tune smaller models, and improve response quality. It has been evaluated on both open-source and private benchmarks, including Arena Hard Auto and AlpacaEval 2.0, and outperformed GPT-4 variants in many cases. The MoA model is also 1/25th the cost of GPT-4 Turbo and 3x faster, making it an attractive option for teams looking to improve their models without breaking the bank. The OpenPipe platform makes the MoA model available through its Chat Completions endpoint and as a relabeling model for fine-tuning smaller models.
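As an illustration of the drop-in usage described above, here is a minimal sketch of calling an OpenPipe-hosted MoA model through an OpenAI-compatible Chat Completions client. The base URL and the model identifier are assumptions for illustration, not confirmed values; consult OpenPipe's documentation for the exact endpoint and model names.

```python
# Minimal sketch: calling an OpenPipe-hosted MoA model via the
# OpenAI-compatible Chat Completions API. The base_url and model
# name below are illustrative assumptions, not confirmed values.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.openpipe.ai/api/v1",  # assumed OpenPipe endpoint
    api_key="YOUR_OPENPIPE_API_KEY",
)

response = client.chat.completions.create(
    model="openpipe:moa-gpt-4-v1",  # hypothetical MoA model identifier
    messages=[
        {"role": "user", "content": "Summarize the benefits of fine-tuning."}
    ],
)

print(response.choices[0].message.content)
```

Because the endpoint mirrors the OpenAI API, the same call pattern could also serve the relabeling workflow the summary mentions: replacing GPT-4 responses in an existing dataset with MoA completions before fine-tuning a smaller model.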