The AI proxy is a new feature from Braintrust, an organization that fosters an open ecosystem for the AI industry. The proxy tackles common challenges in LLM development, such as interoperability and API key management, by embracing OpenAI's interface as the lingua franca for LLMs. Behind the scenes it adds caching, logging, and API key management, and it supports popular open source models such as LLaMA 2 and Mistral via Perplexity. With the proxy, users can build robust, low-latency systems that work across a thriving ecosystem of model providers without changing code, promoting interoperability and defaulting to openness. The proxy is available to everyone as a free beta, is designed to expand to more providers and features, and its team welcomes collaboration and feedback from users.
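To illustrate the drop-in idea, here is a minimal sketch of how a client might target such a proxy using the OpenAI wire format. The endpoint URL and model slugs below are assumptions for illustration; consult Braintrust's documentation for the actual values. Because the request shape is OpenAI's, switching providers means changing only the `model` field:

```python
import json
import urllib.request

# Assumed proxy endpoint; check Braintrust's docs for the real URL.
PROXY_URL = "https://api.braintrust.dev/v1/proxy/chat/completions"

def build_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at the proxy.

    The payload is the standard OpenAI chat format, so swapping model
    providers only means changing the `model` string, not the code.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        PROXY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# The same function targets a closed model or an open source one unchanged
# (model names here are hypothetical placeholders):
req = build_request("gpt-4", "Hello", "sk-...")
# req = build_request("mistral-7b-instruct", "Hello", "pplx-...")
# urllib.request.urlopen(req) would send it; omitted here to stay offline.
```

In practice, users of the official OpenAI SDKs would get the same effect by pointing the SDK's base URL at the proxy, which is what makes the "no code changes" claim work.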