This article discusses how to manage and monitor applications built on open-source Large Language Models (LLMs) using UbiOps and Arize. It highlights the benefits of open-source LLMs such as Llama 3, Mistral, and Falcon, which can be customized more easily than closed-source models like GPT-4. The article provides a step-by-step guide to deploying an open-source LLM (llama-3-8b-instruct) to the cloud with UbiOps and logging prompt and response embeddings, together with some metadata, to Arize for monitoring. It also explains how to set up a connection with the Arize API client, compute the embeddings with a HuggingFace embedding model, and log them to Arize. The article concludes by demonstrating how to inspect the results in Arize's platform.
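The logging step described above can be sketched as follows. This is a minimal, self-contained illustration of the shape of a monitoring record (prompt, response, their embeddings, and metadata); the `toy_embedding` helper is a hypothetical stand-in for a real HuggingFace embedding model, and `build_arize_record` is an assumed helper name, not part of the Arize SDK. A real pipeline would compute embeddings with a model such as one from sentence-transformers and send records through the Arize Python client.

```python
import hashlib
import math
from datetime import datetime, timezone


def toy_embedding(text: str, dim: int = 8) -> list[float]:
    """Stand-in for a HuggingFace embedding model: maps text to a
    deterministic unit-length vector derived from a hash. Illustrative
    only -- real embeddings would come from a trained model."""
    digest = hashlib.sha256(text.encode("utf-8")).digest()
    vec = [digest[i % len(digest)] / 255.0 for i in range(dim)]
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def build_arize_record(prompt: str, response: str, model_id: str) -> dict:
    """Assemble one monitoring record: prompt/response text, their
    embeddings, and metadata such as the model id and a timestamp."""
    return {
        "model_id": model_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,
        "prompt_embedding": toy_embedding(prompt),
        "response": response,
        "response_embedding": toy_embedding(response),
    }


record = build_arize_record(
    prompt="What is UbiOps?",
    response="UbiOps is a platform for serving models in the cloud.",
    model_id="llama-3-8b-instruct",
)
print(len(record["prompt_embedding"]))  # 8
```

In the actual workflow, each such record would be sent to Arize via its API client after every inference call, so that prompt and response embeddings can be explored and monitored for drift in Arize's platform.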