We combined Replit's Anthropic Computer Use template with the EVI API, capturing the user's voice in real time and passing the transcribed instructions into Claude's agentic computer-control loop. We then routed Claude's responses back through EVI so the assistant can narrate what it is doing, giving users real-time spoken updates and natural voice interaction. This integration lets Large Language Models (LLMs) control devices through voice, offering a glimpse of the future of AI interfaces and agents.
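At a high level, the glue code is a loop: EVI streams the user's speech, the transcript becomes an instruction for Claude's computer-use agent, and Claude's text output is handed back to EVI to be spoken aloud. The sketch below illustrates that loop in Python; the `evi_get_transcript` and `evi_speak` helpers are hypothetical stand-ins for the EVI websocket handling (stubbed here with console I/O), and the computer-use tool call follows Anthropic's beta API as used by the template. Tool execution (screenshots, clicks, keystrokes) is omitted for brevity.

```python
# Sketch of the voice -> agent -> voice loop.
# Assumes ANTHROPIC_API_KEY is set and the `anthropic` package is installed.
from anthropic import Anthropic

client = Anthropic()

COMPUTER_TOOL = {
    "type": "computer_20241022",
    "name": "computer",
    "display_width_px": 1024,
    "display_height_px": 768,
}


def evi_get_transcript() -> str:
    """Hypothetical stand-in: in the real integration this reads the next
    user utterance transcript from Hume's EVI websocket stream."""
    return input("You (voice transcript): ")


def evi_speak(text: str) -> None:
    """Hypothetical stand-in: in the real integration this sends text back
    over the EVI session so it is spoken to the user."""
    print(f"EVI says: {text}")


def run_agent_turn(instruction: str) -> str:
    """Send one spoken instruction into Claude's computer-use loop and
    return Claude's text narration."""
    response = client.beta.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        tools=[COMPUTER_TOOL],
        messages=[{"role": "user", "content": instruction}],
        betas=["computer-use-2024-10-22"],
    )
    # Collect Claude's narration; the full template also executes tool_use
    # blocks and loops with tool results until the task is complete.
    return " ".join(
        block.text for block in response.content if block.type == "text"
    )


def main() -> None:
    while True:
        instruction = evi_get_transcript()
        if not instruction:
            continue
        summary = run_agent_turn(instruction)
        evi_speak(summary)


if __name__ == "__main__":
    main()
```

In the actual integration, the two stubbed helpers are replaced by EVI's streaming session, so transcription and speech synthesis happen continuously rather than turn by turn at a console prompt.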