We now support self-hosted deployments of Context.ai, so you can run the product entirely within your own cloud environment and no data ever leaves your control. This has been one of the most frequent requests from privacy- and security-conscious enterprises.

We've also made transcript categorization more accurate and flexible: you can now run any LLM request over your messages to assign them to categories.

Finally, we've introduced projects, which let larger companies separate multiple LLM products within a single team or organization, keeping each product's data isolated.