Supabase Edge Functions now support Mozilla Llamafile, an inference server that lets you distribute and run language models as a single file. A Llamafile runs locally on most computers with no installation required, and it exposes both a local web UI chat server and an OpenAI API compatible server, which is what the Edge Functions integration builds on. Examples of using Llamafile with functions-js are available on GitHub.

Getting started involves setting up a new Supabase project locally and calling the Llamafile server from an Edge Function, either through functions-js or through the OpenAI Deno SDK; both approaches are sketched below. Deploying a Llamafile together with Supabase Edge Functions is also covered.
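As a rough sketch of the functions-js route, the Edge Runtime's built-in Supabase.ai Session API can be pointed at a locally running Llamafile. The model name `LLaMA_CPP`, the `openaicompatible` mode option, and the `AI_INFERENCE_API_HOST` environment variable here are assumptions drawn from the published examples and may need adjusting to your setup.

```ts
// supabase/functions/llamafile/index.ts
// Assumes a Llamafile server is running locally and the Edge Runtime is
// pointed at it, e.g. AI_INFERENCE_API_HOST=http://host.docker.internal:8080.
import 'jsr:@supabase/functions-js/edge-runtime.d.ts'

// 'LLaMA_CPP' is the model name used in the Llamafile examples; adjust it
// to whatever your Llamafile reports.
const session = new Supabase.ai.Session('LLaMA_CPP')

Deno.serve(async (req: Request) => {
  const { prompt } = await req.json()

  const output = await session.run(
    {
      messages: [
        { role: 'system', content: 'You are a helpful assistant.' },
        { role: 'user', content: prompt },
      ],
    },
    {
      // Assumption: 'openaicompatible' switches the session from the default
      // Ollama mode to Llamafile's OpenAI-compatible endpoint.
      mode: 'openaicompatible',
      stream: false,
    }
  )

  return Response.json(output)
})
```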
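Because Llamafile speaks the OpenAI API, the same function can also be written with the OpenAI Deno SDK by overriding the base URL. This is a minimal sketch: the default port 8080, the placeholder API key, and the model name are assumptions (Llamafile does not validate the key).

```ts
// supabase/functions/llamafile-openai/index.ts
// Calls Llamafile's OpenAI-compatible server through the OpenAI SDK.
import OpenAI from 'npm:openai@4'

const client = new OpenAI({
  // Assumed defaults; override via environment variables where needed.
  baseURL: Deno.env.get('OPENAI_BASE_URL') ?? 'http://host.docker.internal:8080/v1',
  apiKey: Deno.env.get('OPENAI_API_KEY') ?? 'sk-no-key-required',
})

Deno.serve(async (req: Request) => {
  const { prompt } = await req.json()

  const completion = await client.chat.completions.create({
    // Llamafile serves the single model bundled into the file; the name here
    // is largely informational.
    model: 'LLaMA_CPP',
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
      { role: 'user', content: prompt },
    ],
    stream: false,
  })

  return Response.json(completion.choices[0].message)
})
```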