The author of the text is Onyx, an AI assistant company that aims to surface knowledge and insights from enterprise data to improve productivity across job functions. Onyx has built a new Agent Search feature on LangGraph, an open-source framework for building agents. By leveraging Large Language Models (LLMs), the feature acts as a subject-matter expert for teams: users can not only find relevant documents but also get answers to complex questions such as "Is feature X already supported?" or "Where's the pull request for feature Y?".

Agent Search is designed to address high-value questions that traditional RAG-like systems struggle with, such as ambiguous or multi-entity queries. Onyx chose LangGraph as the backbone framework, leveraging its concepts of nodes, edges, and states, as well as its support for streaming and parallelization. The flow breaks a complex question into sub-questions, composes an initial answer from their results, and then produces a refined answer based on the initial answer and the various facts learned along the way.

From this implementation, Onyx drew several lessons, including the importance of careful state management, parallelism, and reusable components. They plan to make the flow substantially more agentic in the near future.
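The decompose-then-refine flow described above can be sketched in plain Python. This is a hypothetical illustration of the pattern (state passed through a sequence of nodes, mirroring LangGraph's node/edge/state model), not Onyx's actual code; all names and the stubbed "answers" are invented for the example.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the Agent Search flow: decompose the question,
# answer sub-questions, compose an initial answer, then refine it.
# Names and logic are illustrative only; a real system would call an LLM
# and a retrieval backend at each step, and could fan the sub-question
# step out in parallel.

@dataclass
class SearchState:
    question: str
    sub_questions: list = field(default_factory=list)
    sub_answers: dict = field(default_factory=dict)
    initial_answer: str = ""
    refined_answer: str = ""

def decompose(state: SearchState) -> SearchState:
    # An LLM would break the complex question into targeted sub-questions.
    state.sub_questions = [
        f"aspect {i} of: {state.question}" for i in range(2)
    ]
    return state

def answer_sub_questions(state: SearchState) -> SearchState:
    # Each sub-question is independent, so this node could run its work
    # in parallel (LangGraph supports this kind of fan-out/fan-in).
    for sq in state.sub_questions:
        state.sub_answers[sq] = f"stub answer to ({sq})"
    return state

def compose_initial(state: SearchState) -> SearchState:
    # Merge the sub-answers into a first-pass answer.
    state.initial_answer = " | ".join(state.sub_answers.values())
    return state

def refine(state: SearchState) -> SearchState:
    # Produce a refined answer from the initial answer plus learned facts.
    state.refined_answer = f"refined: {state.initial_answer}"
    return state

# The "graph" here is just an ordered pipeline of node functions; in
# LangGraph, nodes are wired with explicit edges and a shared state.
NODES = [decompose, answer_sub_questions, compose_initial, refine]

def run(question: str) -> SearchState:
    state = SearchState(question=question)
    for node in NODES:
        state = node(state)
    return state
```

Keeping all intermediate results (sub-questions, sub-answers, the initial answer) in one state object is what makes the final refinement step possible, and is one reason careful state management matters in flows like this.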