The author explores using Large Language Models (LLMs) to generate data pipelines, specifically connectors for REST APIs. The core difficulty is that LLMs struggle to extract key parameters such as pagination strategy, authentication, and primary keys from API documentation, where this information is often missing or not easily inferable. To work around this, the author takes an existing Airbyte YAML source as a starting point and feeds it into a Cursor project configured with custom LLM prompts and documentation context; the LLM then converts the YAML into a Python pipeline built on dlt's REST API source. On a simple example API, the generated pipeline works, including incremental loading and authentication. Key learnings from the experiment: prepare documentation in an LLM-friendly format, invest in custom prompts and instructions for better outcomes, and accept that some information may still require human intervention.
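To make the target of the conversion concrete, the sketch below shows the general shape of a declarative configuration for dlt's REST API source, i.e. the kind of artifact the LLM is asked to produce from the Airbyte YAML. All specifics here (the base URL, the `issues` resource, the `updated_at` cursor field, the token placeholder) are hypothetical illustrations, not taken from the author's experiment.

```python
# Illustrative sketch of a dlt REST API source config, assuming a hypothetical
# "issues" endpoint. The three hard-to-infer pieces the text mentions are all
# explicit here: paginator, auth, and primary_key.
pipeline_config = {
    "client": {
        "base_url": "https://api.example.com/v1/",  # hypothetical API
        "auth": {"type": "bearer", "token": "<API_TOKEN>"},  # authentication
        "paginator": {"type": "offset", "limit": 100, "offset": 0},  # pagination
    },
    "resources": [
        {
            "name": "issues",
            "primary_key": "id",  # often not stated in docs; may need a human
            "endpoint": {
                "path": "issues",
                "params": {
                    # Incremental loading: only fetch records changed
                    # since the cursor value from the previous run.
                    "updated_since": {
                        "type": "incremental",
                        "cursor_path": "updated_at",
                        "initial_value": "2024-01-01T00:00:00Z",
                    },
                },
            },
        }
    ],
}

# With dlt installed, a dict of this shape would be passed to
# dlt's rest_api_source(...) and run inside a dlt pipeline.
```

The value of this declarative form is that each parameter the LLM struggles to infer (paginator type, auth scheme, primary key, incremental cursor) is a single, reviewable field, so a human can spot-check or correct the generated config without reading procedural code.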