The future of knowledge graphs has been explored in many directions. Knowledge graphs have roots in diverse fields, including artificial intelligence, the Semantic Web, and graph theory, and have supported a wide range of applications over the decades. Combining structured representation with the learning capabilities of neural networks is part of a broader trend in AI research toward marrying symbolic and subsymbolic approaches.

Structured search excels where data is well organized, while semantic search shines in complex, natural-language environments. A newer approach combines the two by storing both structure and meaning in a knowledge graph and querying it with Large Language Models (LLMs). This enables a more robust feedback mechanism and lets the system understand natural-language queries regardless of the underlying structure.

An experiment on the Movie Graph demonstrates how to implement natural-language Q&A with LLMs and the Neo4j graph database, with promising results. Further development will focus on improving efficiency, relevance, and context generation, and on leveraging metadata and domain knowledge to enhance the system's performance.
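The Q&A approach described above can be sketched as a two-step pipeline: an LLM translates the user's question into Cypher, the query runs against the graph, and the rows are summarized back into prose. The sketch below is a minimal, self-contained illustration of that flow, not the experiment's actual implementation: the LLM call and the Neo4j session are stubbed out (`llm_to_cypher`, `run_query`), and the movie data is a small hypothetical sample; in practice these would be a real chat-completion request and `session.run()` from the `neo4j` driver.

```python
# Sketch of natural-language Q&A over a movie graph (assumed pipeline):
# question -> LLM-generated Cypher -> graph query -> prose answer.
# The LLM and Neo4j calls are stubbed so the example is runnable as-is.

PROMPT_TEMPLATE = """You are a Cypher expert. Graph schema:
(:Person)-[:ACTED_IN]->(:Movie {{title, released}})
Translate the question into a single Cypher query.
Question: {question}
Cypher:"""


def llm_to_cypher(question: str) -> str:
    """Placeholder for an LLM call.

    A real implementation would send PROMPT_TEMPLATE.format(question=...)
    to a chat-completion endpoint and return the generated Cypher.
    """
    if "acted in" in question.lower():
        return ("MATCH (p:Person)-[:ACTED_IN]->(m:Movie {title: $title}) "
                "RETURN p.name AS name")
    raise ValueError("question not covered by this sketch")


def run_query(cypher: str, params: dict) -> list[dict]:
    """Placeholder for neo4j's session.run(); returns canned rows
    from a tiny in-memory stand-in for the Movie Graph."""
    graph = {"The Matrix": ["Keanu Reeves", "Carrie-Anne Moss"]}
    return [{"name": n} for n in graph.get(params["title"], [])]


def answer(question: str, title: str) -> str:
    """Run the full pipeline and phrase the result in natural language."""
    cypher = llm_to_cypher(question)
    rows = run_query(cypher, {"title": title})
    names = ", ".join(r["name"] for r in rows)
    return f"{names} acted in {title}." if names else "No results found."


print(answer("Who acted in this movie?", "The Matrix"))
```

Keeping the translation and execution steps separate, as here, is also what makes the feedback mechanism possible: a failed or empty query result can be fed back into the prompt so the model can retry with a corrected Cypher statement.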