Smaller LLMs can be effective when trained on high-quality data, but building AGI still demands major infrastructure investment. Recent research on 1-bit LLMs suggests that representing weights at extremely low precision could cut both the cost and the carbon footprint of generative AI. Understanding the attention mechanism remains essential to explaining how these models handle long-context inputs, while state space models such as Mamba offer an alternative to the transformer architecture whose compute scales linearly, rather than quadratically, with sequence length. Meanwhile, Meta is building AI superclusters intended to power its push toward AGI.
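
To make the 1-bit claim concrete, here is a minimal, illustrative sketch in NumPy of ternary weight quantization along the lines described in recent 1-bit / 1.58-bit LLM work: weights are rounded to {-1, 0, +1} with a single per-tensor scale, so the matrix multiplies that dominate inference reduce to additions and subtractions of activations. The function names and the absmean scaling rule here are assumptions for illustration, not any library's reference implementation.

```python
import numpy as np

def ternary_quantize(w: np.ndarray, eps: float = 1e-8):
    """Quantize a weight matrix to {-1, 0, +1} with one scale factor.

    Illustrative absmean-style scheme: divide by the mean absolute
    value, then round and clip each weight to the nearest value in
    {-1, 0, +1}.
    """
    scale = np.abs(w).mean() + eps                      # per-tensor absmean scale
    w_ternary = np.clip(np.round(w / scale), -1, 1).astype(np.int8)
    return w_ternary, scale

def ternary_matmul(x: np.ndarray, w_ternary: np.ndarray, scale: float) -> np.ndarray:
    """Apply a layer whose weights are all -1, 0, or +1.

    Because every weight is ternary, each inner product is just a sum of
    signed activations; no floating-point multiplies are needed for the
    weights, which is where the cost and energy savings would come from.
    """
    return (x @ w_ternary.astype(x.dtype)) * scale

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=(512, 512)).astype(np.float32)  # full-precision weights
    x = rng.normal(size=(4, 512)).astype(np.float32)    # a small batch of activations

    w_q, s = ternary_quantize(w)
    approx = ternary_matmul(x, w_q, s)
    exact = x @ w

    # The quantized layer is a lossy approximation of the original one.
    rel_err = np.linalg.norm(approx - exact) / np.linalg.norm(exact)
    print(f"relative error of ternary layer: {rel_err:.3f}")
```

Run standalone, this prints the relative error between the ternary layer and the original full-precision layer; in practice the low-precision models are trained (or fine-tuned) with quantization in the loop rather than converted after the fact, which is how they keep quality while shrinking memory and energy use.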