✈️ Recent Travels: I spent the past three weeks in the United States and Japan, engaging in discussions about the AI boom as we approach the second anniversary of ChatGPT.
💻 Hot Chips Conference: The main reason for my trip was the Hot Chips semiconductor design conference at Stanford. I attended in person, focusing more on one-on-one conversations than on the presentations. The key takeaways from those discussions are summarized in the points below.
📈 AI Applications: In my previous video, I questioned the profitability of LLM-based applications. The next two points capture what I learned on that front:
📉 Price War in LLM APIs: The cost of GPT-4 API tokens has dropped from $36 per million tokens to $4, roughly a 90% reduction. This trend is attributed to software improvements rather than to new semiconductors.
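To put that price drop in concrete terms, here is a minimal Python sketch of the arithmetic. The per-million-token prices come from the figures above; the request volume and tokens-per-request are hypothetical assumptions chosen purely for illustration.

```python
# Minimal sketch of the arithmetic behind the API price drop.
# The $36 and $4 per-million-token figures are from the text above;
# the request volume and tokens-per-request below are hypothetical.

OLD_PRICE_PER_M_TOKENS = 36.0   # USD per 1M tokens (earlier GPT-4 pricing)
NEW_PRICE_PER_M_TOKENS = 4.0    # USD per 1M tokens (current pricing)

reduction = 1 - NEW_PRICE_PER_M_TOKENS / OLD_PRICE_PER_M_TOKENS
print(f"Price reduction: {reduction:.0%}")   # ~89%, i.e. roughly a 90% cut

# Hypothetical LLM application: 10,000 requests per day, ~1,500 tokens each.
tokens_per_day = 10_000 * 1_500
old_daily_cost = tokens_per_day / 1_000_000 * OLD_PRICE_PER_M_TOKENS
new_daily_cost = tokens_per_day / 1_000_000 * NEW_PRICE_PER_M_TOKENS
print(f"Daily cost: ${old_daily_cost:,.2f} -> ${new_daily_cost:,.2f}")
```

Under these assumed volumes, the same workload falls from about $540 to about $60 per day, which is the kind of shift that changes the profitability picture for LLM-based applications.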
🔍 Foundation Models: The growth of the LLM ecosystem is hindered by the limitations of current Foundation Models. The industry is eagerly awaiting GPT-5 for advancements.
🏢 Nvidia's Dominance: Nvidia leads the AI chip market, followed by AMD. Whether that dominance holds is the key open question going forward.
📉 Intel's Challenges: Intel recently faced a 30% stock drop after a poor quarter. The company is considering options like capital expenditure cuts and potential splits.
🚗 Waymo's Progress: Waymo has become a common sight in San Francisco and Los Angeles, showcasing the success of autonomous vehicles through extensive mapping and data collection.
🔮 Future Outlook: The focus is shifting towards the next generation of Foundation Models and the scaling of data centers. Anticipation builds for what the next two years will bring in AI advancements.