A brief look at Elon Musk’s Cortex AI supercluster project reveals a heavy reliance on NVIDIA GPUs and a steep demand for cooling water and power
Cortex AI is packed with 70,000 AI servers and will require up to 130 megawatts (MW) of cooling and power to launch.
What you need to know
Tech enthusiast Elon Musk recently shared the progress of Tesla’s Cortex AI supercluster on X. The project is housed at Tesla’s headquarters in Austin, Texas. It’s packed with 70,000 AI servers and will require up to 130 megawatts (MW) of cooling and power to launch, with projections of 500 MW by 2026.
Video of the inside of Cortex today, the giant new AI training supercluster being built at Tesla HQ in Austin to solve real-world AI pic.twitter.com/DwJVUWUrb5 — August 26, 2024
Cortex AI will help train and improve Tesla’s AI models, driving their growth. For context, the company leverages AI for autonomous driving, energy management, and more.
Tesla’s supercluster is arguably the largest training cluster of its kind, packed with 50,000 NVIDIA H100 enterprise GPUs and an additional 20,000 units of Tesla’s own AI hardware. However, Musk had previously indicated that Cortex AI would ship with 50,000 units of Tesla’s Dojo AI hardware.
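For a rough sense of where the 130 MW figure could come from, here is a minimal back-of-envelope sketch in Python. It assumes roughly 700 W per H100 SXM GPU (NVIDIA’s published TDP), treats Tesla’s own 20,000 units as drawing about the same, and uses assumed multipliers for server overhead and cooling; none of the overhead figures come from Tesla or the article.

```python
# Rough back-of-envelope check of the quoted ~130 MW figure for Cortex.
# Assumptions (not from the article): ~700 W per H100 SXM GPU, a 2x
# multiplier for the rest of each server (CPU, RAM, networking), and a
# datacenter PUE of ~1.3 to cover cooling and power-delivery losses.

H100_COUNT = 50_000      # NVIDIA H100 GPUs reported in the cluster
TESLA_UNITS = 20_000     # Tesla's own AI hardware (power draw unknown; treated like an H100 here)
GPU_TDP_W = 700          # approximate H100 SXM TDP
SERVER_OVERHEAD = 2.0    # assumed multiplier for non-GPU server components
PUE = 1.3                # assumed power usage effectiveness (cooling, conversion losses)

it_power_mw = (H100_COUNT + TESLA_UNITS) * GPU_TDP_W * SERVER_OVERHEAD / 1e6
facility_power_mw = it_power_mw * PUE

print(f"Estimated IT load:       {it_power_mw:.0f} MW")   # ~98 MW
print(f"Estimated facility load: {facility_power_mw:.0f} MW")  # ~127 MW
```

Under those assumptions the estimate lands close to the quoted 130 MW, which suggests the figure describes the full facility load (servers plus cooling) rather than the GPUs alone.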
In the interim, Musk seemingly remains committed to improving Tesla’s custom Dojo supercomputer and aims to push its training capacity to the equivalent of 8,000 H100s by the end of the year.
Per the video shared, it’s evident that there’s still work to be done before the supercluster becomes fully operational. According to Electrek, the cluster is running on a temporary cooling system, and Tesla still requires more network feeders. All factors considered, the cluster could be ready by October, which incidentally aligns with the much-anticipated launch of the Robotaxi.
According to a post shared by Musk on X earlier this year, Tesla will spend up to $10 billion this year “in combined training and inference AI.” Interestingly, leaked emails between Musk and NVIDIA reveal that the billionaire asked the chipmaker to prioritize shipments of processors to X and xAI ahead of Tesla (via CNBC).
The move arguably veered off the goal of making Tesla a key player in the AI landscape: letting X skip the line delayed Tesla’s shipment of over $500 million worth of processors by months.
AI projects are becoming a tad expensive
Tesla’s Cortex AI project echoes the growing concern around generative AI and its exorbitant resource demands. Amid claims that AI is a fad that has already reached its peak, with projections of 30% of AI projects being abandoned by 2025 after proof of concept, investors in the sector have highlighted their frustrations over the high water demand required for cooling (roughly one bottle of water per query).
This is coupled with the high power demand. Projections indicate that despite being on the verge of the biggest technological breakthrough with AI, there won’t be enough electricity to power AI advances by 2025. As it stands, Google and Microsoft’s electricity consumption surpasses the power usage of over 100 countries.
Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya with lots of experience covering the latest trends and developments in the industry at Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. You’ll also catch him occasionally contributing at iMore about Apple and AI. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.