We love a good look inside a supercomputer; one of our recent favorites was the glimpse Nvidia gave us of Eos, the ninth-fastest supercomputer on the planet.
Now, Elon Musk has provided a peek at the massive AI supercluster, newly dubbed Cortex, being built by Tesla.
The supercluster, currently under construction at Tesla’s Giga Texas plant, is set to house 70,000 AI servers, with an initial power and cooling requirement of 130 megawatts, scaling up to 500 megawatts by 2026.
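For a rough sense of scale, a back-of-the-envelope calculation using only the figures above, and assuming the power budget were split evenly across the 70,000 servers (ignoring cooling, networking, and other overhead), works out to roughly 2 kW per server initially and about 7 kW at the 2026 target:

```python
# Illustrative arithmetic only, based on the figures reported in the article.
# The even per-server split is a simplifying assumption, not a Tesla spec.
servers = 70_000               # planned AI servers at Cortex
initial_power_mw = 130         # initial power and cooling requirement (MW)
target_power_mw = 500          # planned requirement by 2026 (MW)

kw_per_server_initial = initial_power_mw * 1_000 / servers
kw_per_server_target = target_power_mw * 1_000 / servers

print(f"~{kw_per_server_initial:.1f} kW per server at 130 MW")  # ~1.9 kW
print(f"~{kw_per_server_target:.1f} kW per server at 500 MW")   # ~7.1 kW
```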
Tesla’s AI strategy
In the video, embedded below, Musk shows rows upon rows of server racks, potentially holding up to 2,000 GPU servers – just a fraction of the 50,000 Nvidia H100 GPUs and 20,000 Tesla hardware units expected to eventually populate Cortex. The video, although brief, offers a rare inside look at the infrastructure that will soon drive Tesla’s most ambitious AI projects.
"Video of the inside of Cortex today, the giant new AI training supercluster being built at Tesla HQ in Austin to solve real-world AI" pic.twitter.com/DwJVUWUrb5 (August 26, 2024)
Cortex is being developed to advance Tesla’s AI capabilities, particularly for training the Full Self-Driving (FSD) system used in its cars and the Optimus robot, an autonomous humanoid set for limited production in 2025. The supercluster’s cooling system, featuring massive fans and Supermicro-supplied liquid cooling, is designed to handle power demands that, as Tom’s Hardware points out, are comparable to the output of a large coal power plant.
Cortex is part of Musk’s broader strategy to deploy several supercomputers, including the operational Memphis Supercluster, which is powered by 100,000 Nvidia H100 GPUs, and the upcoming $500 million Dojo supercomputer in Buffalo, New York.
Despite some delays in upgrading to Nvidia’s latest Blackwell GPUs, Musk’s aggressive acquisition of AI hardware shows how keen Tesla is to be at the forefront of AI development.
The divisive billionaire said earlier this year that the company was planning to spend “over a billion dollars” on Nvidia and AMD hardware in 2024 alone just to stay competitive in the AI space.