Elon Musk told investors this month that his startup xAI is planning to build a supercomputer by the fall of 2025 that would power a future, smarter iteration of its Grok chatbot, The Information reports. This supercomputer, which Musk reportedly referred to as a “gigafactory of compute,” would rely on tens of thousands of NVIDIA H100 GPUs and cost billions of dollars to build. Musk has previously said the third version of Grok will require at least 100,000 of the chips — a fivefold increase over the 20,000 GPUs said to be in use for training Grok 2.0.
According to The Information, Musk also told investors in the presentation that the planned GPU cluster would be at least four times the size of anything used today by xAI's competitors. Grok is currently on version 1.5, released in April, which is touted as able to process visual information such as photographs and diagrams in addition to text. X began rolling out AI-generated news summaries powered by Grok for premium users earlier this month.