Elon Musk told investors this month that his startup xAI is planning to build a supercomputer by the fall of 2025 that would power a future, smarter iteration of its Grok chatbot, The Information reports. This supercomputer, which Musk reportedly referred to as a “gigafactory of compute,” would rely on tens of thousands of NVIDIA H100 GPUs and cost billions of dollars to build. Musk has previously said the third version of Grok will require at least 100,000 of the chips — a fivefold increase over the 20,000 GPUs said to be in use for training Grok 2.0.
According to The Information, Musk also told investors in the presentation that the planned GPU cluster would be at least four times the size of anything used today by xAI's competitors. Grok is currently on version 1.5, released in April, which is touted as able to process visual information such as photographs and diagrams in addition to text. Earlier this month, X began rolling out AI-generated news summaries powered by Grok for premium users.