Elon Musk told investors this month that his startup xAI is planning to build a supercomputer by the fall of 2025 that would power a future, smarter iteration of its Grok chatbot, The Information reports. This supercomputer, which Musk reportedly referred to as a “gigafactory of compute,” would rely on tens of thousands of NVIDIA H100 GPUs and cost billions of dollars to build. Musk has previously said the third version of Grok will require at least 100,000 of the chips — a fivefold increase over the 20,000 GPUs said to be in use for training Grok 2.0.
According to The Information, Musk also told investors in the presentation that the planned GPU cluster would be at least four times the size of anything used today by xAI's competitors. Grok is currently on version 1.5, which was released in April and is now touted as being able to process visual information like photographs and diagrams in addition to text. X earlier this month started rolling out AI-generated news summaries powered by Grok for premium users.