Elon Musk told investors this month that his startup xAI is planning to build a supercomputer by the fall of 2025 that would power a future, smarter iteration of its Grok chatbot, The Information reports. This supercomputer, which Musk reportedly referred to as a “gigafactory of compute,” would rely on tens of thousands of NVIDIA H100 GPUs and cost billions of dollars to build. Musk has previously said the third version of Grok will require at least 100,000 of the chips — a fivefold increase over the 20,000 GPUs said to be in use for training Grok 2.0.
According to The Information, Musk also told investors in the presentation that the planned GPU cluster would be at least four times the size of anything used today by xAI's competitors. Grok is currently at version 1.5, released in April, which is touted as able to process visual information such as photographs and diagrams in addition to text. Earlier this month, X started rolling out AI-generated news summaries powered by Grok for premium users.