Elon Musk has told investors that xAI, one of his many companies, wants to build a supercomputer to power the next version of Grok, a generative AI chatbot available for a fee on X. The news was first reported by The Information, which saw an investor presentation detailing the plans.
According to the report, xAI could work with Oracle to develop the supercomputer and have it ready by the end of 2025. It would reportedly use a cluster of Nvidia's H100 GPUs four times the size of the largest GPU clusters in operation today.
The billionaire said earlier this year that training Grok 2 required 20,000 Nvidia H100 GPUs and that future versions of Grok will need more than 100,000 of them. By building this supercomputer, which Musk has dubbed a "gigafactory of compute" in a nod to Tesla's Gigafactories, xAI could develop those more advanced language models.
With OpenAI's ChatGPT dominating the AI landscape, there's a good chance you haven't tried Grok or heard much about it. It's available on x.com but requires a paid subscription. One major advantage Grok has over other models is direct access to all of the posts on X, which Musk frequently touts as a replacement for traditional news outlets.
Musk was involved with OpenAI until disagreements with its CEO, Sam Altman, over the direction of the company led him to depart. Unlike OpenAI, which has kept the details of ChatGPT close to its chest, xAI recently open-sourced the weights and architecture of Grok-1.
If xAI goes ahead with these plans, Nvidia will be a major beneficiary thanks to the increased demand for its H100 GPUs. The move could also benefit the field as a whole by pushing companies like OpenAI and Google to keep innovating on their models to stay ahead. All that extra compute will need to be powered somehow, though, so hopefully Musk can find a way to rely on renewable energy and keep water usage to a minimum.
Source: The Information via Reuters