Microsoft spends a fortune on ChatGPT supercomputer
Microsoft links together thousands of Nvidia graphics processing units in its Azure cloud computing platform.
Microsoft spent hundreds of millions of dollars to build a massive supercomputer used to help power OpenAI's ChatGPT chatbot, according to a report from Bloomberg.
Microsoft explained in a pair of blog posts published on Monday how it created the powerful artificial intelligence infrastructure in Azure that OpenAI uses.
To build the supercomputer that powers OpenAI's projects, Microsoft linked together thousands of Nvidia graphics processing units in its Azure cloud computing platform. This unlocked the AI capabilities of tools such as ChatGPT and Bing.
Microsoft's vice president of AI and cloud, Scott Guthrie, said the company spent hundreds of millions of dollars on the project, a sign of its willingness to pour even more money into the AI space.
Microsoft is already working to make Azure's AI capabilities more powerful. The company said this will allow OpenAI and other companies that rely on Azure to train larger and more complex AI models.
“We saw that we would need to build special-purpose clusters focusing on enabling large training workloads and OpenAI was one of the early proof points for that,” Eric Boyd, Microsoft’s corporate vice president of Azure AI, said in a statement.
“We worked closely with them to learn what are the key things they were looking for as they built out their training environments and what were the key things they needed," he added.