Microsoft continues to bet heavily on the integration of Bing with ChatGPT. The company has invested hundreds of millions of dollars to build a supercomputer capable of powering the most famous chatbot of the moment. Azure, its powerful cloud services platform, is more capable than ever and ready to drive the projects of both Microsoft and OpenAI.
Thousands of NVIDIA GPUs were used to build Microsoft's supercomputer. This gives OpenAI enough computing power to develop its projects, which has allowed AI models like ChatGPT to become increasingly capable. But third parties are not the only ones who benefit from the Azure infrastructure: Microsoft itself uses it to power Bing and all the technologies behind the increasingly popular search engine.

But all this power is still not enough for Microsoft. The company is already working on launching new virtual machines backed by NVIDIA H100 and A100 Tensor Core GPUs and connected through the NVIDIA Quantum-2 InfiniBand networking platform. With this mix of hardware, Azure could deploy and scale AI models far larger and more advanced than ChatGPT.
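For readers curious how these GPU-backed machine families surface in Azure itself, here is a minimal sketch using the Azure SDK for Python (azure-identity and azure-mgmt-compute). The region name, the placeholder subscription ID, and the "Standard_ND" prefix used to filter GPU-oriented sizes are illustrative assumptions, not details taken from Microsoft's announcement.

```python
# Minimal sketch: list GPU-oriented (ND-series) VM sizes available in a region.
# Assumptions: "eastus" as the region and "Standard_ND" as a naming heuristic
# for GPU-backed sizes; replace the subscription ID with your own.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
client = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

# Query the sizes offered in the region and keep only ND-series entries.
for size in client.virtual_machine_sizes.list(location="eastus"):
    if size.name.startswith("Standard_ND"):
        print(size.name, size.number_of_cores, size.memory_in_mb)
```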
According to Scott Guthrie, Microsoft's executive vice president of Cloud and AI, all of this has cost the company hundreds of millions of dollars. Although it sounds like an exorbitant figure, it is not surprising after the multi-million dollar investments the company has made in OpenAI. It simply shows how clearly the company sees AI as its future.
Microsoft sees its future in the integration of Bing with ChatGPT and artificial intelligence

“About five years ago, OpenAI pitched to Microsoft the bold idea that it could create AI systems that would forever change the way people interact with computers,” the company writes on its official blog. It describes how the company co-founded by Elon Musk set Microsoft a task: build a supercomputer capable of serving as the foundation for the future of artificial intelligence.
“We saw that we would have to build special clusters focused on enabling large workloads, and OpenAI was one of the first test points for that. We worked closely with them to learn what key things they were looking for as they built their [AI] learning environments.”
Eric Boyd, corporate vice president of Azure AI at Microsoft
Apparently, Microsoft has lived up to the task. Closing the post, Eric Boyd adds that the Redmond company continues to “innovate in the design and optimization of infrastructures specially designed for AI.” This certainly includes collaboration with IT hardware vendors and data center equipment manufacturers. All of this will allow the company to turn Azure into a beacon of cloud computing.
If ChatGPT and Bing become the focus of Microsoft's long-term plans, the company will need a platform that can support their exponential growth. Moreover, the first news about GPT-4 has already arrived, and if it turns out to be as powerful as it seems, the Azure infrastructure will have to be ready.