Microsoft now allows anyone to try its Bing Chat chatbot with no waitlist. Naturally, that means a lot of people are using the service for the first time. However, some Bing Chat users are now saying that responses are getting slower.
In a Twitter exchange with a user who said he wanted to see faster response times, adding, "some times I have to wait so long that it's ridiculous," Microsoft's Mikhail Parakhin, the head of its Advertising and Web Services division, responded with an apology, stating, "the usage keeps growing, we are not adding GPUs fast enough." However, he added that the situation will be fixed.
Sorry about the latency: the usage keeps growing, we are not adding GPUs fast enough. We’ll get it fixed.
— Mikhail Parakhin (@MParakhin) May 8, 2023
This situation is one of the biggest roadblocks for generative AI, as these services need more and more specialized GPUs in data centers. The leading maker of these chips is currently NVIDIA. However, unconfirmed reports claim that AMD and Microsoft are teaming up to develop new AI chips that could reduce some of that dependence on NVIDIA.
Parakhin also took the time this weekend to answer some more Bing Chat questions from other Twitter users. He told one user that the long-awaited Bing Chat history feature is coming in "days". He also told another user that more aspect ratios in Bing Image Creator are being discussed, but they may not arrive "immediately". Finally, he told another user that he hopes Bing Chat will support Code Interpreter at some point, but added, "It needs to be done securely - not a trivial task."