Earlier this month, Google announced plans to launch Bard, a chatbot AI designed to help users find information using a natural language model. Microsoft also announced the new Bing search, with its own chatbot that combines Microsoft's own technology with that of OpenAI, the company behind ChatGPT.
However, one aspect of all of this chatbot technology that none of these companies talks about much is how much it costs to operate. A new report from Reuters sheds more light on that side of the new technology. The news agency spoke with Alphabet Chairman John Hennessy, who said that an exchange with a chatbot built on a large language model likely costs as much as 10 times more than a standard search.
we will have to monetize it somehow at some point; the compute costs are eye-watering
— Sam Altman (@sama) December 5, 2022
Back in December 2022, just after the launch of ChatGPT, OpenAI CEO Sam Altman stated on Twitter that "the compute costs are eye-watering." That's part of the reason the company launched ChatGPT Plus in early February, which offers more features and faster performance for $20 a month.
Reuters cited a report from Morgan Stanley, which estimated that each search on Google's service cost the company about a fifth of a cent in 2022. However, it estimates that if a chatbot AI like Bard were put in place, Google's search costs could increase by as much as $6 billion in 2024. That's assuming the chatbot handles half of Google's searches with 50-word answers.
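As a rough, purely illustrative back-of-envelope sketch of how those figures fit together: the roughly 3.3 trillion annual Google searches is a figure cited in the same Reuters report, while the per-answer increment below is simply derived from these assumptions, not a number Morgan Stanley published.

```python
# Back-of-envelope sketch of the Morgan Stanley figures cited above.
# The ~3.3 trillion annual searches and the "half of searches get
# 50-word chatbot answers" assumption come from the Reuters report;
# the derived per-answer increment is purely illustrative.

annual_searches = 3.3e12          # approximate Google searches per year
baseline_cost_per_search = 0.002  # about a fifth of a cent (USD)
extra_annual_cost = 6e9           # estimated cost increase in 2024 (USD)
chatbot_share = 0.5               # half of searches answered by the chatbot

baseline_annual_cost = annual_searches * baseline_cost_per_search
chatbot_queries = annual_searches * chatbot_share
extra_cost_per_answer = extra_annual_cost / chatbot_queries

print(f"Baseline annual search cost: ~${baseline_annual_cost / 1e9:.1f} billion")
print(f"Extra cost per chatbot answer: ~${extra_cost_per_answer:.4f}")
```

Under those assumptions, the increase works out to roughly an extra third of a cent per chatbot-assisted search, on top of the baseline fifth of a cent.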
Of course, the reason for these higher costs is that it takes much more computing power to generate all of those natural-sounding answers from a chatbot AI. That also means higher electricity costs to keep those servers running.
So what can be done to cut costs? One likely development is that these chatbots will simply become more efficient and thus cheaper to operate, although it's unclear how far that alone will bring prices down. Another option is for Google and Microsoft to show ad links within chatbot answers, generating revenue to help offset the cost of running those servers.
One other solution is something OpenAI is already doing with ChatGPT: offering a subscription service where users pay for faster and better chatbot answers. That option is likely to be a last resort for Google and Microsoft. As more and more users turn to these new AI-driven search services, both companies will have to figure out whether the costs will come down enough to make them economically viable.