OpenAI’s ChatGPT has been making headlines in recent months, and for good reason. The technology behind the language model is truly impressive, and its potential applications are vast. However, the cost of running ChatGPT is substantial. OpenAI CEO Sam Altman has referred to the costs as “eye-watering.” With estimated computing costs for “inference” (user queries) alone ranging from $700,000 to $1,000,000 per day, it’s no wonder that Altman is cautious about the future of ChatGPT. Clearly, Microsoft’s gung-ho drive to win substantial market share for Bing Search using ChatGPT technology is going to have a massive impact on the company’s annual operating budget.
The creation of ChatGPT’s model involved vast amounts of training data, computational power, and energy. The model was trained on a massive corpus of text, a process that involved updating the model’s parameters in response to the training data and was repeated millions of times to reach the final model. The compute requirements for inference (answering a user query) are much lower, since the model only needs to perform a forward pass over the input to generate a response. However, the high-dimensional nature of the model still demands significant computational power and energy per query, and it is these inference costs that are estimated in the $700K to $1M range, every day.
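To put those daily inference figures in per-query terms, a rough back-of-envelope sketch helps. The daily cost range below comes from the estimates above; the queries-per-day volume is purely an assumption for illustration, not a reported figure:

```python
# Illustrative per-query inference cost, derived from the reported
# $700K-$1M/day estimate. The query volume is a hypothetical assumption.

DAILY_COST_LOW = 700_000       # USD/day, reported lower bound
DAILY_COST_HIGH = 1_000_000    # USD/day, reported upper bound
ASSUMED_QUERIES_PER_DAY = 10_000_000  # hypothetical volume, for illustration only

def cost_per_query(daily_cost: float, queries_per_day: int) -> float:
    """Average compute cost of a single inference (one forward pass)."""
    return daily_cost / queries_per_day

low = cost_per_query(DAILY_COST_LOW, ASSUMED_QUERIES_PER_DAY)
high = cost_per_query(DAILY_COST_HIGH, ASSUMED_QUERIES_PER_DAY)
print(f"Estimated cost per query: ${low:.3f} to ${high:.3f}")
```

Under that assumed volume, each query would cost on the order of a few cents of compute, which is why per-query economics dominate any large-scale search deployment.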
When these costs are put into such stark perspective, we must acknowledge that Microsoft has taken a remarkably bold step in incorporating ChatGPT into its Bing search engine. This integration will make ChatGPT-powered Bing Search available to over a billion people, making it one of the largest real-world deployments of ChatGPT to date. Microsoft CEO Satya Nadella has been a strong advocate for the integration, stating, “We believe that conversational AI will define the next generation of search, and we’re excited to bring GPT-3’s language capabilities to Bing to help people get things done.”
It’s a massive bet for Microsoft, and one that could have far-reaching implications for the future of search. With GPT-3’s advanced natural language processing capabilities, users can now engage with Bing in a more conversational manner, making search results more accurate and relevant. This could lead to a whole new level of user engagement and satisfaction, as well as increased revenue for Microsoft.
It is difficult to estimate the exact cost that Microsoft will incur in running ChatGPT inference on its Azure cloud computing platform. If the “New Bing” is even moderately successful in lifting its search market share, the investment in ChatGPT is likely to be substantial. However, as Satya Nadella has stated, “We are making a long-term bet on the fundamental technology of language models, and we are committed to investing in its development and democratizing access to it.” Thus, Microsoft is likely willing to bear the cost of operating ChatGPT as part of its larger strategy of advancing its artificial intelligence capabilities and providing cutting-edge technology to its users.
However, there are also potential downsides to consider. The cost of running GPT-3 is startlingly high, and it remains to be seen how Microsoft will address this issue. Additionally, there are concerns about the ethical implications of using AI in search, including the potential for biased results and the spread of false information. These are important issues for corporate communications and, potentially, for crisis-management public relations once the New Bing reaches wide general availability.
Despite these concerns, the integration of OpenAI’s technology into Bing is a significant step forward for the general public’s use of internet search. As Sam Altman states, “GPT-3 is a remarkable achievement in the field of AI, and we’re excited to see how it will be used to solve real-world problems.” Similarly, Satya Nadella adds, “We believe that AI has the power to help people save time, make informed decisions, and get more done, and we’re committed to bringing these benefits to people through Bing.”
The integration of GPT-3 into Bing is a major milestone for the AI industry, and it will be fascinating to see how it impacts the future of search. Microsoft’s bold move could lead to significant advancements in the field, but it also highlights the challenges that come with the implementation of such cutting-edge technology. With costs running high and ethical concerns to address, it’s clear that there is still much work to be done. Nevertheless, the potential benefits of GPT-3 cannot be ignored, and Microsoft’s integration of the technology into Bing deserves close attention.