The Generative AI Race Has a Dirty Secret

At the beginning of February, first Google, then Microsoft announced major overhauls of their search engines. Both tech giants have invested heavily in building or buying generative AI tools that use large language models to understand and answer complex questions. Now they are trying to integrate them into search, hoping to give users a richer and more accurate experience. Chinese search company Baidu has announced that it will follow suit.

But the excitement about these new tools may be hiding a dirty secret. The race to build high-performance AI-powered search engines is likely to require a dramatic increase in computing power, and with it a significant increase in the amount of energy needed by technology companies and the amount of carbon they emit.

“There are huge resources involved in indexing and searching Internet content, but enabling AI requires a different kind of firepower,” says Alan Woodward, professor of cybersecurity at the University of Surrey in the UK. “It requires processing power, as well as storage and efficient retrieval. Every time we see a step change in online processing, we see a significant increase in the power and cooling resources needed by large data centers. I think this could be such a shift.”

Training large language models (LLMs), such as the OpenAI models underlying ChatGPT, which will power Microsoft’s enhanced Bing search engine, and Google’s equivalent, Bard, means parsing and computing relationships within huge amounts of data, which is why they have tended to be developed by companies with significant resources.

“Training these models requires a huge amount of computing power,” says Carlos Gomez-Rodriguez, a computer scientist at the University of A Coruña in Spain. “Right now, only big tech companies can train them.”

Although neither OpenAI nor Google has disclosed the computational cost of its products, third-party analysis by researchers estimates that training GPT-3, on which ChatGPT is partly based, consumed 1,287 MWh and led to emissions of more than 550 tons of carbon dioxide equivalent, the same amount one person would produce by making 550 round trips between New York and San Francisco.
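
Those two figures can be sanity-checked against each other with a quick back-of-the-envelope calculation. The sketch below is illustrative only; the implied values (grid carbon intensity, per-passenger flight footprint) are derived from the article’s numbers, not independently sourced:

```python
# Back-of-the-envelope check of the GPT-3 training figures quoted above.
# Assumption (implied by the article's numbers, not sourced): roughly
# 1 tCO2e per passenger per New York-San Francisco round trip.

TRAINING_ENERGY_MWH = 1_287        # estimated energy to train GPT-3
TRAINING_EMISSIONS_TCO2E = 550     # estimated emissions, tons of CO2e
ROUND_TRIPS = 550                  # NY-SF round trips quoted as equivalent

# Implied carbon intensity of the electricity used for training
intensity_kg_per_kwh = (TRAINING_EMISSIONS_TCO2E * 1000) / (TRAINING_ENERGY_MWH * 1000)
print(f"Implied grid intensity: {intensity_kg_per_kwh:.2f} kg CO2e/kWh")

# Implied per-passenger footprint of one round trip
tons_per_trip = TRAINING_EMISSIONS_TCO2E / ROUND_TRIPS
print(f"Implied footprint per round trip: {tons_per_trip:.1f} tCO2e")
```

The implied grid intensity of about 0.43 kg CO2e per kWh is in the range of a fossil-heavy electricity mix, which is consistent with the figures being plausible rather than contradictory.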

“It’s not that bad, but then you have to take into account [the fact that] you not only need to train it, but also run it and serve millions of users,” says Gomez-Rodriguez.

There is also a big difference between using ChatGPT, which investment bank UBS estimates has 13 million daily users, as a standalone product and integrating it into Bing, which handles half a billion searches every day.

Martin Bouchard, co-founder of Canadian data center company QScale, believes that, based on his reading of Microsoft’s and Google’s plans for search, adding generative AI to the process will require “at least four or five times the computation for each search.” He points out that ChatGPT’s understanding of the world currently stops at the end of 2021, partly in an effort to keep its computing requirements down.
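
To get a feel for what that multiplier means at Bing’s scale, here is a rough sketch; the per-search baseline cost is a hypothetical unit, not a figure from the article:

```python
# Rough scale of Bouchard's "four or five times the computation" claim,
# applied to the half-billion daily Bing searches cited above.
# BASELINE_COST is a hypothetical placeholder in arbitrary compute units.

DAILY_SEARCHES = 500_000_000   # Bing searches per day (from the article)
BASELINE_COST = 1.0            # hypothetical compute units per classic search

for multiplier in (4, 5):
    extra = DAILY_SEARCHES * BASELINE_COST * (multiplier - 1)
    print(f"{multiplier}x per-search compute -> "
          f"{extra:,.0f} extra compute units per day")
```

Whatever the true unit cost, the takeaway is that the added load scales linearly with search volume, so even a modest per-query multiplier becomes enormous at half a billion queries a day.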

To meet the requirements of search engine users, this must change. “If they are going to retrain the model frequently and add more parameters and stuff, it will be a completely different scale,” he says.
