Google blames AI for 48% rise in climate emissions since 2019

When Google announced this week that its climate emissions had risen by 48 percent since 2019, it attributed the increase to artificial intelligence.

US tech firms are expanding their global networks of data centers, claiming that AI is driving this growth.

This has raised concerns about the significant energy consumption of AI technology and its environmental impact.

Each time a user sends a request to a chatbot or generative AI tool, that request is processed in a data center.

Even before a request reaches that stage, developing the models behind these tools, known as large language models (LLMs), requires substantial computing power.

This continuous operation consumes vast amounts of electricity and generates heat, necessitating further energy for cooling.

According to the International Energy Agency (IEA), data centers use roughly 40 percent of their electricity for computing and another 40 percent for cooling.

Since the launch of OpenAI’s ChatGPT bot in late 2022, major tech companies have been integrating AI into their products, raising concerns about potential spikes in electricity usage.

AI services demand significantly more electricity than non-AI alternatives. Studies have indicated that a single ChatGPT request consumes roughly 10 times as much electricity as a conventional Google search.

If Google were to switch all its search queries to AI, the company’s electricity consumption could skyrocket.
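
As a rough illustration of that gap, the back-of-the-envelope sketch below converts per-query figures into annual totals. All of the inputs are assumptions chosen for the arithmetic rather than figures from the reporting above: roughly 9 billion searches per day, about 0.3 Wh per conventional search, and the approximately 10x multiplier cited for an AI-assisted query.

```python
# Illustrative sketch only: every input here is an assumption made for
# the sake of the arithmetic, not a figure reported in the article.

SEARCHES_PER_DAY = 9e9    # assumed daily query volume
WH_PER_SEARCH = 0.3       # assumed energy per conventional search (Wh)
AI_MULTIPLIER = 10        # the rough factor cited above for an AI query

def annual_twh(wh_per_query: float) -> float:
    """Convert per-query energy (Wh) into annual terawatt-hours."""
    return SEARCHES_PER_DAY * wh_per_query * 365 / 1e12

print(f"Conventional search: {annual_twh(WH_PER_SEARCH):.1f} TWh/year")
print(f"AI-assisted search:  {annual_twh(WH_PER_SEARCH * AI_MULTIPLIER):.1f} TWh/year")
```

Under those assumptions, switching search entirely to AI would move the annual total from about 1 TWh to nearly 10 TWh, before any training or cooling overhead is counted.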

Most new AI services and products rely on LLMs, which are resource-intensive to develop and run. They require high-powered computer chips, which in turn need more cooling, further increasing electricity use.

Before AI’s rise, data centers were estimated to account for about one percent of global electricity demand.

In 2022, the IEA reported that data centers, cryptocurrencies, and AI together consumed 460 TWh of electricity worldwide, nearly two percent of total global electricity demand. The agency projected this figure could double by 2026, roughly matching Japan's annual electricity consumption.

Alex De Vries, a researcher and founder of the Digiconomist website, estimated AI’s electricity use by analyzing sales projections from NVIDIA, a leading supplier of AI-specialized servers.

He concluded that if NVIDIA’s 2023 sales projections were accurate and all servers operated at full capacity, they could consume between 85.4 and 134.0 TWh annually, comparable to the electricity usage of Argentina or Sweden.

De Vries noted that his estimates were conservative, as they did not account for cooling requirements. He added that the adoption of NVIDIA’s servers had surpassed projections, suggesting higher figures.
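
The shape of that calculation can be reproduced with the same kind of back-of-the-envelope arithmetic: projected server shipments multiplied by per-server power draw and by the hours in a year. The inputs below (about 1.5 million servers a year drawing 6.5 to 10.2 kW each, roughly a DGX-class machine) are illustrative assumptions rather than figures quoted in the article, though they happen to land on the cited range.

```python
# Sketch of a shipments-times-power-draw estimate. The server count and
# per-server wattage are assumptions for illustration, not reported figures.

HOURS_PER_YEAR = 24 * 365             # full-capacity, around-the-clock operation

servers_per_year = 1_500_000          # assumed annual AI server shipments
power_draw_kw = (6.5, 10.2)           # assumed per-server draw, low/high (kW)

for kw in power_draw_kw:
    twh = servers_per_year * kw * HOURS_PER_YEAR / 1e9   # kWh -> TWh
    print(f"{kw} kW per server -> {twh:.1f} TWh per year")
```

With those assumptions the low and high cases work out to about 85.4 TWh and 134.0 TWh a year, and, as De Vries noted, any cooling overhead would come on top of that.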

Fabrice Coquio of Digital Realty, a data center company, stated during a visit to one of its large facilities north of Paris in April that AI is set to transform the industry.
