Every time someone uses ChatGPT to write an essay, create an image, or get advice on planning their day, the environment pays a price.
Each query to the artificial-intelligence chatbot is estimated to require at least 10 times more electricity than a standard Google search.
If all Google searches used generative AI in the same way, they could consume as much electricity as a country the size of Ireland, calculates Alex de Vries, the founder of Digiconomist, a website that aims to expose the unintended consequences of digital trends.
Yet someone using ChatGPT or another AI application has no way of knowing how much energy their questions will consume as they are processed in tech companies’ massive data centers.
De Vries said the growing energy demand from AI technologies will undoubtedly force the world to burn more oil, gas and coal, which warm the climate.
“Even if we can power AI with renewable energy, we have to understand that renewable energy is limited and so we will use more fossil fuels elsewhere,” he said. “The end result will be increased carbon emissions.”
AI is also thirsty for water. ChatGPT gulps down about a 16-ounce bottle in just 10 queries, calculates Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside, and his colleagues.
The increasing consumption of energy and water by AI has raised concerns in California and around the world. Experts have detailed how it could delay the transition to green energy while raising consumers’ electricity bills and the risk of power outages.
To try to avoid these consequences, De Vries, Ren and other experts are calling on tech companies to disclose to users how much energy and water their queries will consume.
“I think the first step is to have more transparency,” Ren said. AI developers, he added, “tend to be secretive about their energy and water usage.”
Ren said users should be told, on the pages where they type in their queries, how much energy and water a search will require, much as Google tells people shopping for airline tickets how much carbon a flight will generate.
“If we had this knowledge, we could make more informed decisions,” he said.
Data centers, the massive warehouses of computer servers that power the internet, have long been energy guzzlers. But the specialized computer chips needed for generative AI consume far more electricity because they are designed to process vast amounts of data.
The new chips also generate so much heat that it takes even more energy and water to keep them cool.
Although the benefits and risks of AI are not yet fully understood, companies are increasingly integrating this technology into their existing products.
In May, for example, Google announced the addition of what it called “AI Overviews” to its search engine. Now, whenever someone types a question into Google Search, the company’s AI generates an answer from the search results, which appears at the top of the page.
Not all of the answers generated by Google’s AI have been correct, including one that told a user to add Elmer’s glue to pizza sauce to keep the cheese from sliding off the crust.
But users who don’t want these AI-generated responses, or who want to avoid the additional electricity and water consumption, cannot turn off the feature.
“Currently, users don’t have the option to opt out,” Ren said.
Google did not respond to questions from The Times.
OpenAI, the company that created ChatGPT, responded with a prepared statement but declined to answer specific questions, such as how much energy and water the chatbot uses.
“AI can be very energy intensive, so we’re constantly working to improve its efficiency,” OpenAI said. “We’re carefully considering how to best use our computing power and supporting our partners’ efforts to achieve their sustainability goals. We also believe AI can play a key role in accelerating scientific progress in discovering climate solutions.”
Three years ago, Google committed to reaching “net zero” (in which the greenhouse gases it emits would be balanced by those it removes) by 2030.
The company is not making progress toward that goal. In 2023, its total carbon emissions rose by 13%, the company revealed in a July report. Since 2019, its emissions have increased by 48%.
“As we further integrate AI into our products, reducing emissions may prove challenging due to the increasing demand for energy from the increased computational intensity of AI,” the company said in the report.
Google added that it expects its emissions to continue to increase before decreasing at some point in the future. It did not specify when that might happen.
The company also revealed that its data centers consumed 6.1 billion gallons of water in 2023, up 17% from the previous year.
“We are committed to developing AI responsibly by working to reduce its environmental footprint,” the report says.
De Vries said he was disappointed that Google did not disclose in its report how much AI was increasing its energy requirements. The company said in the report that such a “distinction between AI and other workloads” would be “meaningless.”
By not reporting AI power consumption separately, he said, it’s impossible to calculate exactly how much additional electricity Google Search now uses with the addition of AI Overviews.
“They are able to provide this information, but they are withholding it,” he said.