Artificial intelligence can help tackle climate change, but to fulfill that promise companies need to find a way to limit AI’s own climate impact.
Information and communications technologies already contribute up to 3% of global greenhouse-gas emissions, according to many estimates. Data centers emit about as much as the aviation industry and consume hefty amounts of water. And as AI grows, so will the energy required to train and run its large language models.
However, AI also can cut emissions by making systems more efficient.
Alphabet’s Google and American Airlines used AI to help planes create fewer vapor trails, which contribute to global warming. Google also uses it to forecast river floods and to recommend eco-friendly routes on its maps service. San Francisco-based startup Verse is using AI to simplify the process for companies to get clean power. AI is even being used to generate images of what a warmer world will look like, from depicting ocean encroachment on coastal cities to mapping arid lands prone to wildfires.
As AI becomes more integrated into society, achieving a neutral, if not positive, net climate impact is crucial. For many companies using AI, the technology affects carbon emissions and water use in both directions, which makes for quite a balancing act as they pursue ambitious net-zero targets.
Imbalance of power
Sasha Luccioni, a research scientist at AI-application developer Hugging Face, worked with two other researchers to map the lifetime carbon footprint of a machine-learning model with 176 billion parameters named Bloom—and what they learned was startling.
The factors beyond the energy used in training the models, which most research in the field doesn’t account for, “wound up being so significant they doubled total emissions,” she said.
For instance, manufacturing a graphics-processing unit, the hardware that speeds up deep-learning computations, requires pure water and rare metals, adding to the climate cost. Bloom was trained on more than a thousand of these GPUs, and hardware is just one of many external factors Luccioni’s group evaluated in their research.
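To illustrate the kind of accounting Luccioni’s team did, here is a minimal sketch that adds embodied hardware emissions, amortized over the share of each GPU’s lifetime spent on the run, to operational energy emissions. Every number below is an illustrative assumption, not a figure from the study:

```python
# Sketch of lifecycle carbon accounting: operational (energy) emissions
# plus embodied (hardware-manufacturing) emissions, amortized over the
# share of each GPU's lifetime spent on the training run.
# All numbers are illustrative assumptions, not data from the study.

grid_kg_co2_per_kwh = 0.06      # assumed low-carbon grid intensity
training_energy_kwh = 433_000   # assumed energy drawn by the run
num_gpus = 1_000                # "more than a thousand" GPUs, per the article
embodied_kg_per_gpu = 150.0     # assumed manufacturing footprint per GPU
training_days = 120             # roughly March to July
gpu_lifetime_days = 4 * 365     # assumed 4-year hardware lifetime

operational_kg = training_energy_kwh * grid_kg_co2_per_kwh
embodied_kg = num_gpus * embodied_kg_per_gpu * (training_days / gpu_lifetime_days)

print(f"operational: {operational_kg / 1000:.1f} t CO2e")
print(f"embodied (amortized): {embodied_kg / 1000:.1f} t CO2e")
print(f"total: {(operational_kg + embodied_kg) / 1000:.1f} t CO2e")
```

The point of the sketch is the structure, not the totals: once embodied emissions are counted, the operational figure alone can understate the true footprint by a wide margin.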
But models of similar parameter size to Bloom, such as OpenAI’s GPT-3, had significantly higher carbon emissions, more than 20 times as much, and consumed around three times as much power as Bloom.
Energy sources are a large contributing factor in the emissions discrepancy, Luccioni said. If the electricity used to train a model comes from a cleaner source, that can cut carbon emissions even without changing the size of the large language model.
In the U.S., where there is no single national electric grid, training a model in one state versus another can significantly change its carbon emissions. In California, with its large amounts of wind power, emissions could be lower than if the exact same model were trained in a state like Virginia, which relies mostly on fossil fuels for power.
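As a rough illustration of why location matters, the same energy draw multiplied by two different grid carbon intensities yields very different totals. The intensities below are assumed round numbers for illustration, not official figures for any state:

```python
# Same training energy, two hypothetical grid carbon intensities.
training_energy_mwh = 1_000          # assumed energy for one training run

grids = {
    "low-carbon grid": 250,          # assumed kg CO2 per MWh (wind/solar-heavy)
    "fossil-heavy grid": 700,        # assumed kg CO2 per MWh
}

for name, kg_per_mwh in grids.items():
    tonnes = training_energy_mwh * kg_per_mwh / 1_000
    print(f"{name}: {tonnes:.0f} t CO2e")
```

With these placeholder numbers, the identical run emits 250 versus 700 tonnes, nearly a threefold difference from siting alone.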
Even if Bloom does better by environmental standards than other large language models, its footprint is still substantial. Training it from March to July 2022 used enough energy to power the average American home for 41 years, according to an April 2023 Stanford report. The training run also produced 25 times the emissions of one passenger’s round-trip flight between New York and San Francisco, and more than a third more than the average American generates in a year.
In deep water
AI is also very thirsty, according to research this year from Shaolei Ren, a professor of electrical and computer engineering at the University of California, Riverside.
His work shows ChatGPT needs to “drink” a 500-milliliter bottle of water for a basic conversation of 20 to 50 inquiries, depending on where the electricity is generated. GPT-4 likely consumes more than that, though secrecy around the platform and its training means too little data is public to make an accurate estimate.
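Taken at face value, the 500-milliliter figure implies a per-inquiry water cost along these lines. This is a back-of-the-envelope reading of the article’s numbers, not Ren’s methodology, and the daily query volume is an assumed figure:

```python
# Back-of-the-envelope water cost per inquiry implied by the article's figures.
bottle_ml = 500
queries_low, queries_high = 20, 50   # "20 to 50 inquiries" per bottle

print(f"{bottle_ml / queries_high:.0f} to {bottle_ml / queries_low:.0f} ml per inquiry")

# At, say, 10 million inquiries a day (assumed volume), even the low end adds up:
daily_liters = 10_000_000 * (bottle_ml / queries_high) / 1_000
print(f"~{daily_liters:,.0f} liters per day at 10M inquiries")
```

A few milliliters per question sounds trivial; at chatbot scale it becomes a water bill measured in the hundreds of thousands of liters a day.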
His group’s estimates for one of Google’s large language models, LaMDA, put water use on the order of a million liters for training alone. Google’s overall on-site data-center water consumption in 2022 rose roughly 20% from 2021.
“Wherever we use water, we are committed to doing so responsibly. This includes using alternatives to freshwater whenever possible, like wastewater, industrial water, or even seawater,” the company said in a blog post late last year. Google has a 2030 target to replenish 120% of the water it consumes, on average, across its offices and data centers.
Research from Google estimated its carbon emissions from training the LaMDA model at 26 tons, or the equivalent of about 22 passengers on a round-trip flight between San Francisco and New York.
As with electricity use, water use can become more efficient depending on where it is drawn from.
Microsoft said last year that its Asian data centers’ actual water-use effectiveness was three times worse than that of the company’s locations in the Americas, meaning water use for identical AI training could triple based on location. That is because it is typically warmer in Asia, which necessitates water-cooled chillers.
Ren’s work showed that seasonal changes make huge differences, too: data centers can require more water in summer because more of it evaporates in the heat. Accounting for both season and data-center location, his estimates showed the highest total water footprint could be more than triple the lowest.
Limiting the impact
Ren’s research identifies optimization methods that balance data centers’ carbon emissions and water use while spreading the burden geographically, so that no single region bears the brunt of the environmental cost.
Ren said that because AI work, both training and use, happens over the internet, it can be simple to take steps like changing which data center a task is sent to. Moving requests to data centers powered by clean energy, or to those in cooler regions that use less water, is a change that can add up for the climate. “There’s not much difference from the user perspective,” Ren said.
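A minimal sketch of the kind of geographic routing Ren describes might look like the following. The sites, weights and per-site carbon and water scores are all made up for illustration, not drawn from his papers:

```python
# Toy scheduler: route an AI job to the data center with the lowest
# combined environmental score. All sites and numbers are hypothetical.

sites = {
    "site_a": {"kg_co2_per_kwh": 0.25, "liters_per_kwh": 1.4},
    "site_b": {"kg_co2_per_kwh": 0.60, "liters_per_kwh": 0.5},
    "site_c": {"kg_co2_per_kwh": 0.30, "liters_per_kwh": 0.8},
}

def score(site, carbon_weight=0.5, water_weight=0.5):
    # Weighted sum of carbon and water intensity; a real system would
    # normalize units and account for latency, capacity and time of day.
    return (carbon_weight * site["kg_co2_per_kwh"]
            + water_weight * site["liters_per_kwh"])

best = min(sites, key=lambda name: score(sites[name]))
print(f"route job to {best}")
```

The interesting design question is the weighting: shifting load to a cool, water-efficient region can raise carbon emissions if that region’s grid is dirtier, which is why Ren frames it as a balancing problem rather than a single optimization.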
Google, for instance, says it is using AI to accelerate climate action by arming individuals and organizations with better information to make choices. “We have used tested practices to reduce the carbon footprint of workloads by large margins, helping reduce the energy of training a model by up to 100x and emissions by up to 1,000x. We plan to continue applying these tested practices and to keep developing new ways to make AI computing more efficient,” said Amalia Kontesi, a Google spokesperson.
Equinix, an internet-services company with 248 data centers, is also looking for a solution. The company “is committed to designing, building and operating highly efficient data center infrastructure that is powered by clean energy,” said Christopher Wellise, vice president of sustainability.
Equinix covered 96% of its operational load with renewable energy last year, its fifth consecutive year above 90%, according to the company’s fiscal 2022 sustainability report. It also backed five new solar farms in Spain earlier this year to help secure sustainable power for its data centers.
Meanwhile, a practical step in limiting emissions would be to not integrate AI into platforms that don’t need it, said Hugging Face’s Luccioni.
“The cost of all these shiny new toys is unsustainable if you’re just switching out all these technologies that are working quite well to begin with, with these much more energy-intensive applications,” Luccioni said.