When Oregon State University students use artificial intelligence for their research papers, they may not realize the environmental cost of the technology behind them.
Wenqian Dong and Stephen Ramsey, assistant professors in the College of Engineering who specialize in AI, detailed how AI affects the environment and what the university is doing to curb those effects.
According to an email from Dong, AI models require significant computational power because of the volume of complex computations they perform.
“Training GPT-3 … is estimated to have consumed 1,287 MWh, equivalent to the yearly energy consumption of over 120 U.S. households,” Dong said. “The energy used in these processes typically comes from non-renewable sources, further amplifying the environmental impact.”
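Dong's comparison holds up as rough arithmetic. A minimal sanity check, assuming an average U.S. household uses about 10.5 MWh of electricity per year (a figure broadly in line with U.S. Energy Information Administration estimates, and our assumption rather than Dong's):

```python
# Rough check of Dong's figure: is 1,287 MWh really the yearly
# electricity use of "over 120 U.S. households"?
# ASSUMPTION: ~10.5 MWh per household per year (approximate EIA average).

GPT3_TRAINING_MWH = 1_287        # estimated energy to train GPT-3
HOUSEHOLD_MWH_PER_YEAR = 10.5    # assumed average annual household use

households = GPT3_TRAINING_MWH / HOUSEHOLD_MWH_PER_YEAR
print(f"Equivalent households: {households:.0f}")  # ~123, consistent with "over 120"
```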
Most of the energy demand created by AI systems comes from generative AI models such as the popular GPT-3 system developed by OpenAI, according to Dong.
In response to this energy demand, companies are beginning to establish emissions-neutrality plans, which will help lessen AI's impact on the environment.
“Institutions like the Allen Institute for AI, Lawrence Berkeley National Laboratory and various universities are leading the way in researching energy-efficient AI algorithms and hardware,” Dong said. “The goal is to innovate in a way that reduces the carbon footprint while still harnessing the transformative potential of AI.”
According to Ramsey, one of the universities conducting this research is OSU.
“The institution at the highest level is very interested in clean energy and very interested in artificial intelligence … As a land, sea, space and sun grant university, we have particular strength in some areas like hydropower. OSU is active in research in taking wave energy and tidal energy and turning it into electricity,” Ramsey said.
The trend is not limited to universities; companies are also trying to reduce carbon emissions. Google, for example, announced its largest offshore wind project in February 2024.
“Big tech companies that operate these enormous data centers … in the last 15 years have specifically looked to set up their data centers where they can get power cheaply and increasingly where they can get cold fresh water for cooling. That suggests companies will make decisions based on where to locate and operate their data centers which includes training AI models,” Ramsey said.
However, even though companies are increasingly seeking energy-efficient locations for their AI systems, that does not necessarily mean they will always make the greenest choice, especially when going green conflicts with the business incentive to acquire as much energy as cheaply as possible, according to Ramsey.
“It’s going to take a lot of encouragement from federal and state governments to make sure climate concerns are being addressed in a regulated manner,” Ramsey said.