
The Hidden Costs of AI

Written by Office of Sustainability student associate Meghan Jachna, Class of 2027.

Generative AI (artificial intelligence) tools such as ChatGPT and other large language models have had a significant impact on almost every facet of society, and in education and university settings especially, generative AI is transforming the way students and professors conduct their everyday work. With 24/7 access to streamlined information, students now rely heavily on AI technology across all areas of education. A 2024 study from the Digital Education Council found that 86% of students use AI in their studies and that ChatGPT is the most widely used AI tool among students. WashU has even launched its own version of ChatGPT, further signifying the prominent role that AI now plays in academia.

With the growing prevalence of AI in universities and many other areas of society, most people are aware of its numerous benefits. However, the environmental costs of this tool are often overlooked. The first major problem with AI is its massive energy use and high carbon emissions. The International Energy Agency’s 2024 report projected that by 2026, the energy used by AI, data centers, and cryptocurrency would equal the amount of energy used by the entire country of Japan. Additionally, generative AI alone is “expected to consume 10 times more energy in 2026 than it did in 2023.” One reason generative AI uses so much more energy than a Google search, an email, or a social media post is that its machine learning component requires a significant amount of energy to train on and process data in order to generate highly tailored responses to the individual requests of millions of different users. The data centers used to run AI require more energy-intensive systems to process this higher quantity of data than other technologies do, and they often run primarily on gas- or coal-powered electricity.

Training a single ChatGPT model consumes electricity equivalent to the “annual electricity consumption of 120 American households,” which is concerning, especially as new models of ChatGPT continue to be released. WashU recently upgraded to OpenAI’s new GPT-4o “omni” model, which the university describes as the “most advanced artificial intelligence model capable of complex problem solving,” noting that the model “will provide better functionality to WashU users, with text generation at double the speed.” However, every new model that is released requires a new round of AI training, which means another large energy investment. Is a slightly faster and more advanced model worth the environmental costs of the major increase in energy use and carbon emissions? It is estimated that generating a customized response with an AI tool uses around 30 times more energy than simply taking the information from its source. Therefore, as AI becomes a more common tool on campus, it is essential that students and faculty use it carefully and understand its environmental impact.

While many tech companies say they are working on ways to power their AI models with more renewable energy, progress toward this goal has been relatively slow. The data centers that power generative AI need a stable source of electricity, and many renewable sources fluctuate more than data centers can tolerate. Additionally, existing renewable energy infrastructure is not yet reliable or extensive enough to supply the massive amount of energy demanded by the rapidly growing use of generative AI. And tech companies such as Google only plan to expand their energy use: Google says it will soon build more data centers right here in Missouri and elsewhere in the Midwest. These data centers are disruptive to the communities they are built in, causing noise pollution as well as “driving up residents’ power bills and taxing the electric grid.” It is clear, then, that concerns around AI are not simply a matter of data in the cloud but an issue affecting the well-being of real communities across the U.S.

The high energy use associated with generative AI is not the only environmental concern of this increasingly popular tool. The large amount of energy used to generate responses for tools like ChatGPT produces substantial heat, which must be removed quickly to keep the systems functioning. Water systems are used to cool the servers in data centers, and the amount of water involved is far from small. Google’s data centers used around 5 billion gallons of fresh water for cooling in 2022, a 20% increase from 2021. Additionally, it is estimated that water usage from AI could reach around 1.7 trillion gallons by 2027, which is “more than the total annual water withdrawal of…half of the United Kingdom.” These are staggering statistics, and this level of water use will not be sustainable long-term.

So, what can be done about all of this? Generative AI is a powerful new tool, and simply discouraging its use altogether is not a realistic solution. Now that generative AI responses are a feature of a simple Google search, most people are accessing AI tools every day. The question is whether the responsibility for reducing the environmental damage associated with AI should fall on the shoulders of individual users or on the tech companies and larger institutions creating and promoting its use.

The first step toward change begins with education on many different levels. First of all, tech companies need to be more transparent about their energy and water use. They are not required to report either, so there is little public awareness of these environmental concerns. Putting in place more policies requiring companies to report the environmental impact of their AI systems would help hold large tech companies accountable. Additionally, the institutions and businesses promoting AI as an educational tool need to provide students and employees with more education about the costs that come with a daily ChatGPT search. Making more people aware that a 100-word email generated by an AI chatbot uses more than a bottle of water can lead them to use AI more sparingly. Encouraging an atmosphere of “digital sobriety,” in which people are more conscious of their everyday technology use, can add up. Finally, implementing an Energy Star-style rating for AI models to help the public identify the most environmentally sustainable options could spark more individual concern about this topic.

Furthermore, more technically focused improvements could begin to address this issue. There are more efficient ways to train AI models that are not always fully utilized, and building data centers in regions with more reliable access to renewable energy can help encourage the transition to cleaner energy in this industry.

One day, AI might serve as an effective tool for addressing some of today’s most severe environmental problems. But as long as the public is largely left in the dark about what their daily AI use is actually doing to the planet, there will not be change. Increased education, transparency, and awareness about the true environmental cost of AI need to begin now.

No AI was used to generate this story.