
The environmental costs of AI

Dearbhlá McMenamin explores the consequences of AI on our natural systems.


The popularity of AI has soared in recent years, with the integration of AI systems such as ChatGPT into our daily lives (Image by Solen Feyissa via Unsplash)

 

AI has gained popularity through the integration of AI systems into our daily lives, with an estimated two thirds of the population now using it every day. Artificial intelligence (AI) refers to software trained on large amounts of data to recognise patterns and respond to prompts. Microsoft and Google now offer their own AI assistants, Copilot and Gemini respectively, which can complete a range of tasks using data from across the internet. Although there are many benefits to these systems, including time efficiency, there are significant drawbacks, particularly for energy use and the environment.

 

The rise of AI

 

AI (artificial intelligence) has been in development since the 1950s and 60s, with Alan Turing first posing the question ‘Can machines think?’ in his 1950 paper ‘Computing Machinery and Intelligence’. Computer scientist and cognitive researcher John McCarthy made significant contributions to the field, coining the term ‘artificial intelligence’ in 1955 in his proposal for a workshop at Dartmouth College. These early foundations were built upon as computer programs became more efficient and complex and larger amounts of data were gathered, allowing better AI models to be trained for tasks such as image and audio recognition. From this point forward, knowledge of and interest in the field grew, and by the late 1990s AI had begun to gain global recognition, snowballing with the creation of new products with built-in AI functions.

 

By 2025, an estimated 378 million people will be using AI, 116 million more than five years ago. In today’s technological market, hundreds of companies have integrated AI into their products; large corporations including Google, Meta, and Amazon all use AI in their processes and products. Apple’s Siri enables users to ask for directions hands-free whilst driving, whilst the Amazon Alexa can read recipe instructions aloud as the user cooks. Some systems, such as ChatGPT, also offer image recognition and image creation, which can be useful when searching a large dataset for particular features. AI is clearly useful for reducing the time spent on tedious, repetitive tasks. However, several drawbacks are materialising as real concerns as the popularity of AI systems continues to rise.

 

Data Centres

 

Data centres currently account for approximately 2% of global electricity demand. Located across the world, they deliver computing services such as storage, software and analytics, including AI. By 2027, the energy consumed annually by the rise in AI use is expected to match the annual electricity requirement of the Netherlands. Data centres are also reliant on large amounts of energy, water and other resources. In Europe, the countries with the highest growth in electricity consumption by data centres include Germany, Ireland, and Spain, and some very large data centres consume as much electricity as a small city.

 

Energy Usage & Carbon Emissions


By consuming large amounts of energy, data centres are also associated with large greenhouse gas (GHG) emissions. The International Energy Agency (IEA) estimates that a ChatGPT query uses ten times as much electricity as a regular Google search. The electricity consumption of the world’s data centres, 352 TWh in 2023, is projected to rise to 857 TWh by 2028.

 

As a large share of electricity generation globally is still fuelled by fossil fuels, such as coal-fired power plants, this demand produces a high amount of CO₂. Morgan Stanley research claimed in 2024 that the global data centre industry was set to emit 2.5 billion tonnes of CO₂ into the atmosphere by the 2030s. As demand for AI processing increases, the electricity demand of data centres is expected to grow rapidly. Although renewable energy could power these data centres, the majority are located in the US, which hosts over 5,000 of them, and the US energy network is still heavily dependent on fossil fuels: in 2023, around 60% of the country’s energy system was powered by fossil fuels.

 

Water Usage


Data centres run 24/7, storing and processing information, and these activities generate large amounts of heat, so they require constant cooling. Many cooling systems rely on evaporating water, which can diminish local water sources in the data centre’s area. In the USA, data centres consume 1.7 billion litres of water daily, and reporting is patchy: fewer than 33% of centres measure their water consumption. There are also reports that some of the biggest tech giants, such as Google and Amazon, operate data centres in parts of the world already facing threats from drought, including areas within the US, with plans to develop more data centres in Thailand and Chile, countries that already face water scarcity.

 

In Spain, communities have protested under the ‘Your cloud is drying my river’ campaign. In December 2024, Amazon requested permission to increase its water consumption limits in the Aragon region of Spain by 48%. Locals worry this will harm the area’s ability to cope with climate change: as global temperatures increase, so does the likelihood of heatwaves and extreme temperatures. In recent years Spain has seen an increase in heat-related deaths linked to the climate crisis, ranking first in Europe for its heat-related mortality rate; more than 15,000 people died in Spain due to high temperatures in 2022.

 

Methods of mitigation


The first way to mitigate the energy demand of AI is to reduce individual usage. Using a regular search engine instead of an AI assistant requires less energy overall. Not only does this save environmental resources, it will more than likely give you more accurate information, as AI can often provide incorrect or biased results. Sourcing information independently also encourages independent problem-solving and thinking; we do not want to become so reliant on AI that it inhibits independent learning.

 

Powering data centres with renewable energy such as solar power would reduce reliance on fossil fuels, although this comes with an initial expense and requires many metals for construction. As technology develops, better systems to optimise server operation and more efficient cooling also have the potential to reduce the energy consumed by AI. These strategies should be implemented sooner rather than later, and not only at current data centres but at those yet to be built.

 


About the Author:

Dearbhlá is the News & Politics Editor for WILD. She has recently completed her BSc Environmental Science at The University of York and will graduate July 2025. Her interests include air quality, climate change and pollution.

 




© 2025 by Wild Magazine
