The Hidden Environmental Cost of AI: A Call for Optimal AI Usage

As we marvel at the capabilities of advanced AI models like GPT-4, Llama 3, and others, an underlying environmental cost often goes unnoticed. Recent findings reveal that generating just 100 words with GPT-4 can consume up to three bottles of water. This might seem negligible in developed economies, but consider the rural parts of Africa and Asia, where women and children must walk miles to fetch water.

To put things in perspective, consider the following statistics about ChatGPT alone:

  • ChatGPT has over 200 million weekly active users.
  • ChatGPT has an estimated 77.2 million monthly active users in the US.
  • 23% of US adults have used ChatGPT as of February 2024.
  • In January 2024, users spent an average of 13 minutes and 35 seconds per web session on ChatGPT.
  • 92% of Fortune 500 companies use OpenAI’s products.
  • API usage has doubled since the release of GPT-4o Mini.

That’s a lot of water.
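The figures above can be turned into a rough order-of-magnitude estimate. The sketch below assumes about 1.5 liters per 100 words (three standard 0.5 L bottles, per the cited report) and a hypothetical 500 generated words per user per week; the per-user usage figure is an assumption for illustration, not a measured value.

```python
# Back-of-envelope estimate of weekly water use from ChatGPT inference alone.
LITERS_PER_100_WORDS = 1.5          # cited upper-bound estimate (~3 x 0.5 L bottles)
WEEKLY_ACTIVE_USERS = 200_000_000   # from the statistics above
WORDS_PER_USER_PER_WEEK = 500       # hypothetical assumption, for illustration only

weekly_words = WEEKLY_ACTIVE_USERS * WORDS_PER_USER_PER_WEEK
weekly_liters = weekly_words / 100 * LITERS_PER_100_WORDS
print(f"Estimated weekly water use: {weekly_liters / 1e9:.1f} billion liters")
# → Estimated weekly water use: 1.5 billion liters
```

Even with this deliberately modest usage assumption, inference alone lands in the billions of liters per week, which is why the water question deserves attention.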

So why this concern about water, and why now?

Water is essential for AI apps like ChatGPT because the data centers running their models generate significant heat during intense computational tasks, necessitating effective cooling systems that often rely on water to dissipate this heat. Evaporative cooling and chilled-water systems use water directly to cool servers, and even air-cooled systems consume water indirectly through electricity generation, since many power plants use water in their own cooling processes.

Moreover, maintaining optimal humidity levels in data centers requires water to prevent static electricity and other issues—all of which are amplified by the high computational demands and widespread usage of resource-hungry AI models, leading to increased water consumption and raising environmental concerns.

According to market research, the global AI industry is projected to exceed $190 billion by 2025, underscoring the exponential growth and integration of AI technologies across various sectors. However, this growth trajectory indicates that the environmental footprint of AI will continue to expand unless proactive measures are taken. It is very important that we account for sustainability costs during the design and implementation of powerful AI applications.

So, until Elon Musk (or perhaps Sam Altman) finds a way to meet all of AI's energy demands with solar power, it might be a good idea not to ask ChatGPT to generate a Korean-Cuban fusion recipe you will never cook.

References & Recommended Further Reading:

  1. https://iea.blob.core.windows.net/assets/6b2fd954-2017-408e-bf08-952fdd62118a/Electricity2024-Analysisandforecastto2026.pdf
  2. https://www.tomshardware.com/tech-industry/artificial-intelligence/using-gpt-4-to-generate-100-words-consumes-up-to-3-bottles-of-water-ai-data-centers-also-raise-power-and-water-bills-for-nearby-residents

Original article published by Senthil Ravindran on LinkedIn.
