The True Environmental Cost of Each AI Query We Make

Discover how the rapid rise of AI is pushing tech companies to rely on power plants to meet the energy demands of AI data centers.
 
AUSTIN, Texas - Jan. 15, 2025 - PRLog -- Using AI to answer your questions may be costlier than you realize, especially for the environment.

Since ChatGPT burst onto the scene, the adoption of Large Language Models (LLMs) has grown rapidly.

For example, OpenAI's ChatGPT is believed to have 200 million active users each month, while Google is adding AI-generated summaries to its traditional search results – broadening AI's reach to its entire user base.

To this, we can add Meta's user base as well, as the company adds AI features to its Facebook, Instagram, and Threads applications.

And let's not forget generative AI tools capable of producing computer-generated still images, such as Midjourney and Stable Diffusion, which are now expanding into video generation as well.

But what impact is this rapid adoption of AI having on the environment?

You may not realize it, but each LLM query (such as asking ChatGPT or Google for an AI-based answer) uses significantly more energy than a traditional search.

How much more?

According to John Hennessy, the Chairman of Alphabet (Google's parent company), each query sent to a large language model (LLM) currently uses roughly 10 times more energy than a traditional search query. (Hennessy does believe the cost will come down over time, a prediction we will look at in more depth below.)
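
To get a rough sense of what that multiplier means at scale, here is a back-of-envelope sketch in Python. The 0.3 watt-hour figure for a traditional search and the one-billion-queries-per-day volume are illustrative assumptions, not figures from Hennessy; only the 10x multiplier comes from the quote above.

# Back-of-envelope sketch of the ~10x energy gap described above.
# The per-search baseline and the daily query volume are assumptions
# chosen for illustration; treat the output as a rough scale estimate.

SEARCH_WH_PER_QUERY = 0.3            # assumed energy per traditional search query (Wh)
LLM_MULTIPLIER = 10                  # Hennessy's rough 10x factor for LLM queries
QUERIES_PER_DAY = 1_000_000_000      # hypothetical daily query volume

llm_wh_per_query = SEARCH_WH_PER_QUERY * LLM_MULTIPLIER

search_daily_mwh = SEARCH_WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000   # Wh -> MWh
llm_daily_mwh = llm_wh_per_query * QUERIES_PER_DAY / 1_000_000

print(f"Traditional search: {search_daily_mwh:,.0f} MWh per day")
print(f"LLM-backed search:  {llm_daily_mwh:,.0f} MWh per day")
print(f"Additional energy:  {llm_daily_mwh - search_daily_mwh:,.0f} MWh per day")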

As companies race to implement LLMs into every imaginable application, this order-of-magnitude jump in energy usage will create shockwaves through the energy markets – and make it that much harder to meet the goals of reducing fossil fuel emissions.

Rising Demand for AI-Based Queries is Putting Increased Pressure on Fresh Water Resources Used by Data Centers

The rapid adoption of LLMs is not just driving up demand for increased energy production, it's also having a major impact on water demand – water that is used to cool all the new data centers being constructed around the world.

For example, researchers at UC Riverside and UT Arlington (doi.org/10.48550/arXiv.2304.03271) report that by 2027, global AI use could require between 4.2 and 6.6 billion cubic meters of water.

To put that number in context, it's a little less than 1% of annual US water withdrawals (roughly 444 billion cubic meters as of 2020). That may seem like a relatively small amount today, but given that many data centers are being built in places with rapidly depleting groundwater...
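
As a quick sanity check on that percentage, here is the arithmetic spelled out in Python. The 4.2 to 6.6 billion cubic meter range comes from the paper cited above; the roughly 444 billion cubic meter US figure is the approximate 2020 total referenced in the text.

# Worked check of the water comparison above: projected 2027 AI water
# demand versus approximate total annual US water withdrawals.

AI_WATER_LOW_M3 = 4.2e9           # lower bound of projected 2027 AI water use (cubic meters)
AI_WATER_HIGH_M3 = 6.6e9          # upper bound (cubic meters)
US_ANNUAL_WITHDRAWAL_M3 = 444e9   # approximate US total for 2020 (cubic meters)

low_pct = AI_WATER_LOW_M3 / US_ANNUAL_WITHDRAWAL_M3 * 100
high_pct = AI_WATER_HIGH_M3 / US_ANNUAL_WITHDRAWAL_M3 * 100

print(f"Projected AI water use vs. US withdrawals: {low_pct:.1f}% to {high_pct:.1f}%")
# Output is roughly 0.9% to 1.5%, i.e. "a little less than 1%" at the low end.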

Read more...https://formaspace.com/articles/tech-lab/the-true-environ...

Contact
mktg@formaspace.com
800-251-1505
End
Tags: Artificial Intelligence
Industry: Environment
Location: Austin - Texas - United States