ChatGPT is not the earth’s friend
A single hyperscale data centre facility can consume as much electricity in a year as 350,000 to 400,000 electric cars. Data centre electricity demand is set to grow to 460 terawatt-hours (TWh), and one terawatt-hour, a billion kilowatt-hours, can power about 100 million homes for an hour. The demand AI (artificial intelligence) places on data centres requires about 4.2–6.6 billion cubic metres of fresh water annually; there are 1,000 litres in a single cubic metre. Do you understand the problem?
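To make those magnitudes concrete, here is a back-of-envelope conversion (a rough sketch: the constants are exact unit conversions, and the figures are simply the ones quoted above):

```python
# Back-of-envelope scale check for the figures quoted above.
TWH_TO_KWH = 1e9                 # 1 terawatt-hour = 1 billion kilowatt-hours
LITRES_PER_CUBIC_METRE = 1_000   # exact unit conversion

demand_twh = 460                 # projected data centre electricity demand
water_m3 = (4.2e9, 6.6e9)        # AI's annual freshwater demand, cubic metres

print(f"{demand_twh} TWh = {demand_twh * TWH_TO_KWH:,.0f} kWh")
low, high = (v * LITRES_PER_CUBIC_METRE for v in water_m3)
print(f"AI water demand: {low:,.0f} to {high:,.0f} litres a year")
```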
A data centre, best understood as the backbone of the digital world and modern technology, is a physical facility that organisations such as Meta, Amazon, and Google use to house their critical applications and data. It is built on a network of computing and storage resources that deliver shared applications and data. The sheer growth in digital activity has driven a surge in data centre construction: as of March 2025, the USA held the most data centres in the world, with 5,426 facilities.
There are four types of data centre. Hyperscale data centres, owned by technology giants such as Meta and Amazon, are designed for highly scalable operations, contain over 5,000 servers with extensive infrastructure, span roughly 100,000–1,000,000 sq ft, and are the most energy-intensive. Enterprise data centres, about 10,000–50,000 sq ft, are run by private companies such as JPMorgan Chase and Coca-Cola. Colocation data centres, 5,000–10,000 sq ft, are operated by third-party providers that lease space to multiple tenants. Edge data centres, under 5,000 sq ft, serve industries such as healthcare, finance, and retail. The main problem with data centre energy and resource consumption lies with the hyperscale facilities, which have storage capacities of several exabytes (1 exabyte is 1 million terabytes).
As opinion around generative AI grows more positive, with companies and public figures embracing its intrusion into everyday life, it is becoming part of multiple systems: in education, where it is used for grading work and finding answers; in the corporate job market, where it streamlines hiring for internships and graduate roles; and in big tech, where it powers personalised recommendation systems on platforms such as Netflix, Instagram Reels, and TikTok. All of this puts an even larger strain on data centres, which need ever more resources to keep cool.
Hyperscale data centres host thousands of servers, each carrying multiple advanced chips. Producing these chips requires ultrapure water, essential for cleaning and rinsing silicon wafers, as even minuscule impurities can cause defects. The process needs about 1,500 gallons of piped water to create 1,000 gallons of ultrapure water, and a typical chip plant uses nearly 10 million gallons per day. As a result, each chip arrives at a data centre with a significant water footprint.
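Those two figures imply a sizeable intake overhead. A minimal sketch of the arithmetic, assuming the 1,500-to-1,000 ratio holds at plant scale and that the 10-million-gallon figure refers to ultrapure output (both simplifications):

```python
# Municipal intake needed to produce a given volume of ultrapure water (UPW),
# assuming the fixed 1,500 : 1,000 intake-to-UPW ratio quoted above.
INTAKE_PER_UPW_GALLON = 1_500 / 1_000  # 1.5x overhead

def intake_needed(upw_gallons: float) -> float:
    """Gallons of piped water drawn to yield `upw_gallons` of UPW."""
    return upw_gallons * INTAKE_PER_UPW_GALLON

daily_upw = 10_000_000  # gallons/day, the 'typical plant' figure
print(f"{intake_needed(daily_upw):,.0f} gallons of intake per day")
```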
On top of this, rising AI workloads generate more heat than conventional air cooling can handle. As chip power and the server density of hyperscale data centres climb, air can no longer remove enough heat from densely packed racks. Fresh water, which absorbs roughly 3,000 times more heat than the same volume of air and offers consistent quality with minimal mineral build-up, is used instead: in chillers, cooling towers, and liquid cooling systems that manage server heat. Water is also consumed in electricity generation, since most electricity comes from thermoelectric power plants that need it for steam production and cooling, and throughout the supply chain.
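The 3,000-times claim follows from textbook heat capacities. A quick check using standard constants (mine, not the article's sources):

```python
# Volumetric heat capacity: how much heat one cubic metre absorbs per degree.
# Constants are standard textbook values at room temperature.
water_c = 4_186    # J/(kg*K), specific heat of water
water_rho = 1_000  # kg/m^3, density of water
air_c = 1_005      # J/(kg*K), specific heat of air
air_rho = 1.2      # kg/m^3, density of air

ratio = (water_c * water_rho) / (air_c * air_rho)
print(f"Water holds ~{ratio:,.0f}x more heat per cubic metre per degree")
```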
The issue is that these huge data centres are mainly situated in regions where transmission capacity is already constrained.
You can’t just place a data centre anywhere, especially one as large as a hyperscale facility. It requires substantial real estate and resources, often competing with agriculture and housing. Data centres need sites where fresh water is abundant, land is plentiful and cheap, and users are nearby, which is why places like Greater Beijing, Dallas, and Oregon are chosen.
Residents living near data centres do not reap their rewards, prompting questions about sustainability. They report noise pollution (a constant low-frequency hum from the facilities' coolers); strain on the water supply, including weak water pressure, dry wells, and painfully expensive water bills; and pollutants contaminating the water and air. To make matters worse, big tech companies are planning to shift data centre construction to Africa and South America, extracting the limited water and energy those regions have in order to keep up with AI data centres' increasing demands.
As we continue to use chatbots like ChatGPT, where a single question takes nearly 10 times as much electricity to process as a Google search, we have to ask what this is doing to the environment. The integration of AI into almost every search platform, along with the bombardment of social media apps with ‘AI slop’ (low-quality content produced in huge volumes, lacking meaning, depth, and value, purely to boost engagement), further stresses our limited resources, pushing us further from the ambitious target of cutting net GHG (greenhouse gas) emissions by at least 55%.
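To see what that 10x multiplier means at scale, here is a hedged estimate; the ~0.3 Wh per Google search is a widely cited external figure rather than one from this article, and the daily query volume is a hypothetical input:

```python
# Rough extra energy cost of chatbot queries vs. classic search.
# 0.3 Wh/search is a commonly cited outside estimate (an assumption here);
# the article supplies only the ~10x multiplier.
SEARCH_WH = 0.3
CHATBOT_WH = SEARCH_WH * 10      # ~10x a search, per the article

queries_per_day = 1_000_000_000  # hypothetical daily chatbot volume
extra_kwh = queries_per_day * (CHATBOT_WH - SEARCH_WH) / 1_000
print(f"Extra demand: {extra_kwh:,.0f} kWh per day")
```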
It is up to governments and environmental regulators to halt the construction of hyperscale data centres, to ease the strain on valuable natural resources, and to enact laws forcing the megacompanies responsible for this environmental crisis to be transparent about their emissions, instead of letting them dodge the location-based accounting used to track CO2 emissions.