A low mechanical hum hangs at the edge of a data center campus in northern Virginia. The sound comes from thousands of servers stacked inside windowless buildings, each rack blinking with tiny lights. From the outside, the campus seems nearly silent. Inside, however, the machines are performing a massive task: processing the requests, queries, images, and ideas that people now routinely hand to artificial intelligence.
As the AI boom develops, it's hard to ignore that capability dominates the conversation. Smarter models. Quicker responses. Bigger breakthroughs. But stand next to those buildings and listen to the continuous whir of cooling fans, and a different thought starts to take hold: all of this intelligence has to live somewhere. And it turns out to be very costly to run.
| Category | Details |
|---|---|
| Topic | Hidden Economic and Environmental Costs of Artificial Intelligence |
| Key Organizations | OpenAI, Google, Microsoft, IBM |
| Industry | Artificial Intelligence & Cloud Computing |
| Key Issue | Rising compute costs, energy consumption, and data center expansion |
| Estimated Energy Use | Data centers consumed about 460 TWh globally in 2022 |
| Training Example | GPT-3 training used roughly 1,287 MWh of electricity |
| Major Infrastructure | Global network of AI data centers |
| Environmental Impact | Carbon emissions, water usage for cooling, strain on power grids |
| Reference | https://www.ibm.com/ |
Tech industry executives are beginning to speak about this more candidly. According to an IBM report, computing expenses associated with generative AI may rise by almost 89% between 2023 and 2025. A figure like that can sound abstract, but it has already forced difficult choices inside corporate budgets. Some businesses have quietly postponed AI initiatives. Others are carefully testing the economics before committing fully.
There’s a feeling that the tech sector is about to enter an odd new phase, where the issue isn’t whether AI works but rather whether anyone can afford to run it at scale.
Part of the answer lies in what engineers call compute. Training a large language model requires large clusters of specialized chips running for weeks or months. When researchers trained early models such as GPT-3, the process reportedly consumed roughly 1,287 megawatt-hours of electricity and produced hundreds of tons of carbon emissions. And that was just the training phase. The real surprise came later.
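Those two training figures are at least consistent with each other, as a rough sketch shows. The grid carbon intensity of 0.4 kg CO2 per kWh below is an illustrative assumption (actual intensity varies by grid and by year), not a figure from the report:

```python
# Sanity-check of the training figures above: reported energy times an
# assumed grid carbon intensity.
TRAINING_MWH = 1287       # reported GPT-3 training energy (see table)
KG_CO2_PER_KWH = 0.4      # assumed grid carbon intensity (illustrative)

# MWh -> kWh, kg -> metric tons
tons_co2 = TRAINING_MWH * 1000 * KG_CO2_PER_KWH / 1000

print(f"Estimated training emissions: {tons_co2:,.0f} metric tons CO2")
```

Roughly 500 metric tons under this assumption, which matches the "hundreds of tons" scale quoted above.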
When an AI assistant is asked to summarize an email, create an image, or clarify a legal provision, the model does what engineers refer to as inference. That silent exchange between servers occurs billions of times every day. According to some estimates, a single AI query might use about 100 times as much energy as a conventional web search.
Investors appear to think that demand will continue to rise. As revenue skyrocketed in late 2024, OpenAI raised billions more in funding. At one point, the company reportedly made about $300 million a month. Staggering figures. Behind the scenes, though, much of that money goes toward paying massive infrastructure bills.
Giant data centers are rising at a startling rate across the US, Europe, and parts of Asia. Along highways in rural Texas or northern Virginia, you may now notice what looks like a new warehouse complex taking shape. Slabs of concrete. Rows of electrical transformers. High-security fencing. To the untrained eye they look ordinary. In reality, they are AI factories.
Inside those buildings, thousands of GPUs process data streams around the clock, drawing enormous amounts of power from neighboring electrical grids. The effects are already being felt in some areas. Near data center clusters, electricity prices have risen sharply over the past five years. Local utilities are quietly building new transmission lines in anticipation of even greater demand.
The AI industry may have undervalued this aspect of the problem.
The environmental side of the story is even more intricate. Data centers generate a lot of heat and therefore require cooling systems. Many facilities use water to keep servers from overheating. A typical data center might use about 1.7 liters of water for every kilowatt-hour of electricity consumed. Multiply that by thousands of machines running continuously, and the numbers climb quickly.
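To see how quickly, consider a hypothetical large facility. The 100 MW continuous draw below is an assumed facility size for illustration; only the 1.7 liters-per-kWh figure comes from the estimate quoted above:

```python
# Rough sketch of the water figure quoted above: ~1.7 liters per kWh.
WATER_L_PER_KWH = 1.7     # water-use figure cited in the article
FACILITY_MW = 100         # assumed continuous draw of a large data center

kwh_per_day = FACILITY_MW * 1000 * 24          # MW -> kW, times 24 hours
liters_per_day = kwh_per_day * WATER_L_PER_KWH

print(f"Daily electricity: {kwh_per_day:,.0f} kWh")
print(f"Daily water use:   {liters_per_day:,.0f} liters")
```

Under these assumptions, a single campus would draw roughly four million liters of water a day, enough to make the "cloud that drinks water" line below feel less like a joke.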
Some communities have started asking awkward questions. Who benefits from these facilities? And who bears the environmental cost?
Last year, a local in Arizona reportedly made a joke about a new data center project, calling it “a cloud that drinks water.” Because it encapsulated something genuine, the phrase endured. Although AI may seem abstract and digital, its physical presence is becoming more apparent.
Tech firms maintain that they are developing solutions. Google and Microsoft have committed to lowering carbon emissions and using renewable energy to power data centers. These pledges are important. However, the math is still unclear, particularly as models get bigger and demand for AI tools continues to grow worldwide.
Geopolitics is currently being shaped by the race for compute. In the upcoming ten years, nations with robust chip manufacturing, affordable electricity, and dependable infrastructure may have a significant advantage. Countries without those resources may be left out of the AI economy.
All of this carries a peculiar irony. Artificial intelligence is often presented as a technology with boundless potential: cloud-based algorithms free from physical limits. Dig deeper, though, and the story becomes more grounded. Cooling towers and cables. Power plants. Water systems.
The machines may be digital. The price isn't. Watching the industry take off, it feels like we are still in the early chapters of this story. AI will most likely change society and business; that much seems clear. How much infrastructure, money, and energy the world is willing to spend to make it work is still an open question.

