Not long ago, a data center was just a room with servers. Maybe a warehouse. Maybe a few racks humming quietly in the background. These facilities powered websites, emails, medical records, and online stores. They were important, but they were not monsters.
Today, data centers are becoming something else entirely.
They are becoming the new power plants.
And once you see it, you cannot unsee it.
The biggest technology companies in the world are now building data centers that use as much electricity as entire cities. Some are so large that utilities are planning new natural gas plants just to keep them running. Others are competing for water in drought-prone regions to cool their equipment. Communities are being asked to host these massive facilities with promises of jobs and innovation, while quietly absorbing the costs.
If this sounds familiar, it should.
It is the same story we have heard before with fossil fuel power plants, desalination projects, hydrogen hubs, and other mega-infrastructure. Bigger is presented as better. Centralization is framed as efficiency. And the public is told there is no alternative.
But there is.
First, let’s talk about why these new data centers are so huge.
The answer is artificial intelligence.
Training and running AI systems require enormous computing power. Instead of a few servers handling emails or websites, companies now deploy tens of thousands of specialized chips working around the clock. These chips generate tremendous heat and consume staggering amounts of electricity.
A traditional data center from the early 2000s might have used a few megawatts of power. That is roughly enough electricity for a few thousand homes.

A modern AI data center can demand hundreds of megawatts. That is closer to the energy needs of a medium-sized city.
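Where do those numbers come from? Here is a rough back-of-envelope sketch. Every figure in it (chip count, watts per chip, cooling overhead, homes per megawatt) is an illustrative assumption, not a reported spec for any particular facility:

```python
# Back-of-envelope estimate of AI data center power demand.
# All inputs are illustrative assumptions, not vendor figures.

num_accelerators = 50_000   # specialized AI chips in one campus
watts_per_chip = 1_000      # draw per accelerator, including its server
pue = 1.3                   # overhead factor for cooling, networking, losses

it_load_mw = num_accelerators * watts_per_chip / 1_000_000
total_mw = it_load_mw * pue

homes_per_mw = 800          # rough U.S. household equivalence
print(f"IT load: {it_load_mw:.0f} MW")
print(f"Total facility demand: {total_mw:.0f} MW")
print(f"Roughly equivalent to {total_mw * homes_per_mw:,.0f} homes")
```

Even with these modest assumptions the total comes out around 65 megawatts. Multiply the chip count a few times over and you reach the hundreds of megawatts described above.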
Suddenly, a building that once looked like an office park now looks like industrial infrastructure. Because that is exactly what it is.
But here is where the story gets interesting.
Companies say these giant facilities are necessary because they are more efficient. They argue that concentrating computing in massive campuses lowers the cost of delivering digital services.
And in a very narrow sense, that is true.
Just as a giant power plant can produce electricity more cheaply per kilowatt-hour than a small generator, a giant data center can deliver computing more cheaply per task.
But that is not the whole story.
When infrastructure gets bigger, demand tends to grow to match it. This is called the rebound effect. Cheaper computing encourages more computing. More data storage. More streaming. More automation. More AI.
Soon, the total energy use rises, even if each individual task becomes more efficient.
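To make the rebound effect concrete, here is a toy calculation with invented numbers: each task becomes twice as efficient, but cheaper computing invites three times as many tasks, so total energy use still rises.

```python
# Toy illustration of the rebound effect (invented numbers only).
energy_per_task_before = 1.0   # arbitrary energy units
tasks_before = 100

energy_per_task_after = 0.5    # each task is now twice as efficient
tasks_after = 300              # cheaper computing invites 3x the demand

total_before = energy_per_task_before * tasks_before   # 100 units
total_after = energy_per_task_after * tasks_after      # 150 units

print(f"Total energy before: {total_before}")
print(f"Total energy after:  {total_after}")  # efficiency up, consumption up too
```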
We have seen this pattern before.
Utilities built large centralized power plants because they were supposedly cheaper. Then they encouraged customers to use more electricity to justify those investments. Over time, the system became bigger, more expensive, and harder to manage.
Now the tech industry is following the same path.
Data centers are clustering in regions with cheap land, tax incentives, and access to electricity. Local governments often offer generous subsidies to attract them. Ratepayers may end up funding grid upgrades. Water supplies are strained. And communities are left dealing with the environmental impacts.
Meanwhile, the benefits are not always what people expect.
These facilities are highly automated. They create relatively few permanent jobs. And once they are built, they lock communities into decades of infrastructure that consumes energy and resources.
So here is the obvious question.
Do we really need infrastructure on this scale?
Or is there another way?
The answer is yes. And it already exists.
It is called distributed computing.
Instead of relying on a handful of enormous data centers, computing can be spread across many smaller systems. Think of local servers in schools, hospitals, businesses, and homes. Think of neighborhood data hubs. Think of devices working together instead of depending on a single giant facility.
If this sounds familiar, it should.
It is the same idea behind rooftop solar panels and virtual power plants. Instead of building one massive generator, energy is produced and managed across thousands of smaller sources. The system becomes more flexible, more resilient, and often more affordable.
The same logic applies to computing.
You can already run your own cloud storage at home. You can host websites on small servers. You can even run powerful AI tools on a personal computer. These technologies are not science fiction. They are available today.
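None of this requires exotic tools. As one small illustration of the "host websites on small servers" point, Python's standard library alone can turn any spare machine into a tiny web server. This is a minimal sketch; the port number is arbitrary:

```python
# Serve the current directory over HTTP from any spare machine.
# Uses only Python's standard library; visit http://localhost:8000
from http.server import HTTPServer, SimpleHTTPRequestHandler

PORT = 8000  # any free port works

server = HTTPServer(("0.0.0.0", PORT), SimpleHTTPRequestHandler)
print(f"Serving on port {PORT}; press Ctrl+C to stop")
server.serve_forever()
```

Real self-hosted setups layer security and backups on top of something like this, and open projects such as llama.cpp and Ollama bring the same do-it-yourself spirit to running AI models on personal hardware.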
The problem is not technology.
The problem is incentives.
Large centralized projects attract investors. They generate predictable revenue. They make it easier for companies to control data, pricing, and infrastructure. And they often qualify for public subsidies.
In other words, they are profitable.
But profitable does not always mean necessary.
And it certainly does not always mean better.
We should not assume that bigger infrastructure is the only path forward. Just as communities are embracing distributed energy solutions, we can imagine a future where computing is more local, more efficient, and more democratic.
A future where technology serves people, instead of the other way around.
Once you see the parallel between data centers and power plants, the pattern becomes clear.
Gigantic infrastructure is not inevitable.
It is a choice.
And like all choices, it can be changed.