Data centres now consume more electricity than some countries. Their hardware churn, hidden emissions, and economic centralisation make the “green cloud” a dangerous myth. Local-first computing offers a quieter, more resilient, and far more sustainable alternative.
The Real Environmental Cost of the Cloud
For years the cloud has been marketed as clean, efficient, and modern. Big providers love to show glossy photos of solar-powered data centres, wind turbines, and cheerful sustainability graphs that always point upward. But behind the PR lies a far less flattering reality: the environmental cost of the cloud is neither small nor abstract. It is vast, accelerating, and almost entirely invisible to the people who rely on these platforms every day.
We talk about the cloud as if it were immaterial, an airy metaphor drifting somewhere above us. The truth is far less poetic. The cloud is a planetary-scale network of concrete, steel, diesel generators, cooling systems, cables, transformers, batteries, and millions of servers consuming electricity every second they exist.
Cloud computing has a real footprint. It is long past time to acknowledge it.
The Energy Hunger of the Cloud
Global data centres are now among the most energy-intensive buildings humans create. The numbers speak for themselves:
- In 2024, data centres consumed around 415 TWh of electricity — roughly 1.5 % of all electricity used worldwide.
- Their growth rate has averaged 12 % per year for the past half-decade.
- By 2030, global data-centre electricity use is projected to climb to 945 TWh, close to 3 % of global electricity consumption.
- In the United States, data centres already account for ~4 % of national electricity use — potentially rising above 9 % by 2030.
- Cooling systems alone can consume billions of gallons of water annually in some regions.
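The shares quoted above can be cross-checked with simple arithmetic. The sketch below uses only the figures from this article; the implied global totals and the implied growth rate are derived from them, not independently sourced:

```python
# Back-of-envelope check of the figures above, using only numbers
# quoted in this article (all values are approximate).

dc_2024_twh = 415        # data-centre consumption in 2024
share_2024 = 0.015       # ~1.5% of global electricity
dc_2030_twh = 945        # projected 2030 consumption
share_2030 = 0.03        # ~3% of global electricity

# Global electricity consumption implied by each pair of figures:
global_2024 = dc_2024_twh / share_2024   # about 27,700 TWh
global_2030 = dc_2030_twh / share_2030   # about 31,500 TWh

# Compound annual growth rate implied by 415 -> 945 TWh over six years:
cagr = (dc_2030_twh / dc_2024_twh) ** (1 / 6) - 1

print(f"implied global electricity 2024: {global_2024:,.0f} TWh")
print(f"implied global electricity 2030: {global_2030:,.0f} TWh")
print(f"implied data-centre growth rate: {cagr:.1%} per year")
```

Notably, the growth rate implied by the 2024 and 2030 figures is roughly 14.7% per year, above the historic 12% average quoted above, which is consistent with the claim that demand is accelerating rather than flattening.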
These figures are not hypothetical; they are structural. And they expose two uncomfortable truths:
- Scale matters: Data centres already exceed the electricity demand of many industrial sectors.
- Growth outpaces efficiency: Efficiency gains cannot keep up with demand increases — especially with the rapid expansion of AI workloads.
A cloud region is not an abstraction; it is an industrial installation with the footprint to match.
The Hidden Emissions Cloud Providers Don’t Advertise
Even if a data centre ran entirely on renewable energy (few do), much of its real-world impact would still lie elsewhere: in the hardware.
A single server embodies:
- rare-earth metals
- lithium, copper, and aluminium
- silicon wafers
- plastics and composites
- hundreds of individual components sourced from global supply chains
The carbon footprint of manufacturing, shipping, refining, assembling, and later decommissioning that hardware often exceeds the emissions from operating it for months — sometimes years.
Yet cloud marketing focuses almost exclusively on operational emissions, not embodied ones. Providers account for electricity use, not for the environmental cost of building the machines, replacing them on fixed cycles, or disposing of them.
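The embodied-versus-operational comparison can be made concrete with a rough amortisation model. Every input below is an illustrative assumption (a mid-range server footprint and an average grid intensity), not a measured figure; real lifecycle numbers vary widely by hardware model and region:

```python
# Illustrative comparison of embodied vs operational emissions for a
# single server. All inputs are assumptions chosen for this sketch.

embodied_kgco2e = 1500      # assumed manufacturing + shipping footprint
power_kw = 0.3              # assumed average electrical draw
grid_kgco2e_per_kwh = 0.4   # assumed grid carbon intensity

hours_per_year = 24 * 365
operational_per_year = power_kw * hours_per_year * grid_kgco2e_per_kwh

# How long the server must run before its operating emissions match
# the emissions already "baked in" by manufacturing it:
breakeven_years = embodied_kgco2e / operational_per_year

print(f"operational emissions: {operational_per_year:,.0f} kgCO2e/year")
print(f"embodied break-even:   {breakeven_years:.1f} years")
```

Under these assumptions the embodied footprint equals roughly 1.4 years of continuous operation, and on a cleaner grid the break-even stretches even further out. That is why replacement cycles, not electricity bills, dominate the real footprint.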
The E-Waste Cloud Providers Would Prefer You Ignore
Cloud-scale hardware does not die when it wears out; it dies when the efficiency curve demands a replacement. Perfectly functional machines are pulled from racks, shredded, crushed, or “recycled” in processes that themselves consume energy and create pollution.
The logic is simple: Hyperscale platforms optimise for density, not longevity.
Local communities, small companies, researchers, and individuals could run much of this hardware for years. But the cloud economy is not designed for reuse; it is designed for throughput. One provider upgrading a region can generate scrap equivalent to the annual hardware consumption of a mid-sized country. This is not sustainable computing; it is industrial churn.
The Economic Footprint Nobody Mentions
Environmental impact and economic centralisation are tightly connected. Cloud infrastructure concentrates wealth and jobs in a handful of multinational corporations, removing both from local economies.
Every workload moved to the cloud means:
- fewer local technicians,
- fewer local infrastructure projects,
- fewer local IT roles,
- fewer local hardware purchases,
- and less money circulating within regional economies.
The cloud does not distribute opportunity; it aggregates it. The environmental footprint mirrors this economic gravity.
By contrast, local-first systems:
- employ local people,
- run on local energy (including your own solar),
- and strengthen the same communities that use them.
Cloud centralisation drains both the environment and local economies at the same time.
Why Decentralisation Is Climate Protection
Decentralisation is not just technically superior for autonomy; it is also environmentally superior for sustainability.
Local-first systems reduce:
- server demand,
- long-haul data movement,
- redundant compute cycles,
- cooling overhead,
- hardware churn,
- and cloud region scaling pressure.
Running a workload on a single well-utilised local server is often more energy-efficient than running the exact same workload inside a hyperscale facility with:
- triple redundancy,
- cooling losses,
- backup power infrastructure,
- idle overhead,
- and routing inefficiencies.
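The overheads listed above compound multiplicatively, which a small sketch makes visible. Every number here is an assumption chosen for illustration; real PUE (power usage effectiveness, the ratio of total facility energy to IT energy), redundancy, and utilisation figures vary by facility:

```python
# Sketch of how facility overheads inflate the energy cost of the
# same workload. All parameter values are illustrative assumptions.

workload_kwh = 100          # energy the computation itself needs

def delivered_energy(pue, redundancy_factor=1.0, idle_fraction=0.0):
    """Total energy drawn to deliver the workload.

    pue:               cooling/power-distribution multiplier on IT energy
    redundancy_factor: extra replicas kept running (1.0 = none)
    idle_fraction:     share of capacity burned while idle
    """
    it_energy = workload_kwh * redundancy_factor / (1 - idle_fraction)
    return it_energy * pue

# A well-utilised local server: modest cooling, no replicas.
local = delivered_energy(pue=1.2)

# A hyperscale placement: better PUE, but replication and idle overhead.
hyperscale = delivered_energy(pue=1.1, redundancy_factor=3.0,
                              idle_fraction=0.3)

print(f"local:      {local:.0f} kWh")
print(f"hyperscale: {hyperscale:.0f} kWh")
```

The point is not the exact values: even granting hyperscale facilities a better PUE, redundancy and idle overhead multiply on top of it, so the same computation can draw several times the energy it would on a single well-utilised machine.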
Every local computation is one less cloud computation. And every cloud computation avoided is measurable environmental relief.
Cloud Fragility and the Illusion of Clean Reliability
Cloud services are regularly sold as “sustainable by design” — yet outages show the fragility of that narrative. When a region goes offline, it affects millions of users, entire countries, and critical infrastructure. A single configuration error can ripple across continents.
Local systems fail differently. They fail quietly, locally, without cascading across the world.
From a planetary perspective, dependency itself is the environmental problem: the failure of one cloud region forces millions of devices to reconnect, retry, reprocess, reload, and re-sync. That is energy wasted at planetary scale. Local-first computing avoids that by avoiding dependency altogether.
The Future Does Not Need More Cloud — It Needs More Autonomy
The cloud will not get greener through marketing slogans or renewable certificates. Its environmental cost is structural, not superficial.
Local-first computing will not save the planet by itself, but it directly reduces:
- energy consumption,
- hardware waste,
- embodied emissions,
- cooling demand,
- and the economic incentives that drive hyperscale growth.
It is not an aesthetic preference; it is a materially different mode of computing. Your own infrastructure demonstrates what is possible when computation is brought back home, powered by your own local energy, running on hardware you understand and control.
Why This Text
My conviction about local-first computing comes from observing a much larger pattern: the steady erosion of technical expertise as organisations push their infrastructure into the cloud. Every year, more companies retire their own systems, disband their internal teams, and replace engineering roles with vendor contracts. The result is a slow but unmistakable loss of knowledge — not just in individual companies, but across the entire industry.
This shift has human consequences. Work that once required deep understanding is increasingly replaced by centrally managed, highly standardised processes. The craft of engineering is quietly transformed into the administration of someone else’s platform. Entire professions are flattened, narrowed, and de-skilled, all in the name of efficiency.
Local-first systems resist this trend. They keep people close to the technology they depend on. They preserve the ability to build, troubleshoot, and understand one’s own infrastructure. They maintain a space where expertise still matters and where autonomy is not treated as an outdated concept.
This text, and the tools I continue to create, are grounded in that belief: that computing should strengthen our capabilities, not diminish them — and that progress does not need to come at the expense of our environment, our autonomy, or the knowledge that once defined our field.