EARNED – Thought Leadership (PR 535)
AI’s Energy Problem Is Getting Worse.
I’m Worried We’re Running Out of Time.
By Sundar Pichai, Chief Executive Officer, Alphabet Inc.
I want to be honest about something that worries me. Every week, new AI data centers are announced. Every month, the energy demands of training and running large AI models grow. And every quarter, I see the same press releases from companies across this industry describing sustainability goals that are years away, measured in terms that are conveniently difficult to verify. I find that deeply concerning — not as a competitive observation, but as someone who has watched what it actually takes to close the gap between a commitment and a result.
At Alphabet, we have spent years doing the work that most of this industry is still describing in press releases. In 2024, we signed 8 gigawatts of clean energy agreements — the largest annual total in our company’s history. We have replenished 4.5 billion gallons of water in the communities where our data centers operate. Despite a 27 percent increase in our electricity consumption as AI demand grew, we reduced our data center energy emissions by 12 percent. Our AI chip architecture now delivers 30 times the performance per watt of our systems from five years ago. Across five Alphabet products, we enabled 26 million tonnes of CO2e reductions globally. We became the first corporation in the world to sign a power purchase agreement for a small modular nuclear reactor — because we believe serious environmental commitments require serious infrastructure investment, not aspirational timelines.
I am not sharing this to suggest we have solved the problem. I am sharing it because achieving these outcomes took years of sustained investment, difficult infrastructure decisions and a willingness to be measured against specific operational targets rather than aspirational language. And I am genuinely worried that most of the industry has not yet made those decisions — or made peace with the accountability that comes with them.
What concerns me most is the pace of AI expansion relative to the pace of environmental accountability. The AI industry is growing faster than its environmental conscience. When I look at the commitments being made industry-wide — and at the gap between those commitments and operational reality — I see an industry that is outsourcing its accountability to the future. That worries me, because the future has a way of arriving faster than anyone plans for.
The environmental cost of AI is not hypothetical. Data centers require enormous quantities of electricity and cooling. Water consumption is significant and local — it affects specific communities, not just aggregate statistics. Infrastructure expansion is accelerating faster than clean energy supply can match it. These are physical realities, not projections. And while I genuinely believe AI can help address climate challenges — we have seen this in DeepMind’s climate modeling work and energy grid optimization — that potential is undermined when we build AI systems in ways that make the underlying environmental problem worse. The technology cannot be both the cause and the cure if we refuse to take the cause seriously.
Transparency plays a central role in this. I am concerned by how rarely sustainability commitments in this industry are tied to independently verifiable, grid-level data. Publishing a carbon neutrality goal without disclosing current operational baselines is not transparency. It is a placeholder. The companies building AI at the largest scale have an obligation to do better — not because regulators will eventually require it, but because the credibility of this entire industry depends on it.
What I am calling for is not impossible — Alphabet has already demonstrated that it is not impossible. We did not wait for clean energy to become universally convenient before investing in it. We did not wait for nuclear infrastructure to mature before becoming the first company to sign an agreement to power our operations with a small modular reactor. We did not wait for regulation before publishing grid-level emissions data. The path exists. The decisions are available. What I am no longer certain of is whether the urgency is felt across this industry — and that uncertainty is what keeps me up at night.
Strategy Note:
This thought-leadership piece adopts a candid, concerned tone, positioning Sundar Pichai as a CEO genuinely troubled by the widening gap between AI expansion and environmental accountability across the industry. The tone is personal rather than corporate, which is appropriate for a major editorial publication such as The New York Times or Fast Company. By establishing Alphabet’s verified 2024 track record first — 8 gigawatts of clean energy signed in a single year, a 12 percent reduction in data center energy emissions despite rising electricity demand, and the world’s first corporate small modular nuclear reactor agreement — the piece uses Alphabet’s demonstrated achievements as a credible benchmark from which to express concern about industry-wide inaction. This structure makes the concern authoritative rather than alarmist, and avoids the trap of corporate aspirational messaging: every claim in the piece reflects something Alphabet has already achieved, not something it intends to do.
The closing paragraph reinforces this distinction by shifting from a list of industry recommendations to a direct account of decisions Alphabet has already made — framing accountability not as a future goal but as an existing standard the rest of the industry has yet to meet. The headline creates immediate tension and draws the reader in with a sense of urgency. The CEO’s voice remains human and honest throughout, reinforcing Alphabet’s ESG positioning while issuing a direct challenge to an industry that has not yet matched a standard Alphabet has already demonstrated is operationally achievable.