How Can We Reduce the Environmental Impact of AI While Advancing Innovation?
The other day, we were flying from San Francisco to Washington, DC. As the plane began its descent, we looked out the window. What we saw was striking: not just roads and rooftops, but clusters of massive buildings stretching across the landscape. Data centers. Dozens of them, with more under construction.
We couldn't stop thinking about it. These buildings aren't just tech infrastructure; they're energy ecosystems. Each one draws enormous amounts of power, water, and materials. And with the AI boom, more of them are coming.
That view from the sky sparked a question we’ve been thinking about ever since: How can we reduce the environmental impact of AI while still advancing innovation?
Why This Matters
AI relies on physical systems that require electricity, water, metals, and more. And as AI adoption continues to scale, its environmental footprint is growing just as fast. That means if we do not act now, we risk building a digital future that’s unsustainable by design.
But here’s the good news: every stage of the AI lifecycle offers a chance to reduce harm and rethink the status quo.
The “How”: Four Key Leverage Points
- Build Smarter Infrastructure
Decisions made early on, like where to place a data center or how it's cooled, can have long-term environmental consequences. AEC (Architecture, Engineering, and Construction) firms can go beyond regulatory compliance and offer full lifecycle environmental assessments to identify optimal sites and systems. Simulating energy use in the design phase, optimizing layouts for airflow, and monitoring HVAC performance during operations all contribute to lower emissions and smarter infrastructure.
- Design Greener Hardware
Chips, servers, and storage systems that power AI are made with energy- and resource-intensive processes. There's an urgent need for innovation in sourcing, modular design, and reuse. Reducing material waste, increasing recyclability, and lengthening hardware lifespans can make a big difference at scale.
- Train and Deploy AI More Efficiently
Tech companies have the opportunity to shift high-energy workloads to align with renewable availability (e.g., using a "Green Schedule"). They can also reduce unnecessary training cycles by breaking down large problems into smaller ones, deploying lighter models, and optimizing experiments to be more energy-conscious from the start.
- Use AI More Thoughtfully
Finally, usage matters. Not every task needs a massive model. Just as we don’t run our dishwasher for a single fork, we should ask: where does AI create value, and when is it worth the energy it consumes? Using AI tools more intentionally is one of the easiest and most overlooked steps we can take.
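To make the "Green Schedule" idea concrete, here is a minimal sketch of carbon-aware scheduling: given an hourly forecast of grid carbon intensity, pick the start hour whose window has the lowest average intensity. Everything here is illustrative; the forecast values, job name, and function names are our own assumptions, not a real scheduler or API.

```python
from dataclasses import dataclass


@dataclass
class TrainingJob:
    name: str
    hours_needed: int


def greenest_start(forecast: list[float], hours_needed: int) -> int:
    """Return the start hour whose window of `hours_needed` hours has the
    lowest average grid carbon intensity (gCO2/kWh)."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast) - hours_needed + 1):
        avg = sum(forecast[start:start + hours_needed]) / hours_needed
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start


# Hypothetical 24-hour forecast: intensity dips around midday as solar peaks.
forecast = [450, 440, 430, 420, 400, 380, 350, 300,
            250, 200, 160, 140, 130, 140, 170, 220,
            280, 340, 400, 430, 450, 460, 460, 455]

job = TrainingJob(name="fine-tune-small-model", hours_needed=4)
start = greenest_start(forecast, job.hours_needed)
print(f"Schedule '{job.name}' at hour {start}")  # hour 10, the cheapest 4-hour window
```

In practice, a forecast like this would come from a grid-data provider rather than a hard-coded list, and a production scheduler would also weigh deadlines and cluster utilization; the point is simply that shifting a flexible workload by a few hours can meaningfully cut its emissions.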
Key Takeaways
- Infrastructure decisions, hardware design, and development practices all matter
- There are major opportunities for AEC firms, manufacturers, and tech providers to lead change
- End-users play a role too by using AI more deliberately and efficiently
Reducing AI’s footprint is not about saying “no” to innovation. It is about saying “yes” to smarter, more sustainable choices at every level.
This article is just a glimpse of what’s in our full AI Sustainability White Paper.