The AI Boom Has a Power Problem
by Claire Brady, EdD
In their recent Atlantic piece, “Here’s How the AI Crash Happens,” Matteo Wong and Charlie Warzel paint a vivid picture of the world’s newest boomtowns—places like New Carlisle, Indiana, where farmland is giving way to vast data centers that hum and glow with the energy demands of the AI race. The numbers are staggering: billions of dollars invested, more power consumed than entire cities, and one company—Nvidia—now worth more than Ford, General Motors, and Tesla combined.
The story reads like a modern-day gold rush. Only this time, the gold is data. And the pickaxes are GPUs (graphics processing units).
But beneath the spectacle lies a sobering question: What if the AI boom doesn’t yield the returns everyone expects? Wong and Warzel call this moment a “benevolent hostage situation”—an economy propped up by AI investments that may not be sustainable. It’s an uneasy echo of the dot-com bubble, only bigger, faster, and far more power-hungry.
For higher education leaders, this isn’t just a Wall Street story. It’s a mirror.
We, too, are navigating the tension between innovation and infrastructure, experimentation and ethics, ambition and sustainability. Many campuses are feeling the allure of “big AI”—flashy pilots, ambitious centers, bold announcements. But the lesson from The Atlantic’s analysis is clear: scale without substance is risky business.
So what can higher education learn from the potential “AI crash”?
1. Start with real value, not perceived urgency.
Just because everyone else is investing in AI doesn’t mean we need to chase the same kind of scale. The most meaningful advances on campuses right now aren’t coming from massive infrastructure projects—they’re coming from small, strategic integrations that improve advising, accessibility, or administrative efficiency.
2. Build energy literacy into AI literacy.
Data centers don’t run on magic; they run on megawatts. As institutions commit to sustainability goals, leaders should factor in the environmental costs of cloud-based AI tools, even at the classroom level. AI adoption isn’t just a digital decision—it’s an ecological one.
3. Follow the human ROI.
The real return on AI in education won’t be measured in dollars or data points, but in how it expands capacity for teaching, learning, and belonging. If an AI tool doesn’t help educators or students thrive, it doesn’t belong in the portfolio.
The Atlantic article closes with a chilling paradox: whether the AI boom collapses or succeeds spectacularly, disruption is inevitable. That may be true for Wall Street, but higher education has another path available—steady, intentional growth rooted in mission, not momentum.
Our job isn’t to build the next data center. It’s to build capacity—for discernment, for responsible leadership, and for human-centered innovation.
Because in higher ed, the real “AI crash” wouldn’t be financial—it would be forgetting why we started exploring this technology in the first place.
Read the full article here: https://www.theatlantic.com/technology/2025/10/data-centers-ai-crash/684765/