AI Won’t Fix Dysfunctional Systems—It Will Expose Them Faster

by Claire Brady, EdD

One of my favorite LinkedIn newsletters, “Experiments in Intelligence,” comes from Dave Birss, whose reflections on creativity and AI always get me thinking about how his lessons apply to higher education. In a recent issue, he described AI as a “turbo button” for whatever systems you already have in place. That image stuck with me, because in higher ed, our systems are often complex, bureaucratic, and—let’s be honest—a little dysfunctional. The temptation is to believe AI can smooth everything out. But the reality is this: if your processes are clunky, your structures siloed, or your incentives misaligned, AI will only magnify the mess.

A Mirror, Not a Cure

Generative AI excels at acceleration and multiplication. But it lacks judgment. If you feed a broken system into AI, you’ll just get the broken system faster. Imagine an advising office where students must navigate multiple intake forms, each owned by a different unit. AI can route those forms more quickly, but the fundamental problem—fragmented ownership—remains. The student still gets bounced around. The frustration simply arrives at lightning speed.

The Higher Ed Version of "Faster Dysfunction"

We don’t need to look far to see the risks. Consider these familiar examples:

  • Bloated approvals: new hires routed through five committees and three signatures. An AI tool might shorten the turnaround time on paperwork—but it won’t resolve why so many sign-offs exist in the first place.

  • Data silos: Admissions, financial aid, and student success systems all use different definitions of “enrolled.” AI analytics layered on top may churn out slick dashboards, but the outputs will still reflect fractured inputs.

  • Distorted incentives: If success is measured only by enrollment growth, AI-driven outreach may increase applications while worsening yield or straining support services.

In all these cases, AI doesn’t fix the dysfunction. It simply scales it.

What Leaders Can Do

Before adding AI to your strategy, pause and ask three questions:

  1. Where are our root dysfunctions? Audit the obstacles—bureaucracy, misaligned incentives, outdated processes—that frustrate staff and students.

  2. What can we streamline without AI? Simplify approvals, align definitions, and reduce unnecessary duplication first. This clears the runway for AI to be a true accelerator rather than a stress test.

  3. How will we measure success responsibly? Metrics must go beyond speed and volume. Faster processing that erodes trust or morale is not progress.

The Payoff of Getting It Right

When higher ed leaders commit to structural clarity, AI becomes a genuine force multiplier. Imagine:

  • A simplified advising model where AI supports personalized nudges instead of routing students through red tape.

  • A unified data infrastructure where predictive analytics actually reflect the student journey.

  • Incentives aligned with student success, so AI tools reinforce equity and belonging rather than enrollment at all costs.

AI should not be seen as a shortcut around difficult leadership work. It’s a catalyst—but only when paired with human-centered decisions, structural clarity, and intentional strategy.

The Bottom Line for Higher Ed Leaders

Dave Birss’s recent newsletter was a timely reminder: AI doesn’t relieve leaders of responsibility—it magnifies it. In higher ed, that means our real work is to strengthen our systems and cultures so that AI accelerates the good, not the broken. Get the human stuff right, and AI will fuel transformation.
