Pilot Smarter Not Harder

By Dr. Claire Brady

“85% of AI pilots fail”

That headline has made the rounds, and it’s designed to make you panic. But for higher education leaders, this statistic should inspire reflection more than fear. Because it’s not that the technology fails. It’s that our approach to it does.

In my keynote at the ACCT Leadership Congress, I reminded thousands of college trustees and presidents: “You’re not governing a technology adoption—you’re governing a mission-critical transformation that happens to be enabled by technology.”

AI isn’t a side project or an app you download; it’s a shift in how work gets done, decisions get made, and learning is supported. And like any institutional change, it lives or dies by leadership, culture, and clarity.

So why do so many AI pilots fail?

Let’s start with the patterns:

  • Expensive platforms very few people use after launch.

  • Tools that deliver recommendations no one is equipped to implement.

  • Pilots that never scale beyond early adopters.

  • Projects that amplify the college’s weaknesses instead of solving them.

The truth is, pilot success isn’t about launching and leaving—a pilot isn’t a finish line; it’s a learning lab. The goal isn’t perfection; it’s progress through iteration, insight, and forward momentum.

Most AI failures have less to do with the algorithms and more to do with human systems. We treat AI like a tech rollout instead of a behavioral and organizational change. Here’s what that looks like in practice—and how to do it better:

Start with behaviors, not platforms.

AI tools don’t magically make teams more efficient or student experiences more personal. Those outcomes come from shifting daily habits—how we write, plan, analyze data, and make decisions. Focus on small, repeatable changes that model the new behaviors you want to see.

Move at the right speed.

Going too fast can overwhelm teams; going too slow means you lose momentum. Leaders should set a realistic pace—fast enough to stay relevant, slow enough to be responsible.

Invest in data quality and integration.

Weak data systems are the silent killers of AI progress. Before launching another pilot, ask: Do we trust our data? If not, start there.

Make AI a strategic priority, not an experiment.

  • Treat AI literacy, ethics, and governance as leadership competencies.

  • Align projects with student success goals and institutional KPIs, not with curiosity alone.

We’re moving from using AI as a side tool to working in environments where AI is built into everything we touch. This shift requires courage, coordination, and culture—not just code.

When colleges embrace AI as an opportunity to learn, adapt, and strengthen their mission, the success rate skyrockets. Because in the end, the goal isn’t to adopt AI. It’s to build institutions that can adapt—with humanity, strategy, and purpose leading the way.
