Making Higher Ed’s Relationship with AI Last
by Claire Brady, EdD
“The real work begins once the novelty fades and we're left to decide: What kind of relationship do we want with AI?”
Something has shifted on campus this year. What used to be tentative curiosity about AI has become full-on enthusiasm. New committees, pilot projects, and playgrounds of possibility are everywhere. As an AI consultant and coach, I’m not mad about it—in fact, I LOVE seeing the energy, the creativity, and the willingness to explore. But I also recognize what’s happening: New Relationship Energy is in full effect.
You know the feeling. It’s that spark at the start of something new—exciting, full of promise, and brimming with potential. Everyone’s talking about what’s possible, testing new tools, and imagining how everything might change. But just like in any new relationship, the thrill of discovery can make it easy to overlook complexity, limits, and the work it takes to build something lasting.
This is New Relationship Energy—and higher ed is deep in it.
New Relationship Energy—or NRE—is that unmistakable spark that ignites when something (or someone) new enters your life. It's thrilling, energizing, and full of possibility. In the early days, everything feels easy. You overlook flaws, idealize potential, and feel certain that this—finally—is the thing that will change everything.
Sound familiar?
Across campuses, AI has captured that same kind of electric attention. Leaders are experimenting with tools, faculty are testing classroom applications, and staff are finding new efficiencies in advising, communications, and operations. The enthusiasm is real—and well deserved. But as anyone who's ever navigated the difference between chemistry and compatibility knows, spark alone doesn't sustain a relationship.
NRE is a spark, not an engine.
The real work begins once the novelty fades and we're left to decide: What kind of relationship do we actually want with AI?
For higher education, this means moving beyond the "wow" phase and into the "how." It means shifting from experimentation to intentional integration—grounded in our mission, values, and people. Like any lasting partnership, the goal isn't to chase the rush, but to build something resilient, ethical, and mutually beneficial.
Building a Relationship That Lasts
None of this is easy. You'll face budget constraints, competing priorities, and faculty who range from AI evangelists to absolute skeptics. But here's what intentional partnership might look like:
1. Move from curiosity to clarity. In the early stages, curiosity fuels exploration. But clarity sustains growth. Colleges and universities need shared language and frameworks for what responsible AI adoption looks like on their campuses. This doesn't mean every department defines "responsible use" differently—it means creating a campus-wide AI literacy framework that everyone can reference and build upon. It's not about dampening curiosity—it's about channeling it toward coherent strategy and aligned purpose.
2. Define your relationship goals. Every healthy partnership has boundaries and intentions. Are you courting AI for efficiency? For access? For equity? For innovation? The answer will shape not only which tools you choose but how you measure success and ensure accountability. A community college focused on access might prioritize AI tutoring tools that provide 24/7 support. A research university centered on innovation might invest in AI that accelerates discovery. Know what you're building toward, and let that guide your choices.
3. Prioritize trust over speed. NRE can tempt us to move fast—sign contracts, launch pilots, deploy bots. But sustainable AI integration requires trust: trust among staff, trust from students, and trust in the systems themselves. Building that trust takes transparency, ethical guardrails, and space for reflection when things don't go as planned. When one university deployed an AI chatbot for student services without faculty consultation, adoption cratered within weeks. The tool wasn't the problem—the process was.
4. Keep the human connection at the center. At its best, AI should free us to connect, not free us from connection. The ultimate test of this relationship won't be how advanced the technology becomes, but whether it helps educators, staff, and students feel more seen, supported, and successful. If your AI writing assistant gives faculty more time for meaningful student conversations, it's working. If it replaces those conversations, something's gone wrong.
5. Set realistic expectations (and don’t be afraid to date around). No single AI tool will solve all your problems—or transform your institution overnight. Expecting perfection from the first platform or partnership is like assuming the first date will end in a long-term commitment. Explore, test, compare, and learn before deciding what works best for your unique context. Healthy boundaries and realistic expectations make for happier, longer-lasting relationships—with people and technology.
As the NRE fades—and it will—the question for higher education isn't "What's next?" but "What endures?" Can we turn initial attraction into enduring partnership? Can we build systems of care, innovation, and integrity that last longer than the hype cycle?
If we can, then maybe this won't just be another fleeting fling with the next shiny tool. It will be the start of something deeper: a relationship built not on infatuation, but on intention.
So which kind of institution will yours be—the one still chasing the next AI trend in 2026, or the one that built something that actually lasts?