AI Doesn't Have a Student Engagement Problem—It Has a Trust Problem
by Claire L. Brady, EdD
A new report from Gallup, the Walton Family Foundation, and GSV Ventures—The AI Paradox: More Exposure, Less Confidence Among Gen Z—offers one of the clearest signals yet about how today’s students are really experiencing artificial intelligence.
Based on a national survey of more than 1,500 young people ages 14 to 29, the findings challenge a common assumption: that increased access to AI naturally leads to increased confidence, trust, and readiness.
It doesn’t.
Instead, the data reveals a more complex reality. Gen Z is using AI, but their optimism is declining. They recognize its importance, but question its impact. They know they’ll need it, but aren’t fully convinced it’s helping them learn, think, or prepare for the future.
This blog series breaks down the most important implications for higher education leaders—and, more importantly, what to do next. Because this moment isn’t just about adopting AI tools. It’s about how we lead through a shift that is already reshaping how students learn, work, and make meaning of their education.
Part 2/3
Gen Z is already using AI. The real issue isn't adoption — it's trust.
New data from Gallup's Voices of Gen Z study makes this impossible to ignore. Nearly half of Gen Z workers believe the risks of AI outweigh its benefits. And 69% say they trust work completed by humans more than work that involves AI. That's not a perception gap. That's a fundamental trust deficit — and it's showing up before many of our students even fully enter the workforce.
We are preparing students for a future powered by tools they don't fully believe in. That's the leadership challenge. And it goes far beyond classroom policy.
The Workplace Signal We Can't Ignore
Gen Z's skepticism isn't abstract — it's grounded in how they see AI shaping their futures. They are entering a labor market where AI is positioned as both opportunity and threat, particularly for entry-level roles. And they are responding accordingly: cautiously, critically, and in many cases, skeptically.
This is a generation that has grown up alongside rapid technological change. They are not naïve about innovation. If anything, they are more discerning. And that discernment is sending us a clear signal: trust must be earned.
Where Higher Ed Comes In
Higher education has a unique — and urgent — opportunity here. We don't just prepare students to use tools. We shape how they understand them. If students leave our institutions seeing AI as something to fear, outsource, or blindly rely on, we've missed the moment.
What we need to build instead is what I call informed trust. Not blind adoption. Not blanket resistance. The ability to evaluate, use, and question AI with genuine confidence — to know when it's serving you and when it isn't. That's the outcome higher ed should be designing toward.
What Leaders Should Be Doing Now
Make AI use transparent across the institution. If students don't trust AI, they won't trust how institutions are using it either. Be explicit about where AI appears — in advising, communications, operations — and where human judgment remains central.
Reframe AI as augmentation, not replacement. Students are worried about losing skills and jobs. Messaging and practice need to consistently reinforce that AI supports human capability rather than substituting for it — and then actually demonstrate that in how you teach.
Integrate AI into career readiness conversations. If nearly half of students believe AI skills are necessary for their careers, then career services, internships, and employer partnerships need to address it directly — not as a footnote, but as a core competency.
Teach evaluation, not just execution. Knowing how to prompt isn't enough. Students need to know when to trust outputs, when to question them, and how to verify them. That's a teachable skill. Treat it like one.
The Bottom Line
This is not an adoption problem. It's a trust-building moment.
Higher education has a choice: treat AI as a tool to integrate, or meet it as a leadership challenge to navigate. The institutions that will lead in this next era aren't the ones that adopt AI the fastest. They're the ones that help students develop informed trust — the ability to use AI with confidence, question it with skill, and know the difference.