The AI Confidence Gap Is Real—And Higher Ed Is Sitting Right in the Middle of It
by Claire L. Brady, EdD
A new report from Gallup, the Walton Family Foundation, and GSV Ventures—The AI Paradox: More Exposure, Less Confidence Among Gen Z—offers one of the clearest signals yet about how today’s students are really experiencing artificial intelligence.
Based on a national survey of more than 1,500 young people ages 14 to 29, the findings challenge a common assumption: that increased access to AI naturally leads to increased confidence, trust, and readiness.
It doesn’t.
Instead, the data reveals a more complex reality. Gen Z is using AI, but their optimism is declining. They recognize its importance, but question its impact. They know they’ll need it, but aren’t fully convinced it’s helping them learn, think, or prepare for the future.
This blog series breaks down the most important implications for higher education leaders—and, more importantly, what to do next. Because this moment isn’t just about adopting AI tools. It’s about how we lead through a shift that is already reshaping how students learn, work, and make meaning of their education.
Part 1/3
More exposure, less confidence.
That’s the headline from a new national survey of more than 1,500 young people ages 14–29, published by Gallup, the Walton Family Foundation, and GSV Ventures. And for higher education leaders, it should stop you cold.
Let that sink in for a moment.
Just over half of Gen Z reports using AI at least weekly, a number that has essentially plateaued over the past year. At the same time, excitement has dropped, hopefulness has declined, and negative emotions like anger and anxiety are rising.
This is not what most of us expected.
We assumed that more access would naturally lead to more comfort, more fluency, and ultimately more trust. But this data tells a different story: exposure alone is not building confidence.
And for higher education, that’s a big deal.
Because our students are not just learning with AI—they are forming beliefs about it. Beliefs that will shape how they engage, how they trust, and whether they see AI as a tool for growth or a threat to their development.
What’s Driving the Skepticism?
At the core, Gen Z is deeply concerned about what AI might be doing to their thinking. The report shows that many students believe AI may actually harm creativity and critical thinking, and a striking 80% say it’s at least somewhat likely that AI will make learning more difficult in the future.
That’s not resistance. That’s reflection.
They’re asking the exact questions we should want them to ask: Is this helping me learn—or just helping me finish?
And yet, here’s the tension: they also know they need AI. A growing majority believe AI skills will be necessary for college and beyond. So they’re caught in the middle—expected to use a tool they don’t fully trust. That’s the confidence gap. And higher ed is right in the middle of it.
What This Means for Higher Ed Leaders
If exposure isn’t enough, then strategy matters more than ever. This is where institutions need to shift from access to intentionality.
1. Move from “allowing AI” to teaching AI.
AI policies are proliferating fast in K-12 and higher ed, but policy is not pedagogy. Students need structured opportunities to practice using AI in ways that strengthen, not replace, their thinking.
2. Design for cognitive partnership, not cognitive outsourcing.
If students fear losing critical thinking skills, our assignments should explicitly require them to use AI and demonstrate their own reasoning alongside it.
3. Name the tension out loud.
Students are already aware of the tradeoffs. When institutions pretend AI is purely positive—or purely dangerous—we lose credibility. Trust grows when we acknowledge both.
4. Build confidence through guided use.
The report makes one thing clear: frequent users feel more positive than non-users. But even they are becoming more skeptical. That tells us frequency alone isn’t enough—quality of use matters.
This is the moment for higher ed to step into its role—not just as a provider of access, but as a designer of learning experiences. Because if we don’t help students build confidence with AI, they won’t build it on their own.
And if they don’t trust how to use it well, they won’t use it in ways that support their success.