Scott Galloway Just Exposed Higher Ed’s Greatest Pressure Points
by Claire L. Brady
Full disclosure: I have a bit of a love/eye-roll relationship with Scott Galloway.
On any given episode of the Pivot podcast, he’ll say something insightful, something provocative, and something that makes me wonder why he went there at all. So when I read his recent interview in The Chronicle of Higher Education, “Scott Galloway, Higher Education’s Useful Irritant,” I had the same reaction.
It made me laugh. It made me bristle. And it made me think.
Because here’s the thing:
Scott Galloway might be wrong about a lot of things in higher ed—but he’s not wrong about the pressure points. And those pressure points are exactly where AI is about to force change.
Galloway takes aim at inefficiency, bloated structures, rising costs, and misaligned incentives. He questions whether institutions are truly educating students or primarily certifying them. He challenges the tension between access and exclusivity. And, in his own uniquely blunt way, he calls out a system that often resists accountability while protecting prestige.
You don’t have to agree with his tone—or even his conclusions—to recognize why this resonates right now.
Because higher education leaders feel it too.
We see it in the friction of everyday operations.
We see it in the growing expectations from students and families.
We see it in the quiet (and sometimes not-so-quiet) questions about value, outcomes, and relevance.
And now, we’re seeing it through the lens of artificial intelligence.
AI is not creating these pressure points. It’s exposing them.
Take efficiency. When a tool can draft communications in seconds, summarize complex data instantly, or streamline workflows that once took hours, the question shifts quickly. It’s no longer “Can we adopt AI?” It’s “Why are we still doing this the hard way?”
Take the student experience. If we’re honest, parts of the system still function more like sorting mechanisms than support systems. AI has the potential to change that—enabling more personalized advising, more responsive services, and more scalable support. Not replacing people, but allowing us to show up where it matters most.
Take trust. Galloway’s critique hints at something deeper: skepticism about whether institutions are acting in students’ best interests. AI raises the stakes. Implemented poorly, it accelerates distrust. Led well, it becomes an opportunity to rebuild it—through transparency, ethics, and intentional design.
And then there’s culture—the throughline in all of this. The biggest barriers in higher education are not technical. They’re human. They’re embedded in how we make decisions, how we define value, and how willing we are to evolve.
AI doesn’t solve that. It surfaces it.
So what do we do with this moment?
If these are the pressure points, then this is where leaders should focus:
1. Start with friction, not tools.
Before launching another AI initiative, ask: where are we losing time, clarity, or momentum? Target the everyday inefficiencies your teams already feel. That’s where AI will have the most immediate and credible impact.
2. Redesign for support, not just scale.
Don’t use AI to do more of the same faster. Use it to rethink how students experience your institution—especially in advising, communication, and access to services. If it doesn’t improve the student experience, it’s not the right use case.
3. Make trust a design principle.
Be explicit about how AI is being used, where human oversight exists, and what guardrails are in place. Transparency isn’t a communication strategy—it’s a leadership strategy.
4. Build capacity through doing, not just training.
Workshops are helpful. But confidence comes from application. Create small, bounded projects where staff and faculty can experiment, learn, and see impact in real time.
5. Align AI work to institutional priorities.
If your AI efforts live on the margins, they won’t stick. Connect them directly to your strategic plan—student success, retention, operational excellence. Make the value visible.
6. Pay attention to the culture signals.
Resistance is data. Skepticism is insight. Instead of pushing past it, engage it. The questions people are asking about AI are often the same questions they’ve been asking about the institution itself.
We don’t need to adopt Galloway’s worldview to learn from his critique. But we would be wise to pay attention to where it lands.
Because those friction points aren’t abstract.
They’re operational.
They’re cultural.
And increasingly, they’re visible to everyone.
AI isn’t the disruption. It’s the mirror. And what it’s reflecting back to us is where the real work begins.
Read the full Chronicle article here: https://www.chronicle.com/article/scott-galloway-higher-educations-useful-irritant?
Note: This image was created using ChatGPT.