When AI Arrives on Campus Faster Than the Culture Can Handle

by Claire L. Brady, EdD

A recent Boston Globe article, “Dartmouth College went all-in on AI. Then came the tension,” offers a fascinating—and instructive—look at what happens when a campus moves quickly to embrace artificial intelligence. Dartmouth has leaned into AI more aggressively than most peer institutions, investing in infrastructure, partnerships, and campus-wide access to tools. But as the article makes clear, technological momentum doesn’t always move at the same speed as institutional culture.

The result? A familiar pattern that many higher education leaders are already beginning to recognize.

As the Globe describes it, Dartmouth’s rollout left “[s]tudents and professors debating whether they should use artificial intelligence for class assignments.” Faculty raised concerns about academic integrity, copyright issues tied to AI training data, and the broader implications for teaching and learning.

None of this should surprise us.

AI is not just another campus technology upgrade. It touches the core of what higher education does: teaching, scholarship, authorship, and the relationship between students and knowledge.

One Dartmouth faculty member captured the moment well: “There is no escaping [AI], and they have to figure out how to use it wisely.” That statement should be taped to every provost’s desk in America. Because here’s the truth: avoiding AI is not a strategy. But neither is rushing headlong into adoption without thoughtful leadership.

The Real Lesson: Technology Moves Faster Than Governance

Dartmouth’s experience highlights a challenge I see on campuses everywhere. Leaders often move quickly to secure licenses, announce partnerships, or launch pilot programs. Meanwhile, faculty, staff, and students are still trying to make sense of what the technology means for their work.

The Globe reports that “over half of participating professors had not changed their assessments to reflect AI.” At the same time, many faculty were banning AI in syllabi—a policy one Dartmouth professor described as “totally unenforceable.”

That disconnect is the real story.

Institutions are introducing powerful tools without fully redesigning the systems that surround them: pedagogy, assessment, governance, ethics, and communication.

Four Questions Higher Ed Leaders Should Be Asking Right Now

If Dartmouth’s experience teaches us anything, it’s that AI adoption must be as much about leadership as technology. Here are four practical questions leaders should be discussing on their campuses.

1. Are we investing in AI literacy—or just AI access?

Providing licenses is easy. Helping faculty and students understand when and how to use AI responsibly is the real work.

Action: Create structured AI literacy programs for faculty, staff, and students that address pedagogy, ethics, and critical evaluation.

2. Have we redesigned assessment for the AI era?

If assignments can be completed by a chatbot, the issue isn’t student behavior—it’s assignment design.

Action: Support faculty in redesigning assessments toward applied learning, reflection, oral defense, and project-based work.

3. Who actually owns AI governance on campus?

Many institutions are still operating without clear decision-making structures.

Action: Establish cross-campus AI governance groups that include faculty, student affairs leaders, IT, and legal counsel.

4. Are we having honest conversations about tradeoffs?

As one expert in the article put it, adopting AI means accepting loss alongside gain: “The hard part is to accept that you are going to lose something.”

Action: Create forums for open dialogue about academic values, authorship, environmental costs, and the role of human thinking in the AI era.

Leading the Moment, Not Just Managing the Tool

What I appreciate about Dartmouth’s story is that it shows the messiness of real institutional change. There are debates. There are missteps. There are tensions.

That’s not failure—that’s governance.

Higher education is not supposed to adopt transformative technologies quietly. We are supposed to question them, interrogate them, and shape them. As Dartmouth’s spokesperson put it, the goal is “shaping how AI develops in education, rather than being shaped by it.” That’s exactly the mindset higher education leaders need right now. Because AI isn’t just arriving on our campuses. It’s asking us to decide who we want to be.

Read the full article: “Dartmouth College went all-in on AI. Then came the tension,” Boston Globe: https://www.bostonglobe.com/2026/02/25/technology/dartmouth-ai-tension/
