NASPA Reflections: Care or Control? AI, Surveillance, and Student Trust

by Claire L. Brady, EdD

At NASPA 2026, I had the privilege of serving as the conference’s “AI Executive in Residence”, facilitating conversations with student affairs leaders about one of the most pressing leadership questions facing higher education today: how we guide the adoption of artificial intelligence on our campuses.

Across presentations, hallway conversations, and VPSA consultations, one theme surfaced again and again. The challenge is not simply understanding the technology. The real challenge is leading institutions through the cultural, ethical, and strategic decisions that AI demands.

Over the next few posts, I’m sharing reflections inspired by several of the mini-keynotes I delivered during the conference. Each explores a different leadership question emerging from this moment:

  • Are we chasing AI or leading it?

  • How do institutions adopt AI without compromising their values?

  • What questions should guide responsible AI decisions?

  • Where is the line between student support and surveillance?

My hope is that these reflections help student affairs leaders move from reacting to the technology toward shaping how it serves our mission and our students.

Care or Control? AI, Surveillance, and Student Trust

Imagine reading the following headline:

“College surveilling students without their knowledge.”

It’s the kind of story that makes national news quickly—and raises immediate questions about trust, privacy, and the role of technology in higher education.

During a recent session with student affairs leaders, I shared a scenario designed to spark conversation about exactly these tensions. The situation begins with good intentions.

A college decides to implement a behavioral monitoring system in its residence halls. The goal is to identify students who may be struggling before a crisis occurs. The system analyzes door access data, flags unusual patterns, and alerts staff to sudden changes in behavior. The idea is simple: if someone who typically attends classes and social events suddenly isolates themselves, the institution could intervene earlier and offer support.

At first glance, the initiative appears compassionate.

But within a semester, the unintended consequences begin to emerge.

Students report feeling watched. Resident assistants notice that students who used to drop by informally are now more hesitant to engage. Some students begin hiding struggles because they worry about how their data might be interpreted or shared. Ironically, help-seeking behavior begins to decline. The system designed to identify students in distress has instead pushed some of them further underground.

This scenario highlights one of the most important leadership questions emerging in the age of AI: where is the line between care and control?

Technology now allows institutions to collect and analyze enormous amounts of behavioral data. Used thoughtfully, that data can improve services, identify patterns, and help institutions respond more effectively to student needs. But the same systems can also create environments where students feel monitored rather than supported.

Trust is fragile. Once students believe they are being surveilled rather than cared for, the relational foundation of student affairs work begins to erode.

Responsible leadership requires asking not only Can we do this? but also Should we?

Consent becomes a central issue. Students should understand what data is being collected, how it will be used, and what choices they have about participation. Meaningful consent cannot be buried inside lengthy terms of service or generic announcements. It must be clear, transparent, and ongoing.

Human judgment must also remain central. Technology can surface information, but it should never replace the relational insight that comes from conversation, mentorship, and professional experience. AI should support the work of student affairs professionals—not substitute for it.

One helpful test for leaders considering AI-enabled monitoring systems is what I call the front-page test: if your institution’s use of this technology appeared on the front page of a newspaper tomorrow, could you confidently explain why it aligns with your values and mission?

If the answer is uncertain, it may be time to slow down.

Artificial intelligence will create powerful new tools for supporting students. But if those tools undermine trust, they will ultimately weaken the very relationships they were meant to strengthen.

The challenge for student affairs leaders is not simply deciding what technology we can deploy. It is ensuring that, even in a data-driven world, the humanity at the center of our work remains intact.
