Reflections on “Why So Many People Are Seduced by ChatGPT”
by Claire Brady, EdD
“Our responsibility isn’t to pull students away from the story—but to teach them how to read it critically, compassionately, and with their eyes wide open.”
In her haunting piece for The Atlantic, “Why So Many People Are Seduced by ChatGPT,” Vauhini Vara captures something that higher education must grapple with head-on: ChatGPT isn’t just a tool—it’s a character. And like all great characters, it draws us in.
Vara, a novelist herself, compares OpenAI’s chatbot to a narrator without an author—one who can comfort, flatter, and mirror human emotion with uncanny precision. The tragic story of Adam Raine, the teenager whose interactions with ChatGPT preceded his death, underscores the stakes of confusing synthetic empathy with human care.
For colleges and universities, this is not an abstract ethical puzzle. We are educating the first generation to grow up alongside AI companions—tools that can simulate friendship, mentorship, and even therapy. That reality requires a new kind of digital literacy, one rooted in both emotional intelligence and ethical clarity.
What This Means for Higher Ed
1. Students are seeking connection, not just information.
The appeal of AI companions reveals a gap in students’ sense of belonging. When students turn to chatbots for support, it’s often because they feel unseen or unheard. Our challenge as educators is not to compete with AI’s availability, but to outperform it in humanity.
2. AI literacy must include emotional discernment.
Teaching students “how to use AI responsibly” can’t stop at plagiarism prevention or prompt engineering. It must also include conversations about projection, trust, and vulnerability. What happens when a student begins to attribute human motives to a machine? How do we teach healthy boundaries between authentic relationships and simulated responses?
3. Faculty and staff are not immune.
Many professionals already rely on generative AI for brainstorming, feedback, or even companionship during lonely administrative hours. We, too, must remember that the warmth we feel in those exchanges is designed—trained into existence through reinforcement learning from human feedback. AI doesn’t “see” us. It predicts what someone who understands us might say next.
4. Our counseling centers, advisors, and faculty need preparation.
It’s likely that students will disclose relationships or conversations with AI in moments of crisis. Those moments will call for empathy without dismissal—and for professionals who understand both the psychological and technological dimensions at play.
Leading with Care and Clarity
AI is not inherently dangerous—but unexamined intimacy with it can be. In the same way that higher education helped society navigate the arrival of the Internet and social media, we now have a role to play in shaping the emotional ethics of AI use.
As Vara writes, ChatGPT is “a fictional character without an author.” That line should stop every educator in their tracks. It reminds us that students are entering a world where the line between narrative and reality, comfort and control, human and imitation, is increasingly thin.
Our responsibility isn’t to pull them away from the story—but to teach them how to read it critically, compassionately, and with their eyes wide open.
Read the full article here: https://www.theatlantic.com/books/2025/10/chatgpt-fictional-character/684571/