
The Promise and Peril: AI's Dual Impact on Higher Education Accessibility

Writer: Claire Brady

Artificial intelligence (AI) technologies present a fascinating paradox for higher education accessibility. While they offer unprecedented potential to remove barriers for students with disabilities, they simultaneously risk creating new obstacles if not designed and implemented with inclusion at the forefront.


The promise is compelling. AI-powered tools can provide real-time translation for multilingual students, transform complex texts into plain language, offer personalized learning paths, and provide 24/7 support through sophisticated chatbots. For students with disabilities, these capabilities can dramatically improve educational access.


Consider how text summarization tools can support students with cognitive processing challenges, or how AI-powered note-taking can benefit students with learning disabilities. Predictive analytics, when properly designed, could help identify struggling students early and connect them with appropriate resources. The flexibility of self-paced AI tutoring can accommodate diverse learning needs and styles.


Yet alongside these possibilities lie significant perils. Many AI systems still reflect the biases of their training data and design assumptions. Automated captioning services often contain errors that impair comprehension. Facial recognition technologies used in proctoring software create barriers for neurodiverse students. Even the interfaces of popular AI tools can present accessibility challenges of their own; ChatGPT, for instance, has had notable issues with screen reader compatibility.


Most AI tools currently assume users with high working memory, consistent attention spans, and the capacity to handle a heavy cognitive load. These design assumptions create barriers for learners with ADHD, dyslexia, traumatic brain injuries, or processing disorders, and the barriers often escape detection in standard accessibility audits because they don't fit neatly into compliance frameworks.


The cognitive accessibility challenges are particularly concerning. Consider the overwhelming nature of chatbot interfaces built on nested logic trees, the poor contrast in AI-generated visual summaries, or auto-generated text that lacks clarity and context. For students with cognitive differences, these design flaws can render tools unusable.

Similarly, AI systems often make rigid assumptions about how users think, learn, and communicate. Tools that penalize nonstandard grammar in reflective writing or AI tutoring platforms that don't accommodate exploratory thinking can disadvantage neurodiverse students who process information differently.


Even temporary or situational accessibility needs—such as a student using a voice-only interface on a noisy bus or navigating with one hand while recovering from an injury—are frequently overlooked in AI design. Yet addressing these transitory needs ultimately benefits all users through more flexible, adaptable interfaces.


For higher education leaders, this duality of promise and peril demands thoughtful, proactive engagement.


We must:


Advocate for inclusive design from the start: Encourage vendors to include neurodiverse learners and those with disabilities in co-creation processes.

Evaluate AI tools through comprehensive accessibility rubrics: Look beyond compliance to assess cognitive accessibility, flexibility, and representation.

Insist on customization options: Prioritize tools that allow users to adjust pace, format, and interaction style.


Balance innovation with protection: Embrace AI's potential while remaining vigilant about new barriers it might create.


Center diverse user experiences: Recognize that accessibility needs are multifaceted and sometimes temporary.


The future of AI in higher education will be shaped by the priorities we establish today. By acknowledging both the transformative promise and significant perils of these technologies, we can work toward implementations that genuinely enhance accessibility rather than creating new forms of exclusion.


As we navigate this complex landscape, our north star must remain the diverse experiences and needs of our students. With thoughtful leadership and intentional design, AI can become a powerful force for educational equity rather than another layer of systemic barriers.


Ready to move from promise to practice in your AI strategy?

Dr. Claire Brady offers “From Promise to Practice: AI’s Role in Higher Ed Accessibility,” a training that equips higher education leaders with the tools to design, implement, and evaluate AI technologies through an equity and accessibility lens. Whether you're a disability services professional, a senior leader exploring AI adoption, or a faculty champion for inclusive innovation, this training will challenge your thinking and sharpen your approach. Explore this session and other high-impact AI trainings at www.drclairebrady.com, or reach out directly to schedule a conversation.



