Why 85% of AI Projects Fail—And How Higher Ed Leaders Can Beat the Odds
By Claire Brady, EdD
As higher education institutions rush to explore the possibilities of AI—from predictive analytics to student success chatbots—one staggering statistic should stop us in our tracks: 85% of all AI models fail due to poor data quality. That sobering reality, highlighted in Jameel Francis’s recent Forbes article, should be a wake-up call for every higher ed leader navigating AI adoption.
“AI applications are only as good as the data they are trained on,” said Troy Demmer in his 2024 testimony to Congress. “Trustworthy AI requires trustworthy data inputs.” And that’s not just true in national security—it’s equally true when we’re trying to ensure students don’t fall through the cracks, optimize course offerings, or personalize support services.
Many institutions are excited about AI’s promise, but enthusiasm alone doesn’t build sustainable systems. As Francis explains, the real culprit behind AI’s failure isn’t the technology—it’s the data. Or more precisely, the lack of quality, complete, and usable data that AI models depend on.
Here’s where it hits home for higher ed: If your student information system (SIS), learning management system (LMS), and other platforms are siloed, outdated, or inconsistently used, your AI tools will underperform—or worse, perpetuate bias or misinformation.
Common pitfalls include:
Overfitting, where an AI model memorizes a narrow training set and fails to generalize to new students or terms
Data bias, resulting from incomplete or non-representative samples
Underfitting, where a model is too simple or too sparsely trained to capture real patterns
Data drift, where systems degrade because student needs and behaviors evolve after the model was trained
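To make the last pitfall concrete, here is a deliberately simple sketch of what drift monitoring means in practice: compare a behavior the model was trained on (say, weekly LMS logins) against the same behavior today. The numbers and the 25% threshold are hypothetical; a real system would use proper statistical tests, but the idea is the same.

```python
# Illustrative drift check: has a feature's average shifted since training?
# All values and the threshold are hypothetical, for illustration only.

def mean(values):
    return sum(values) / len(values)

def drift_detected(train_values, current_values, threshold=0.25):
    """Flag drift when the mean shifts by more than `threshold` (here 25%)."""
    baseline = mean(train_values)
    shift = abs(mean(current_values) - baseline) / baseline
    return shift > threshold

# Weekly LMS logins per student: at training time vs. this term
logins_at_training = [5, 6, 4, 5, 7, 6]
logins_this_term = [2, 3, 2, 4, 3, 2]  # behavior has clearly changed

print(drift_detected(logins_at_training, logins_this_term))  # True
```

When a check like this fires, the model's picture of "normal" student behavior no longer matches reality, and its predictions should be treated with caution until it is retrained.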
Essential strategies to future-proof AI investments
1. Data Integration
You can’t improve what you can’t see. With the rise of cloud platforms and digital transformation, colleges now sit on more data than ever. But unless we connect that data—across admissions, advising, retention, and beyond—it won’t yield meaningful insights. Integration methods like ETL (extract, transform, load) and middleware solutions enable institutions to unify data sources and prep them for AI analysis. As Francis notes, “Possessing quality data starts with having a complete picture of the information generated by your organization.”
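For readers who want to see what "extract, transform, load" looks like at its core, here is a minimal sketch. The records, field names, and system exports (SIS and LMS) are hypothetical stand-ins; real pipelines pull from databases or APIs and handle far messier data, but the three stages are the same.

```python
# Illustrative ETL sketch, not a production pipeline.
# Field names and sample records are hypothetical.

# Extract: pull records from two siloed systems (here, in-memory samples)
sis_records = [
    {"student_id": "1001", "name": "Ada", "major": "Biology"},
    {"student_id": "1002", "name": "Ben", "major": "History"},
]
lms_records = [
    {"STUDENT_ID": " 1001 ", "logins_last_week": 6},
    {"STUDENT_ID": "1002", "logins_last_week": 1},
]

# Transform: normalize inconsistent keys and formats so sources can be joined
lms_clean = {r["STUDENT_ID"].strip(): r["logins_last_week"] for r in lms_records}

# Load: merge into one unified view, ready for analysis
unified = [
    {**r, "logins_last_week": lms_clean.get(r["student_id"])}
    for r in sis_records
]

for row in unified:
    print(row)
```

The "transform" step is where most of the institutional work lives: agreeing on a single student identifier, consistent formats, and shared definitions across offices.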
2. Data Quality Management (DQM)
Beyond integration, we need an institutional culture that prioritizes data integrity. This includes data governance frameworks, clear accountability, compliance with standards (like FERPA, HIPAA, or GDPR), and ongoing quality monitoring. DQM isn’t just a tech upgrade—it’s a leadership strategy. If quality data isn’t seen as mission-critical, “issues such as incomplete and outdated data will become the norm.”
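What does "ongoing quality monitoring" look like in code? Here is one simple, hypothetical example: an automated check that flags records with missing required fields or update dates older than a year. The field names and the one-year threshold are assumptions for illustration; a governance team would define its own rules.

```python
# Illustrative data-quality check. Required fields and the staleness
# threshold are hypothetical choices a governance team would set.

from datetime import date

REQUIRED_FIELDS = ("student_id", "email", "advisor")
STALE_AFTER_DAYS = 365

def quality_issues(record, today):
    """Return a list of quality problems found in one student record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            issues.append(f"missing {field}")
    updated = record.get("last_updated")
    if updated and (today - updated).days > STALE_AFTER_DAYS:
        issues.append("outdated record")
    return issues

record = {"student_id": "1001", "email": "", "advisor": "Dr. Lee",
          "last_updated": date(2023, 1, 15)}
print(quality_issues(record, date(2025, 6, 1)))
# ['missing email', 'outdated record']
```

Running checks like this on a schedule, and assigning someone accountable for acting on the results, is the difference between DQM as a slogan and DQM as a practice.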
The bottom line?
Moving fast on AI without a solid data foundation is like building a plane mid-flight. And as tempting as it may be to race toward the next big innovation, the real win lies in getting your data house in order first.
Higher ed leaders don’t need to fear AI—but we do need to respect the groundwork it requires. When we treat data as an asset and steward it with intention, we not only reduce the risk of failure—we unlock the power of AI to better serve our students, our institutions, and our collective future.