Rethinking Higher Education in the Age of AI
Presented by Dr. J.D. Mosley-Matchett
In our first Friday SLO Talk of Spring 2026, Dr. J.D. Mosley-Matchett challenged us to move beyond reactionary thinking about artificial intelligence in higher education. Instead of asking whether AI is “good” or “bad,” she framed the moment through three powerful metaphors:
- Buggy Whips
- Rocket Ships
- Total Eclipse
Each represents a different narrative shaping how institutions respond to AI.
🎥 Watch the Full Session
🎙 Listen to the Podcast Version
📊 Download the PowerPoint
1. The “Buggy Whip” Narrative
Is higher education becoming obsolete?
The buggy whip metaphor reflects a fear that traditional higher education is outdated—an artifact of another era. Dr. Mosley-Matchett noted that institutions often cling to familiar models because familiarity feels safe. The “sage on the stage” model, the lecture-centered classroom, and tightly controlled assessment practices represent long-standing norms.
But here is the twist: modern buggy whips still exist. They have evolved. They are redesigned for new purposes.
The implication is not that institutions disappear. It is that they must adapt their form while preserving their function.
AI disrupts the traditional knowledge gatekeeping model. When information is available instantly, the role of faculty cannot remain limited to transmission. The question becomes:
What value does higher education provide when knowledge is no longer scarce?
The session highlighted that institutions must rethink purpose rather than defend tradition.
2. The “Rocket Ship” Narrative
Are degrees still the fastest path to opportunity?
The second narrative frames college as a launchpad for economic mobility. Families invest heavily in degrees with the expectation of higher pay and better careers.
However, Dr. Mosley-Matchett raised critical tensions:
- Rising tuition costs
- Grade inflation
- Over-reliance on standardized and multiple-choice assessments
- Employers questioning whether degrees reliably signal competency
If degrees are to remain credible signals of capability, institutions must ensure that learning represents demonstrable skill and performance, not merely course completion.
She emphasized that AI does not eliminate the need for degrees. Instead, it raises expectations. Graduates must leave equipped to operate in AI-enabled workplaces.
A powerful point emerged:
Employers rarely prohibit AI use. Yet classrooms often do.
If higher education isolates students from the tools they will use professionally, it risks undermining its own relevance.
3. The “Total Eclipse” Fear
Will AI replace thinking itself?
The most extreme narrative predicts institutional collapse. If AI can generate explanations, essays, and analyses, what remains uniquely human?
Dr. Mosley-Matchett pushed back against the idea that AI is a “boogeyman.” It is a tool. Like previous technological shifts, it creates disruption—but not extinction.
She identified several deeper issues:
The Cone of Shame
Many faculty hesitate to use AI openly. Some perceive AI collaboration as intellectual weakness. This stigma inhibits responsible integration.
Mixed Messages to Students
When institutions label AI as cheating while students use it daily in real life, confusion results. Students receive contradictory signals about professional expectations.
Training Gaps
Institutions are effective at drafting policies but less effective at providing structured faculty development. Without training, fear fills the vacuum.
Structural Vulnerabilities Exposed
The discussion did not shy away from hard truths. Among the vulnerabilities identified:
- Grade inflation diluting degree meaning
- Homogenized assessment practices limiting differentiation
- Attempts to be “all things to all people”
- Misalignment between institutional identity and workforce needs
- Legislative interference driven by perception rather than evidence
One particularly important theme was differentiation. Community colleges, research universities, and workforce-focused institutions serve different purposes. When institutions blur identity in pursuit of prestige or expansion, they risk strategic drift.
Clarity of mission matters even more in times of disruption.
Curiosity as a Leadership Imperative
One of the most resonant themes was curiosity.
The faculty who will thrive are not necessarily those with technical mastery—but those who remain curious and courageous. Avoidance is not neutrality; it is abdication.
Higher education cannot ask students to be lifelong learners while resisting learning itself.
Rethinking Authorship and Assessment
The session also touched on a foundational issue: authorship.
If AI becomes a collaborative cognitive tool, institutions must rethink:
- What constitutes original work?
- How do we define appropriate collaboration?
- What evidence demonstrates learning?
The conversation suggested that assessment design, not surveillance, will determine institutional credibility moving forward.
What Remains Distinctly Human?
AI can automate certain forms of pattern recognition, drafting, and summarization. But human performance remains distinct in:
- Ethical judgment
- Contextual decision-making
- Interpersonal negotiation
- Adaptive problem-solving in ambiguous environments
- Accountability for consequences
The question for institutions is not whether AI replaces thinking. The question is:
Which forms of thinking and performance should we be cultivating?
Closing Reflection
The future of higher education is unlikely to be a total eclipse. It is also unlikely to be a simple continuation of the past.
We are somewhere between buggy whip nostalgia and rocket ship acceleration.
The institutions that will endure are those that:
- Clarify their mission
- Invest in faculty development
- Redesign assessment for authentic performance
- Integrate AI responsibly
- Replace fear with informed experimentation
AI does not determine the future of higher education. Institutional choices do.
