Is AI Driving Your Institution's AI Strategy?
Every school has an AI strategy — the only question is who’s really in control: people or AI.
TL;DR
Every school already has an AI strategy — whether it realizes it or not. The only question is who’s writing it: the people inside the institution, or the algorithms their students already use. When a school chooses not to act, it’s not neutral; it’s letting default settings, commercial incentives, and student behaviors decide its path. The invisible hand of AI is already shaping how students learn, how teachers assess, and what content dominates attention. When schools fail to set their own vision, the system quietly sets one for them — through recommendation engines, productivity apps, and AI tutors that define what “good learning” looks like. The challenge now is not whether schools should have an AI strategy, but whether they will shape it intentionally or inherit one designed elsewhere. Perhaps by AI.
__
I’m not an education skeptic. Far from it.
But I believe we’re repeating the same mistakes in the world of schooling and learning that tech already made — and unless we wake up, we risk losing control of not just AI, but of education itself.
We’re already in the storm — whether we see it or not
We like to think schools are neutral spaces — that education can remain untouched by the rapid rise of AI if we simply resist it.
But that illusion is gone. Students already use AI for homework, projects, and assessments — often without their teachers knowing. They rely on it to explore personal interests, shifting learning beyond the boundaries of the school walls. AI is now woven into learning, quietly reshaping it from within.
Each time a student opens ChatGPT, an AI tutor, or an automated feedback tool, the foundations of teaching and learning shift:
Authority: Teachers are no longer the sole gatekeepers of knowledge.
Assessment: Essays and homework lose meaning when AI can generate them.
Equity: Access to AI becomes a new form of advantage.
Learning Process: Memorization gives way to prompting, editing, and evaluating AI output.
Content: Students can already learn much of what they want through AI — imperfectly, but independently.
This isn’t about machines taking over classrooms; it’s about the invisible reprogramming of educational norms every time AI is used, whether schools acknowledge it or not.
And there is only so much resistance, or even engagement, that is possible. Technology evolves faster than institutions. Curriculum frameworks, accreditation systems, teacher training, and assessment models were all built for a slower era of change. When disruption comes, they strain, react, or try to suppress it. But the cat is out of the bag — students are already living in the future while schools are still debating AI.
The mistakes we’ve made — and are repeating
When social media surged, institutions reacted only when it was too late: platforms had become entrenched, opaque, and indispensable. Regulation, policy, and public understanding lagged behind. By the time the platforms’ power was recognized, it was already consolidated in a few hands.
In education, we risk the same pattern:
Waiting until scale to act. If we wait until AI tools are fully ubiquitous (they may already be) before establishing norms, we’ll be forced to catch up under crisis conditions.
Treating AI as an add-on, not foundational. If we see AI as just another app or tool, we’ll fail to address how it changes learning modalities, assessment, pedagogy, access, and power dynamics.
Ignoring governance from the start. Too many education systems treat policy, oversight, and ethics as retrofits — trying to regulate after deployment — and that leaves us vulnerable to irreversible harm.
The pace of change is overwhelming — but we must try harder
Yes: the speed at which AI is advancing is dizzying. Teachers, administrators, and policymakers are overwhelmed. Changing century-old systems is slow and painful. Resistance, inertia, institutional silos: all of that is real.
But that is not an excuse for inaction. If we don’t engage proactively — rethinking curricula, reimagining assessment, updating governance models, training educators, and building safeguards — we cede control over how learning evolves. AI will then shape our institutions for us: not through democratic deliberation, but by default. AI may already have more influence on our students than any adult does.
In short: we must try harder. Not because we will get it perfect, but because the window of influence is small. If we wait, the decisions will already be baked in, with incentives, architectures, and norms that are extremely difficult to undo.
A new civilization is being built — and education will be transformed from the outside in
Part of the reason change is slow is that many educators are only beginning to grasp the significance of the shift.
We often imagine change in education as coming from within and in the ways we want: reform, innovation, policy shifts that are led by committee. But increasingly, what’s happening outside schools — in AI labs, in global networks, in alternative learning platforms — may define the future of education more than any school board or university faculty.
We are seeing the building of a new civilization — one shaped by algorithmic intelligence, networked learning, decentralization, constant adaptation. In that new society, learning is continuous, porous, boundaryless. Credentials may be replaced by reputational systems or evidence of skills verified in new ways. Institutions that can’t adapt may lose legitimacy or relevance.
This shift will reshape education whether traditional systems like it or not. Schools that resist may become obsolete, marginalized, or undercut by emergent alternatives that better align with new civilizational norms. The new models of learning — microcredentials, peer networks, hybrid AI-human tutors, lifelong learning platforms — might not emerge from inside the university but from outside it, in the same way that massive open platforms overtook traditional media.
What we need to do differently — starting now
Embed governance, equity, and ethics from the ground up. Don’t let policy be an afterthought. Design AI-augmented learning systems with oversight, auditability, inclusivity, and human judgment as core tenets.
Reimagine what teaching and assessment can be. Use AI not just to automate old tasks but to rethink how students demonstrate understanding, collaborate, self-direct, and grow.
Train and empower educators. Teachers and professors must be agents, not victims, of change. They need capacity, autonomy, and voice in shaping the AI tools that influence their classrooms.
Foster hybrid intelligence, not automation. Let AI amplify human creativity, empathy, and insight — not replace them. Preserve the human in the loop as a design principle.
Build flexible, interoperable architectures. Encourage open standards, modular systems, and transparent APIs so learning ecosystems can evolve rather than be locked in.
Center equity and access. Without deliberate safeguards, AI-powered education could exacerbate inequity, advantaging those with better tools, data, connectivity, or more privileged institutions.
Experiment boldly, but responsibly. Pilot new models, iterate, test, audit, scrutinize, fail fast — but fail informed and with care.
Cultivate governance coalitions. Regulatory bodies, educators, technologists, students, and civil society must collaborate. No one group can steer this alone.
The moment: opportunities and risks
If we act well, we may usher in an education system more adaptive, personalized, equitable, and powerful than anything before. Students could learn whenever and wherever they choose, at their own pace, crossing boundaries and drawing on rich human–AI partnerships.
But if we fail — or wait too long — we risk ceding control of education’s future to those who build technology first and ask questions later. The infrastructure, the architectures, the norms will reflect whoever built them — not necessarily what is just, humane, or wise.
We don’t get a second chance at building foundational systems. The time to act is now.
If we don’t act now we are letting AI decide. Or the companies that built it. Or our neighbors. Or everyone and everything but us.