Earlier today, Allie Miller wrote a post on LinkedIn encouraging business leaders to start thinking about what AGI (“AI that can outperform humans on cognitive tasks,” and likely not very far off) means for their business.
I’ve been meaning to write a Substack on this, hoping to frame the AI discussion for educational leaders beyond questions related to AI as “edtech,” so I stole Allie’s idea and simply asked Claude to rewrite her post for education leaders.
What follows is Claude’s rewrite with some tweaks from me. Claude’s original is here.
__
There is no playbook for AGI in education, but there is one thing most people don't realize that can actually help. As I see it, one reason educational leaders have not responded to this challenge is that they are terrified of looking wrong in the AI age, and that fear is holding our institutions back. I say this as someone who regularly speaks with university presidents, superintendents, and heads of school: this fear is generating more anxiety than they realize. Even worse, that fear is contagious throughout our educational communities.
Although it is not up to any one educational leader to solve this, it is important to push leaders toward productive thinking.
Here is the question we wish we could ask every educational leader to get a better sense of what the future could look like and how prepared their institution is to succeed:
"What impact could or will AGI (or let's just say 'AI that can outperform humans on cognitive tasks') have on your institution, across students, faculty, curriculum, and operations?"
But when you ask that, you likely get this answer:
"We just don't know! No one knows! But we're absolutely going to keep an eye on it. We're working hard, piloting AI tools every semester. We have committees working on this; they will issue a report in 6 months. Over the next 5 years, many faculty who do not like AI will retire. And we're confident we have the best institution to adapt."
That's the safe answer they're required to give, but it won't lead us through this transformation. You can push them and watch how they respond, but they'll likely just double down on not knowing.
Instead, I'd advocate that folks ask questions that elicit multidimensional thinking.
Here are a few to start:
Questions That Drive Deeper Thinking
What do you see as the possible paths for your institution in a world where AI can do all of the knowledge work we are training students to do, as well as or better than they can?
How will you compete against schools like Alpha School (K-12) or George Mason and the NY state system (college/university) that are working 24/7 to introduce new curricula and learning models to prepare for this world?
How can K-12 and university leaders work together to reduce emerging mismatches between K-12 instruction and university readiness (students graduating from AI-ready schools attending non-AI-ready universities, and vice versa)?
What are the knowns and unknowns about superintelligence's impact on learning and teaching?
What assumptions do you hold today about how education works?
Have any assumptions broken in the last month? Six months? A year?
How have you updated your thinking since November 2024?
What is your current strategic plan for integrating your school into a world of digital, and then physical, superintelligence?
What systems do you have in place to change that plan if needed?
The president, provost, head of school, or superintendent has their position for a reason. They set a vision and captain institutional resources—faculty, funding, board relations, community influence, culture, student data, and time—toward that vision.
We are not asking them to etch a 10-year strategic plan in stone. We are asking for transparency into what they're doing about the most significant shift in human capability in generations.
Creating Space for Open Educational Dialogue
One of the most important things we need to create is space for this kind of open thinking in education. We need ways for educational leaders to share their thinking without alarming parents, students, or accreditors.
One model is thoughtful educational leadership—publishing reflective pieces on the future of learning, like major university presidents have begun doing. Another is creating innovation labs or pilot programs to test new ways of teaching and learning, then sharing what those programs discover.
Systems thinking and multidimensional thinking are infinitely more important for educational institutions in 2025 than they were in 2019. Both leaders and their communities should move toward this deeper dialogue. We also need channels to share these insights across the educational landscape.
The stakes are too high—and our students' futures too important—for surface-level responses to the most transformative technology in human history.
We already have digital superintelligence in some narrow domains. There is probably at least an 80% chance we’ll have digital superintelligence that can generalize across all domains and even create materially significant new knowledge by the time most of the students currently enrolled in your institution graduate.