New MIT Article on Generative AI and Education: Equitable Integration of AI in Schools
The downsides of the technology are inevitable. The question is whether or not schools actively engage to equitably capture the upsides.
Computer science teacher Chad McGowan told us his district initially blocked ChatGPT at the server level. He argued for them to unblock it, since the wealthier students would have access at home, using tablets and computers. The poorer students only had their school-issued Chromebooks, and he feared that they would fall behind in learning about the strengths and weaknesses of AI. The district agreed with him and unblocked the service.
A few days ago, MIT released a long article on generative AI in education.
I strongly encourage you to read the whole document, but I want to highlight a few things.
AI is in schools. Unlike SMART boards or 1:1 iPad programs, the use of AI in schools does not depend on school leaders bringing it in or allowing it; MIT calls this an arrival technology. It's here, it is disrupting assessment, and it doesn't need anyone's permission to do so. The question schools face is how to equitably manage and integrate the technology, which can involve bringing it in but is not limited to that. The downsides of the technology are inevitable. The question is whether or not schools actively engage to equitably capture the upsides.
History tells us that the schools with the most resources—with reasonable classroom sizes, with enough substitutes, with clean and safe buildings, with technologists and instructional coaches, with professional development budgets—will adapt to disruptive technologies most effectively. Some teachers and schools in less-resourced districts will do brilliant work, but on the whole, wealthier schools in wealthier communities will have the greater advantage when it comes to realizing the benefits of generative AI. Without additional support and investment, less-resourced schools in poorer neighborhoods are more likely to encounter the detrimental aspects of AI without the same benefits realized by wealthier schools, thus widening the disparity in educational experiences.
AI will continue to advance, but we don't know exactly in what manner or how quickly. The document notes that while there is disagreement over when and if we will achieve AGI, "there will likely be future developments in the technology." If there is one understatement in the document, it is probably that. If you follow the space, you've seen almost weekly (and often more frequent) advances, including yesterday's release of hume.ai, an AI that can not only display emotions but can also monitor our emotions, adjust its messaging, and persuade us. As Marc Watkins (LinkedIn, blog) noted today, "We aren't ready for AI that can persuade us by scanning our emotions."
On a personal note, I remain surprised that many schools are not concerned with teaching students about such life- and world-changing developments, instead focusing solely on getting them to learn the curriculum they taught in 2021–22, which was likely developed years before that.
While we constantly complain that students take the easy way out of their work, we have to recognize that as educators many of us are doing the same. It’s much easier to keep teaching what you’ve been teaching than to teach about this new world.
AI is both amazing and limited. AI can do amazing things (aid the discovery of new materials and completely change the way scientific research is done), but some systems still struggle with accuracy in high school math and lack common sense reasoning. Proponents highlight the former and opponents the latter, but the reality is that it's good at some things and bad at others, and you need to know when and how to use it well. That takes professional development.
The discussion needs to move beyond "cheating" and "ethical use."
(W)e want to reframe the question from “How will students use generative AI to cheat?” to “How does technology bypass productive thinking?”
Various surveys of student use show anywhere between 60 and 90+ percent of students using AI to assist with their schoolwork. Many consider much of that "cheating."
So, is it productive to say that 60-90% of our students are cheaters? What comes of this, especially since everyone knows there is no way to prove it? Where does this leave education?
Focusing on students as cheaters is unproductive and unsustainable. If all of our students are cheaters who aren’t learning anything, we should probably just shut down our schools.
Instead, we need to focus on instructional redesign (yes, as the report acknowledges, it’s easier said than done), so students have assignments that will boost their “productive” thinking when they use AI. Of course, this may also require some more in-class, low-stakes formative assessments and conversations with students about the importance of not circumventing productive thinking, but it really involves changing how most classrooms function.
These are all difficult challenges, but thinking of the majority of our students as "cheaters" is not a productive frame, and it also makes us as educators irrelevant.
Professional development. Professional development related to understanding AI, its types, its strengths and weaknesses, and how to deploy it productively in the classroom requires not only a lot of thought but also sustained, iterative training. More and more schools have started along that path, but others have not. This will be a heavy lift for many schools as they find themselves in more constrained resource environments with the drying-up of COVID-19 funds. The article does encourage people from industry and nonprofits to contribute.
We cannot expect teachers to be able to conduct and lead all this experimentation in their own free time. A signature challenge that education faces, in a moment of postpandemic exhaustion, is finding the resources to pay for teachers’ time to conduct this important reflection, experimentation, and refinement of their practice.
Guidance. The article encourages districts to develop AI guidance. We do have a free 63-page document based on all of the guidance documents issued to date to help you get started. That document is free to use, and we do have a large program to help schools. We have been actively helping districts develop such documents and we encourage you to reach out. If you don’t like us, ask someone else.
What we know. As the report notes, "So far, we have not seen AI tutors, chatbots, or other helpers with the potential to 'change the game' of education." I tend to agree, though I suspect many are coming soon, and we are already seeing the decentralization of education that these technologies support (Utah VR Charter School).
Regardless, though, I think the rush to figure out AI and integrate it often misses the forest for the trees.
I’ve said this so many times I’m turning blue: what schools need to start with when adapting to the AI world is doing more of what they already know how to do: teaching problem-solving, critical thinking, creativity, and decision-making skills. They just need to turbo-charge that focus. More schools need debate teams, robotics teams, and CTE programs; they need to help students develop portfolios. None of these are as cool as building a bot with some scrappy new tech, but they are more likely to help students succeed in an AI world.
Writing detectors don’t work. Yep. Here’s another new study:
The implications of how schools respond or fail to respond are quite significant.
(M)ore affluent students have an advantage since they are more likely to have higher-quality access to AI, as well as have access to additional staff managing the application of the technologies.
Students who can afford a personal device may have an advantage over students with fewer resources. And students who attend schools with better-prepared teachers, who have benefitted from more planning time and professional development, will have an advantage over students whose teachers have not had those opportunities.
Again, it’s not about AI Yes/No. It’s about how to integrate and manage it both equitably and responsibly.
And that requires doing the hard work we expect our students to do.