Our AI-Enabled Students Don't Need Another Brick in the Wall
But they will be controlled by algorithmic bricks if we abdicate our responsibility to them.
Plagiarism Detectors as Bricks in the Wall
A few weeks ago, I started this Substack to share my thoughts on developments in AI related to education. I began by writing about how we will soon be living among computers that are smarter than us. I explored the arguments about potential unemployment, the importance of AI literacy, the emerging role of bot teachers, what I learned from editing a book about AI in education, and the massive academic disruption we are facing. I even threw a softball: the highlights of the May US Department of Education report on AI.
The Substack has more than 16,000 views. But do you know what the two most popular blog posts are? Sixteen Reasons Not to Use AI Detectors in Education and I Beat an AI Plagiarism Detector with Two Words. The latter got more than 1,000 views in 24 hours.
Why? Because many educators believe they can retain the current educational structures with the control provided by plagiarism detectors. They regularly ask me to recommend “effective” plagiarism detectors. We’ve seen this rodeo before:
But this time, the students are not having it. They are coming for the bricks in the wall with their AI tools, and the AI lie detectors, which can be defeated with two words, don't work. As John Nash notes, "High-performing students, particularly those with access to more resources, will use ChatGPT without detection."
Emad Mostaque, the CEO of Stability.ai, adds:
AI-enabled Students 1; AI-enabled Administrators/Teachers 0. Homework is now an apocalypse. Control shifts to the students, who are now in the lead, even though teachers will conclude there is nothing to worry about because the undetected "smart" students don't appear to be cheating.
But it isn’t that simple. AI does not simply enable some Paulo Freirean “liberation” in a “decentralized” yet networked world, where students’ every movement and thought is recorded and passes through networks that all flow from the largest “AI companies” in the world (Microsoft/OpenAI, Google, Meta, Twitter/Tesla/Neuralink). AI enables a massive academic surveillance network, as we see here in a school building in China.
And it’s not just a reconstitution of surveillance. The algorithms start to define who we are. It’s no longer just “nature” and “nurture”; it’s also the “algorithms.” Ian Bremmer explains:
And the power that AIs have over us extends beyond the individual to society at large. Professor Harari argues this could potentially threaten democracy itself, a much bigger deal than the loss of the plagiarised “school essay.”
To give a concrete example, VR headsets provide many opportunities for students to experience places they’ve never been and to lower the cost of education. But they can also record students’ biofeedback and infer from their eye movements whether they are attracted to the opposite gender or the same gender, the latter of which can carry the death penalty in Uganda.
This surveillance and other forms of control don’t have to occur inside school buildings. They can easily occur outside them, with students connected to devices 24/7 and every movement, and potentially even every thought, recorded.
And many students will actively participate in this, thinking they are “liberated” from the “industrial model of education” in which they were controlled by physical bricks in the wall. In an AI World, they become active participants in their own control by algorithmic bricks that will facilitate their integration into the new AI World’s capitalist economy and government surveillance architectures while they amuse themselves to death on TikTok.
Will it be Orwell or Huxley who proves correct as we embrace these new technologies?
“Another Brick in the Wall” is dark. Another Brick in the Algorithm is an immersive utopia. Control is perfected.
The Education Emergency
The questions educators need to ask are not whether AI should be kept out of schools or how to keep it out; it cannot be kept out of schools or out of society. Any belief that AI can be effectively restricted by AI-enabled “bricks” is based on nothing but a lie sold to educators to make a dime and to deflect attention from the need to help students live and grow productively in an AI World.
And this time, the future of humanity is at stake. If educators do not actively learn how to help students develop in an AI World, students and society will be controlled by AI itself. This is true even if schools never purchase a single “AI app.”
In this emergency, one that risks far greater harm than COVID-19, educators must, at a minimum, do the following:
Learn the basics of how generative AI works. Educators need to understand the fundamentals of this technology if they are going to help students empower themselves in an AI World. The only way to utilise a technology positively is to understand it. And AI doesn’t care if you hide from it; as long as there are “free” services online, it will find you and use you if you don’t engage it.
Help teachers develop new assessments that empower students. We need assessments that are more interesting and valuable to students in an AI World, including project-based learning, phenomenological learning, and in-class debating. These will help students master content and learn the durable skills, including communication and critical thinking, that they need both to work with AIs and to challenge their problematic uses.
Listen to the kids. Many of our students have been using AI in school far longer than their teachers and professors have. I’ve been teaching an AI Literacy course to high school students, and one 13-year-old produced one of the best AI policies I’ve seen (concerns about detectors in tow :)):
English AI Policy:
- Students ARE allowed to use AI tools such as ChatGPT to complete English assignments.
- Students can use AI tools to brainstorm ideas, make essay outlines, and find information.
- Students can NOT use AI tools to write their entire essays or to do all their work.
- When using AI tools, students must fact-check the information and cite it using MLA citations.
- If students are suspected of using AI tools in unregulated ways, their work will be checked with 4 or more AI detectors and compared to the student's prior work.
- If students are caught using AI tools in unregulated ways they will be given a warning and will have to complete their assignment again. If they are caught using AI tools in unregulated ways multiple times then they will receive detention and possibly worse consequences.
While this may seem like a small example relative to the larger issues discussed, it demonstrates what students have to contribute.
At the university level, undergraduates from across the United States have organised what is sure to become one of the country’s leading conferences on AI in education, at which I’m proud to speak.
In most instances, it’s the students, not the faculty and administrators, who are driving this train.
Learn, create, design, imagine, build, fix, think critically, and communicate. There is a high probability that by 2035 we will be living in a world of artificial general intelligence (AGI), where machines are as smart as we are. We may even be living in a world of artificial superintelligence (ASI), with machines that might be 40-50 times smarter than us. As Will Richardson, Co-Founder of the Big Questions Institute, notes:
Kids entering school this fall will "graduate" in 2035. If we're not preparing them to learn and create and design and build and fix and imagine on their own and with others, wtf are we doing? More "courses?" More tests?
Those are damned good questions, and Richardson has really answered them himself. Our students need to understand the AI World that is emerging around them, to think critically about how they want to participate in it, to embrace technology when it empowers them and reject it when it disempowers them, and to imagine and build worlds they want to live in. They should have the opportunity to be taught how to use the technology for their benefit.
Conclusion
Much has been written about the meaning of Pink Floyd’s Another Brick in the Wall, but one interpretation is that Roger Waters, who became the band’s lead writer, saw schools and teachers as authority figures in an “industrial model of education” who restricted his creativity and limited his potential. Enabled by AI, today’s students are bringing that model to its knees.
This “AI-enabled model” has the potential to liberate students by expanding learning opportunities, adapting instruction to student needs, and removing the control architecture of the in-school day. But it also has the potential to replicate the social control mechanisms of the industrial model in even more insidious ways unless educators proactively work with students to give them the tools to challenge the new disciplinary nets created by algorithms. That begins with teaching them with and about the technology in ways that empower them, including how to make choices about its role in their lives.
This is now the essential role of education, and there is no more pressing issue. It’s time for educators to step up and engage. Our students don’t need another Brick in the Wall, physical or algorithmic. They need to be empowered.
Stefan Bauschard
The AI Boot Camp (AI for Leaders)
Educating4AI (AI Literacy for Students)