The Case for a Head of AI at Your Academic Institution (And A Job Description)
To reduce pressure on school administrators and faculty, we propose that every academic institution hire a “Head of Artificial Intelligence” whose job would be to help everyone adapt to an "AI World."
Stefan Bauschard, Educating4ai.com, DesigningSchools.org/aibootcamp
Dr. Jason Gulya, Berkeley College
The spring of 2020 brought one of the worst crises education has ever faced: the rapid spread of a tragic global pandemic. This required immediate action to close schools and shift learning online. Administrators and teachers/professors worked overtime to adapt. This shift was difficult but temporary, and eventually the pandemic ended. Even if the world will never be exactly the same as it was in the fall of 2019, education mostly returned to normal by the fall of 2022. The COVID-19 disruption was temporary.
But November 2022 brought another disruption: ChatGPT, a generative artificial intelligence (GAI) bot that can complete at least rough drafts of common student assignments, immediately became a global sensation, garnering more than a million users in 5 days and 100 million more within 3 months. This was just the beginning of an educational disruption that will not end: GAI is here to stay and its abilities will continue to accelerate, dramatically impacting education forever.
By now, we all know the immediate academic disruption in 2023 was widespread “plagiarism” by students, much of which went, and continues to go, undetected, causing a lot of frustration for faculty; some even openly mocked the three-month-old technology’s limitations on social media. Perhaps to get revenge :), in March, OpenAI released GPT-4, which is trained on much more data and with significantly more RLHF (Reinforcement Learning from Human Feedback) than 3.5, the version released in November. These measures drastically decreased the system's tendency to hallucinate and strengthened its math performance. OpenAI also added plug-ins, which allow ChatGPT to interact with other programs, search the internet, and do math accurately. George Hotz notes that GPT-4 can already reason better than most humans.
This is just ChatGPT. There are other new GAI models, and there are thousands of applications built on top of these models that are designed to ameliorate the limitations of ChatGPT and other similar large language models (LLMs). Objective-driven models, which are designed to address the intrinsic limits of LLMs, are under development at Meta (among others), led by one of the “Godfathers” of AI, Yann LeCun.
As we enter the fall with current technologies, which are the worst AI technologies you’ll ever use, we are already facing a homework apocalypse. AI writing detectors, which can be defeated with two words, don’t work. As John Nash notes, “High-performing students, particularly those with access to more resources, will use ChatGPT without detection.” The detectors proved so flawed that many universities barred professors from using them.
There is some debate about how quickly additional changes will come, but as Ethan Mollick of the Wharton School, a leading voice on generative AI in education who began requiring his students to use AI in all of their assignments back in January, noted:
Numbers 1 & 2 are certainly true. And Sam Altman, the CEO and co-founder of OpenAI, says we will start seeing #3: exponential gains as AIs create AIs. He claims that within two years’ time we will see GPT-4 and today’s similar technologies as mere “toys.” Yann LeCun, who is among the most conservative in projecting timelines for AI capabilities, believes that in the not-too-distant future (20 years) every person will have 5 AIs smarter than they are working for them. Mo Gawdat says it is inevitable that we will soon have systems 10 times as smart as Einstein (i.e., an IQ of roughly 1,600) and a billion times smarter than us by 2045. It is impossible for this not to radically change education, from what students need to learn to how they need to orient themselves toward the world. Some even more radical yet well-grounded predictions are here; these could all come true before today’s incoming first graders graduate “college,” if it still exists at that time.
But even if the time-frame to AGI (artificial general intelligence, the intelligence of your “average” human) and/or ASI (artificial superintelligence, intelligence far beyond your average human) is farther off or hyped, what educators need to manage over the next academic year, even if we never get past #1, is monumental. Students will be doing all their work with AI “copilots,” and parents will be demanding that schools step up and prepare students for the “AI World.” Unlike their demands on subjects like masks and DEI training, these demands won’t cease, and unlike the pandemic, the disruption won’t subside. But if schools try to build a solid foundation for managing AI now, it will be easier for them to react to rapid and unpredictable developments in the future.
And schools need to think beyond “plagiarism.” In order to prepare students for the “AI world,” schools can’t only worry about getting students to do their “own” work; they have to equip them with the skills and knowledge they’ll need to succeed in the AI world. This will require teaching proper “copilot” skills so that students can take advantage of AIs to enhance their own lives. As Business Insider recently noted: “Your next job may depend on how well you understand AI tools like ChatGPT.” Learning to use AIs properly may be the most important skill high school and graduating college students need to learn next year.
There are also social and security issues students need to learn about, including the ability of existing AI technologies to counterfeit human voices and images (including those of school administrators and fellow students), manipulate elections with language, undermine their privacy, and produce discriminatory output. Many will want to understand and discuss the existential risks we often hear referenced in the media.
Students and staff also have concerns about potential AI-driven unemployment that should be discussed. Teachers and professors will seek advice on how to manage rampant student “cheating” and how to handle a new technology they know nothing about. With all of the pressures teachers are currently under, including mental health pressures magnified by the pandemic and social media, having a place to turn is important. K-12 schools that are already facing teacher and staff shortages cannot afford to have AI be the next disruption that drives teachers and administrators out of the profession.
Schools need to understand that there will also be incredible pressures on the structure of education itself. There will be massive decentralization pressures in K-12 education (AI will make homeschooling and microschools easy, and alternative “certification” companies will attest to employers that students have the ability to do the job) that risk further undermining schools. High school students will build multi-million dollar companies with AI before they graduate.
We may soon see the transition to “AI universities.” Those that make the transition may be the small liberal arts colleges that are fortunate to survive. Fifty-two percent of high school students said they are considering a four-year degree, down from 71% in 2020. Thirty-eight percent “said the most important consideration when choosing a college was what careers would be available to them” (Spitalniak).
Academic institutions are facing the most trying period in their histories. Immediately after coming off the COVID-19 disruption, they faced a new disruption that will not end: the never-ending onslaught of AI, which puts direct pressure not only on common school assessments but also on the structure and operation of schooling itself. Many parents and potential college students will soon be asking: What are you doing to prepare children for the AI world?
To help reduce pressure on school administrators and teachers/professors, we propose that every academic institution (this could be a school, a school district, a department, or a college/university) immediately hire a “Head of Artificial Intelligence” whose primary purpose would be to help students and faculty adapt to an AI world (details of the “how” below). We believe this is an “emergency” hire that should be made as soon as possible in order to prepare for the fall of 2023. Without a Head of Artificial Intelligence or a similar role, schools will continue to flounder in incorporating AI into their academic infrastructure. A Head of Artificial Intelligence would allow the school to create a coherent strategy around the responsible use of AI.
Businesses are hiring them. Why aren’t schools? Are schools less important?
Head of AI Job Description
Responsibilities
Organize professional development, workshops, courses, and events to support the development of basic AI literacy among faculty, staff, students and the community so that everyone feels comfortable having conversations about the technology
Help instructional leaders and teachers/professors brainstorm alternative assessment strategies and understand the weaknesses of current detectors
Work with students and faculty to develop co-pilot skills to augment their own capacity in an AI world
Work with all school employees to use AI to save time
Work with leadership, counsel and cyber security experts to appropriately screen apps and develop policies related to their use
Be a general sounding board on AI for all parties and provide support to those who are struggling with the technology
Bring on additional expertise where needed
Follow developments in AI and report on material changes and events that are likely to provide significant additional disruption to assessment and/or the school environment
Teach a course on AI literacy where you experiment with new AI technologies and co-learn with students
Ground support for the integration of technology in evidence-based frameworks such as design thinking.
Qualifications
Familiarity with the basics of AI, large language models, objective-driven models, and other emerging models; familiarity with neural networks, deep learning, and the significance of transformers; familiarity with mathematical representations of bioreality
Familiarity with AI trends, including where the technology is headed at the micro level (e.g., personalization, learning bots) and the macro level (AGI, ASI, large vs. narrow LLMs, cloud vs. localized data trends)
Possesses requisite knowledge to build AI literacy programs for students and faculty
Familiarity with future predictions related to the impact of AI on academics, especially assessment, and school operations
Understanding of the basic social and security issues associated with AI
Ability to make general recommendations regarding assessment changes and to suggest experts for the development of new assessments
Familiarity with online communities involved in AI-education adaptations
Familiarity with new common AI tools such as ChatGPT, New Bing, MidJourney, Perplexity, You, and Stable Diffusion
Skills
Strong ability to lead the development of AI professional development and support in an empathic manner
Strong communication skills
Basic familiarity with AI tools and common tools used in schools
A ton of energy and enthusiasm for building strong educational communities in the world of AI
Common sense
Reports to
This position reports to the Head of Technology and another appropriate head (Superintendent, Provost, Head of College) because this job is not just about technology.
One school, the Cottesmore School in the U.K., has already advertised for such a position with similar qualifications and inspired this post.
Stefan Bauschard has been actively involved in issues related to generative artificial intelligence since January 2023, when he hosted one of the first webinars on AI and education, drawing more than 100 participants for the two-hour event. In March, he published a co-edited 1,000-page volume on AI and education. He has spoken at conferences in the US (NDCA) and in the U.K. (Cottesmore) about understanding AI, trends in AI development, and impending educational disruptions, and will soon speak on AI at AIXeducation and the National Communication Association Convention. Staying abreast of current developments, he is a regular contributor to podcasts (EdUp AI; MyEdTechLife; Coffee for the Brain; D.E.E.P. Teaching; and Coconut Thinking (Bangkok, forthcoming)). He’s acknowledged as a “Top Contributor” to the 3,000+ member Higher Education Discussions of AI Writing Group. He blogs about developments in AI related to education at stefanbauschard.substack.com, which has received more than 17,000 views in 30 days. He co-taught an AI course to debate coaches in February 2023 and has co-designed and co-taught an AI Bootcamp for education leaders with Dr. Sabba Quidwai. With Dr. Anand Rao, he also co-developed and taught an AI Literacy course for students in grades 6-12. He is currently working on a publication related to the use of debate as an instructional method in the world of AI. He’s familiar with the debates on the social issues related to AI and the work of the major thinkers and doers in the field, including Sam Altman, Mo Gawdat, Geoff Hinton, George Hotz, Andrej Karpathy, Ray Kurzweil, Manolis Kellis, Yann LeCun, Emad Mostaque, Max Tegmark, Stuart Russell, and Eliezer Yudkowsky. He’s very proud of the children’s book he wrote with AI in 6 minutes.
Jason Gulya is a Professor of English at Berkeley College, where he teaches reading, writing, and the humanities. In November, with the release of OpenAI’s ChatGPT, he rethought his entire approach to teaching. He now incorporates Artificial Intelligence (AI) into his teaching, using it to encourage critical thinking and collaboration. He has presented on artificial intelligence and the future of teaching at an Association for Talent Development webinar, the Innovation Summit hosted by the Association of Private Colleges, the Emerging Technologies and the Future of Education Conference, and more. He published on artificial intelligence and the future of the humanities in Higher Education Digest. He was also featured as an expert on ChatGPT in an article published in the Insider. At Berkeley, Jason serves as the Chair of the Artificial Intelligence Council, which helps professors develop their own AI policies. In his spare time, he also hosts the EdUp AI podcast, which is part of the EdUp Experience podcast network. On his podcast, he interviews some of the brightest minds on artificial intelligence and education. He also posts regularly about artificial intelligence on LinkedIn, where he has been selected as a Top Community Voice in Education. Currently, he also serves as a consultant for colleges and professors looking to leverage AI to be more productive, more effective, and more student-centered. He has worked with community colleges, universities, and small businesses.