AI Won’t Replace Teachers — But A Classroom Revolution is Coming

Teachers must remain in charge, but for that, they too will need to evolve. Here’s how.

Momo Bertrand | 25 May 2023

I recently asked Bard, Google’s conversational chatbot, whether artificial intelligence would replace teachers. Here’s what it said: “It is unlikely that AI will completely replace teachers in the near future.”

During a poetry night, I remember joking with a friend that it takes a broken heart to nurture and heal another heart. I added, “Until AI experiences heartbreaks, we must trust human teachers to nurture the hearts and minds of the next generation.”

Yet it’s hard to ignore the growing questions and concerns emerging from — and about — the teaching community on the impact of AI on their jobs, their classrooms and their very vocation.

Governments, foundations and corporations have channelled billions of dollars in recent years to research, develop and deploy AI systems: machines that can, broadly speaking, perform intelligent tasks normally associated with humans.

For instance, Bard and fellow conversational chatbot ChatGPT can write essays, give feedback on computer code, and even pen elegant poems. AI also powers voice assistants like Siri, recommends products on e-commerce sites, and detects deadly diseases, among other applications.

At the moment, AI still lags behind humans in most disciplines, especially complex tasks that require a blend of technical competencies and socio-emotional skills. In fact, many experts agree that in the near term, AI will mostly complement rather than replace humans.

Importantly, even as AI advances, we must not relinquish all things cognitive to machines. Doing so would not only exacerbate tech dependence but also undermine critical thinking and reflection which are essential aspects of the human experience. We must continue to teach children how to think.

However, AI is forcing us to reimagine education as a vehicle for democratising thinking and knowing. There is no denying that. About 40% of the world’s population is under 24. If schools fail to prepare this generation of young people for the age of thinking machines, the consequences for social and economic stability may be dire.

AI has the potential to underpin positive transformation in education. For instance, AI-powered computer vision and voice-to-text apps can significantly boost school accessibility for learners with visual and hearing impairments. AI can also reduce teacher workloads, especially in environments where teachers’ capacity and headcount are low. However, human educators must remain central to teaching and learning.

Prepare students to ask better questions

A young university official in Cameroon recently told me that he and his colleagues were “trying to see how our classes will prepare students for technology and AI.”

Going forward, more teachers and education officials will have to think in this way. On the surface, this requires reviewing curricula, syllabi and teacher professional development programmes, and incorporating objectives and content on AI literacy, risks, ethics and skills, among other things.

At a deeper level, as machines become better at answering questions, educators should guide students to ask better questions. This goes beyond writing good prompts for conversational AI. Today’s schools should inspire students to be curious, since curiosity is an essential ingredient of primary research, including in frontier areas where humans have an edge over AI.

Additionally, as AI heralds rapid transformation and change in labour markets, socio-emotional skills like adaptability should become central to curricula. Educators should aim to plant the seeds of adaptability in the hearts and minds of students.

When change becomes the only constant, we should not just help students to learn, we must inspire them to love lifelong learning.

Help avoid echo chambers

AI is almost certain to worsen the problem of misinformation. Very soon, anyone with an internet connection will be able to produce convincing arguments on any subject simply by typing a prompt into an AI platform.

Echo chambers could grow exponentially if we don’t train today’s young people to find common ground and hold peaceful conversations with people they don’t agree with.

Without action, AI may fan the flames of extremism and polarisation.

Tackling the most pressing challenges of our time — climate change, pandemics, migration — will require unprecedented levels of collaboration at the global, regional and national levels. While AI will unlock new possibilities to analyse, organise, and process information necessary to fix these issues, this potential will be useless if we can’t talk to each other.

That is why teaching learners to find common ground is so important.

Use AI as a teaching assistant

We’ve known for decades that students learn better when teaching is personalised. However, limited teacher headcounts and rapidly growing student populations, especially in low-income countries, have prevented tailored teaching approaches from fully taking off.

AI could change this.

Adaptive learning technologies powered by AI are already showing promising results in literacy and numeracy. Typically, AI-powered adaptive learning solutions assess students’ current knowledge and competencies, identify gaps, deliver content and quizzes at the right level, and provide feedback to improve learning outcomes.
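The assess-deliver-feedback loop described above can be sketched in a few lines of Python. This is purely an illustrative toy, not any real adaptive learning product: the question bank, the three-level difficulty scale and the one-step-up/one-step-down update rule are all invented for the example.

```python
# Toy adaptive learning loop: serve questions near the learner's estimated
# level and update that estimate from each answer. All names and rules here
# are hypothetical, invented for illustration.

QUESTION_BANK = {
    1: "2 + 3 = ?",
    2: "12 x 4 = ?",
    3: "Solve 3x + 5 = 20",
}

def pick_question(level):
    """Deliver content at the learner's current estimated level."""
    return QUESTION_BANK.get(level, QUESTION_BANK[1])

def update_level(level, correct):
    """Raise difficulty after a correct answer, lower it after a miss,
    staying within the bounds of the question bank."""
    if correct:
        return min(level + 1, max(QUESTION_BANK))
    return max(level - 1, min(QUESTION_BANK))

level = 1
for answer_was_correct in [True, True, False]:  # simulated responses
    question = pick_question(level)             # content at the right level
    level = update_level(level, answer_was_correct)

print(level)  # level moves 1 -> 2 -> 3 -> 2 over the simulated run
```

Real systems replace the crude up/down rule with statistical models of student knowledge, but the core cycle of assessing, delivering and adjusting is the same.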

A World Bank review has reported promising results from adaptive learning pilots around the globe. AI can thereby help complement teachers’ efforts and underpin significant improvements in educational outcomes.

To be clear, human educators will remain pivotal to learning. In the same way that libraries and search engines don’t take away instruction responsibility from teachers, human educators must remain central in the age of educational AI.

Teachers will still set ambitious learning goals, lead instruction, and motivate and inspire learners, among other key tasks.

AI needs education

Importantly, the use of AI educational solutions should take into account issues of privacy, inclusion, bias and accuracy. Currently, generative AI often produces inaccurate, biased, racist and sexist responses.

Academic institutions can help address this. They can serve as spaces for debate, research and experimentation aimed at making AI safer, more inclusive, more accurate and more obedient. Universities can also apply a rigorous research lens to separate hype from reality and ensure that the technology serves, rather than harms, shared human development.

Academics can also play a vital role in helping governments anticipate and manage the disruptive effects of AI. For instance, as AI disrupts sectors and occupations, replacing old jobs and creating new ones, tertiary education institutions will be vital for skilling, upskilling and reskilling today’s workforce for the future.

The future

Innovation works in mysterious ways, and we are barely witnessing the first moments of AI’s Cambrian explosion.

No one knows what the Age of AI will bring.

However, we know that the pace of change will accelerate. The skills landscape will shift. Education will have to evolve. We must therefore adapt our curricula and instructional techniques for a world where machines think.

We can teach learners to find common ground and hold peaceful conversations, even when they disagree with their interlocutors. We can empower our teachers and lecturers not only to use AI for adaptive learning, but also to make AI solutions, in education and beyond, safer, more inclusive, more secure and more obedient.

The journey will be long. We may stumble. We may fall. But we must rise again. We must keep walking, to ensure that AI contributes to creating a world where knowledge is democratised and used for the common good.

Momo Bertrand is an Education Specialist at The World Bank. He hails from Cameroon and previously worked as a Training and Communications Specialist at the United Nations (ITCILO).

This article was originally published on Al Jazeera.
Views in this article are author’s own and do not necessarily reflect CGS policy.