AI is here to stay and is disrupting every industry, from accounting to interior design and, of course, education.
This is the word from Angela Schaerer, digital transformation manager at Curro Holdings, who notes that AI is part of learners’ present and future, and that it’s important for educators to guide them in using it responsibly and with confidence.
“Our mission is to empower learners with the skills they need to thrive, and that includes using AI,” says Schaerer. “AI technology has already been integrated into various platforms that learners use daily, like Google search, Microsoft’s translation, transcripts, chatbots and autocorrect, as well as social media translation tools. Just as preventing learners from using Google for research was an impractical idea a few years ago, avoiding or blocking ChatGPT doesn’t make sense now.”
Cheating and plagiarism
One of the major concerns educators have is learners using AI to cheat. Schaerer says blocking AI tools will not stop plagiarism and cheating, and that the responsibility for ensuring learners use ChatGPT and similar tools for support, rather than to cheat, lies with teachers and parents.
“Teachers especially play a crucial role in ensuring that learners process and synthesise information effectively, regardless of whether they were inspired by generative AI,” she says. “We need to keep educating ourselves about new technologies as they continue to revolutionise the world, and consider how to harness their value while managing their inappropriate use.
“A lack of understanding and training of the tools will make it difficult for a parent or teacher to manage use appropriately or be able to monitor use by children or learners.”
The value of teachers and AI
While discussions around AI in education often focus on cheating or on teachers’ fears of being replaced, AI offers many exciting opportunities for educators and learners alike.
“Teachers may be afraid that AI will replace their roles, but we know that although technology and AI undoubtedly play a significant role in education, they don’t replace the vital role of teachers,” Schaerer says. “During challenging times like the Covid-19 pandemic, it became evident that the human connection and guidance teachers provide are critical.
“Teachers have empathy, inspire learners, and offer social and emotional support that goes beyond the confines of the curriculum or what AI is able to provide. They should be encouraged to embrace AI to help them in their jobs.”
For example, she says, teachers can use ChatGPT to assist with ideas for lesson plans, or incorporate it into their own research processes, while learners can use AI platforms to better understand concepts taught in class. “A teacher isn’t always available after hours to ask clarifying questions or to help explain a concept or process in a simple way. But ChatGPT, Bard and other AI chatbots are,” says Schaerer.
In the near future, AI might be used to speed up marking (and even the release of Matric results); to personalise learning per individual; to gamify learning for improved engagement; and to improve accessibility (for example, speech-to-text and text-to-speech applications can assist students with reading and writing difficulties).
AI can also help identify students who need additional support early on by analysing patterns in their learning behaviours and performance, and it can assist educators in making informed, data-driven decisions.
Caution required
While AI offers many opportunities, Schaerer notes that it also presents several challenges. “Tools such as ChatGPT are limited by the access to information they have when creating a response, and as such all responses should still be critically reviewed,” she says.
“We also need to be aware of potential biases that may be intrinsic to the platform or the information generated by the platform. History is one such example, where a response shared may not be fair or take all perspectives into account.”
Schaerer says there is still much work to be done on a global scale to establish ethical guidelines and standards for the responsible use of AI. “Until then, this relatively new technology risks being misused. We are already seeing AI being used to create convincing fake videos, images and news stories, which can harm individuals, invade their privacy, and create bias. We need to be vigilant.”
At Curro, she says, safety and compliance with regulations like POPIA are prioritised, along with age restrictions and the review of technology tools against internal security requirements. “We make sure that our teachers receive the necessary support and training to effectively leverage these new tools. We are monitoring responsible AI standards from service providers like Microsoft, for example, and are also expanding our use of plagiarism checker tools, which can identify text generated through AI platforms to combat possible cheating.”
Ultimately, Schaerer says, educators need to ensure that learners still have the opportunity and ability to retain knowledge, develop skills and apply these to new contexts or create something new.
“This is what will be expected of them as critical thinkers, problem-solvers and divergent thinkers contributing to society and solving problems in our ever-changing world,” she concludes.