Recently I came across an article that mulled over the applications of artificial intelligence (AI) in Teacher Education Programmes. The authors were rather confident that machines could simulate human intelligence, especially in activities involving knowledge, reasoning, planning, speech recognition, problem solving, perception and so on. Granted, AI has applications ranging from trivial, mundane daily jobs to the more sophisticated systems that operate the stock market, and it has made life more effortless. The article emphasises the use of AI for customising the curriculum, identifying gaps in understanding and making tailored suggestions, and resolving difficulties by answering questions, and the like. Such grand plans of integrating AI would require humongous quantities of data to be fed into the algorithm, or would depend on machine learning to gather it.
“Teaching is a profession that creates other professions,” goes an anonymous saying. First, policymakers should not regard teachers as algorithms that work on a set of instructions. A teacher keeps moulding themselves into new skins as experience enriches the mind. A teacher’s job is not repetitive, structured and well-defined; it is exactly the opposite of what an algorithm is good at. A+B is not always equal to C for a teacher, but for an AI it is. Research in AI is moving at trailblazing speed. But scientists are not yet able to explain how electrical signals in the brain translate into thoughts or feelings. They are not yet able to insert ‘emotions’ into the script of an algorithm. ‘Empathy’ probably can never be automated by a machine. Hence, an AI will only respond to a certain limited set of situations by providing innumerable options for the student to learn. Here, they miss a crucial point: a student’s ability to learn also depends on the teacher’s hand and facial gestures, the vividness of the concepts created in the imaginative mind, the connection established through eye contact, and the ability to recognise the learning difficulty a student is experiencing; incorporating all of these into AI is an arduous task. Moreover, AI has not yet stepped into the realm of humour and understanding sarcasm. Humans are gregarious in nature, and children are more so.
Humans are afflicted by decision-making problems and biases, as outlined by psychologists Daniel Kahneman and Amos Tversky, economist Richard Thaler, and others. If such biases and flaws are ingrained into the code behind an AI, imagine the cascading effect they would have in a self-learning algorithm, especially if all the machines across borders are connected. The software itself will be bitten by confirmation bias, which will snowball into devastating consequences. If such an AI is used in Teacher Education Programmes, these biases will travel from brain to brain for eternity. If such a problem goes unnoticed, the consequences cannot be imagined. The quality of teachers trained in this manner will wholly depend on the humans who have written the code. Such a loophole cannot be justified.
Source: heraldgoa.in