A.I. and Education: The Peril of Chatbots (1/x)
- Kevin D
- Nov 5, 2024
- 2 min read
Last week, I pointed to the possibility of using chatbots as web-based tutors and aides in the classroom - assisting and monitoring students at a lower-stakes, less personal level - while allowing teachers to focus on the students requiring more academic or behavioral support. Essentially, this builds on the idea of algorithm-directed learning paths within blended learning software, adding conversational support.
I think this is a great use case, but one that - like all classroom tools - has its own issues. Questions of monitoring, proper use, classroom management, and engagement arise. These are the same issues (if to a different degree) that doodling, off-task behavior, and general boredom pose in traditional or blended learning pedagogy. Students will not always be on task or might not always understand what to do - an age-old challenge for teachers since the era of rhetorical discussion in The Academy.
Chatbots, however, present some new problems. Today - the intimacy issue.
Chatbots - by their nature - are conversational and engaging. They are programmed to be so. Claude AI is supposed to show "curiosity, empathy, and ethical judgment," while GPT tends to be more neutral and helpful (although it can be changed!).
Character.ai - one of the most popular chatbot apps - exists specifically to create bots rooted in whatever persona their creator specifies, whether task-oriented, drawn from fiction, celebrities, or history, or driven by an invented personality.
This customization can lead us to a Her-style problem. Will students see their chatbots not merely as a tool, but as a friend? This could be especially fraught for the lonely, isolated, or otherwise troubled amongst us. It is certainly possible that this is already happening, leading to horrific consequences.
The inverse is also possible - not just students "buddying up" with chatbots, but students being denied an essential element of our communities: positive relationships with adults in authority and with other students.
Students need to learn to work with others, both those in authority and their peers; would constant interaction with an A.I. further erode the already declining social skills of the current generation?
With that in mind, it remains essential that A.I. chatbots be treated as a tool and a supplement, not a replacement for teachers, human interaction, or community. Good pedagogy, culture, and relationships remain essential. Just as any tool should be introduced with appropriate guidelines and support, so must chatbots.
Efforts toward digital citizenship and student privacy - where personal data is screened apart from public data, and where potential negative interactions are flagged for parental and administrative monitoring - could be helpful as well. These tools need to be used in a positive way, where public accountability can push positive tech-based outcomes for our students (unlike the concerns that arose over student data in exchange for free Google Ed Apps).
Of course, the intimacy issue is only one...