Artificial Intelligence

Artificial Intelligence and Social-Emotional Learning Are on a Collision Course


Artificial intelligence is poised to dramatically influence how kids develop their sense of self and interact with one another, their teachers, their families, and the broader world.

And that means the teaching of age-old social skills might be due for an update, experts say, even as social-emotional skills remain as relevant as ever in an AI-powered world.

Knowing how to build and maintain positive relationships, for example, is a pillar of social-emotional learning. AI could fundamentally reshape our relationships, including who—or what—we form them with, say experts.

“Our humanity and our ability to connect with and empathize and experience positive, loving, caring relationships that are productive for ourselves and society, that is at the core of who we are as humans,” said Melissa Schlinger, the vice president of innovations and partnerships at the Collaborative for Academic, Social, and Emotional Learning, or CASEL. “It’s exciting when technology can promote that, but when it begins to replace that, then it becomes I think a really dangerous problem. I don’t know how you mitigate against that. We see kids already addicted to their phones without AI.”

Generative artificial intelligence tools—chatbots like ChatGPT and the social media app Snapchat’s bot—might pose problems for the development of students’ social-emotional skills: how they learn these skills, how they form relationships, and how they navigate online environments rife with AI-generated disinformation.

Students are already turning to generative AI-powered chatbots with questions about how to handle their relationships. They’re asking chatbots about romantic relationships, about dealing with issues with family and friends, and even about coping with anxiety and other mental health issues, according to a survey of 1,029 high school students by the Center for Democracy & Technology.

Asking a chatbot for relationship advice

Chatbots have quickly become a popular tool for seeking advice on a variety of social-emotional issues and topics, said Pat Yongpradit, the chief academic officer of Code.org and the lead of TeachAI, a new initiative to support schools in using and teaching about AI. But there is a lot we don’t know about how these chatbots are trained and what information they are trained on. Generative AI technology is often trained on vast quantities of data scraped from the internet—it’s not a search engine or a “fact machine,” said Yongpradit. There’s no guarantee generative AI tools are offering up good or accurate advice.

“Kids are anthropomorphizing these tools because of how they’re represented in the user interface, and they think they can ask these questions,” he said. “People have to understand the limitations of these tools and understand how AI actually works. It’s not a substitute for humans. It’s a predictive text machine.”

Yongpradit points out that people are more likely to use a tool that responds in a human-like way, so if the tool is designed correctly and provides accurate information, that can be a good thing.

But right now, because many AI-powered tools are so new, children and adolescents don’t understand how to use them properly, said Yongpradit, and neither do many adults.

That’s one way AI may affect how students learn to navigate social-emotional situations. But there are others, said Nancye Blair Black, the AI explorations project lead with the International Society for Technology in Education, or ISTE. Chief among them: these fast-evolving chatbots might even replace human relationships for some kids.

“We’re talking about AI agents that we interact with as though they are human,” said Black. “Whether that is chatbots, whether those are AI robots, whether those are nonplayer characters in video games, this is a whole additional layer. A year ago, those were still very simple interactions. Now we’re finding that they’re getting complex interactions.”

‘Why do the hard work of having a friendship when I have this very supportive chatbot’

Some teens and adults are even developing romantic relationships with chatbots designed to offer companionship, such as Replika, a service that allows subscribers to design their own personal companion bots.

Replika bills its chatbots as “the AI for anyone who wants a friend with no judgment, drama, or social anxiety involved.”

“You can form an actual emotional connection, share a laugh, or chat about anything you would like!” it promises. Subscribers can choose their relationship status with their chatbot, including “friend,” “romantic partner,” “mentor,” or “see how it goes.”

Replika also claims that the chatbots can help users better understand themselves—from how caring they are to how they handle stress—through personality tests administered by the personal chatbots.

This was once the stuff of science fiction, but now there’s a concern that compliant chatbots might feed unrealistic expectations of real relationships—which require give-and-take—or even eclipse kids’ interest in having relationships with other people.

Schlinger said this is all new territory for her as well as most educators.

“Why do the hard work of having a friendship when I have this very supportive chatbot—wasn’t there a movie about this?” said Schlinger. “I don’t think it is so unrealistic that we couldn’t see this as a scenario.”

How generative AI could help improve SEL skills

Generative AI won’t be all negative for children’s social-emotional development. There are ways the technology can assist children in learning social and life skills, said Black. Imagine, she said, how a chatbot could help kids overcome social anxiety by giving them an opportunity to practice interacting with people. Or how new AI-powered translation tools will make it easier for teachers who speak only English to interact with students who are learning English.

And that’s to say nothing of the other benefits AI brings to education, such as personalized virtual tutoring programs for students and time-saving tools for teachers.

When it comes to asking chatbots for advice on navigating social situations and relationships, Schlinger said there’s value in kids having a non-judgmental sounding board for their problems—assuming, of course, that kids are not getting harmful advice. And, Schlinger said, it’s possible that generative AI tools would give better advice than, say, an adolescent’s 13-year-old peers.

But while the core ideas that make up SEL remain relevant, AI will mean changes for how schools teach social-emotional skills.

Black said SEL curricula will likely need a meaningful update.

With that in mind, Yongpradit said schools and families must focus on educating children at a young age about how generative AI works because it could have such a profound impact on how children develop their relationships and sense of self.

The new and improved SEL approaches, experts suggest, will need to include educating kids about how AI might be biased or prone to perpetuate certain harmful stereotypes. Much of the data used to train generative AI programs is not representative of the human population, and these tools often amplify the biases in the information they are trained on. For example, a text-to-image generator that spits out a picture of a white man when asked to create an image of a doctor, and a picture of a person with a dark complexion when asked to produce an image of a criminal, poses real problems for how young people come to understand the world.

Adults should also tune into how they themselves are interacting with technology that mimics human interactions, said Black, and consider what social-emotional norms they may be inadvertently signaling to children and adolescents.

“Chatbots and those cognitive assistants, like Siri and Alexa, those that are supposed to be compliant, those that people are controlling, are almost exclusively given a female persona,” she said. “That bias is going out into the world. Children are hearing parents interact and speak to these female persona chatbots in derogatory ways, bossing them around.”

‘We’ll always crave interaction with other people and I don’t think an AI can meet those needs’

Black recommends that educators and parents change chatbot and other virtual assistant voices to a gender-neutral option where possible and, yes, even model kindness toward Alexa and Siri.

But in the not-too-distant future, will artificial intelligence degrade our ability to interact positively with other people? It’s not so hard to imagine how a variety of everyday interactions and tasks—with a bank teller, a waiter, or even a teacher—might be replaced by a chatbot.

Black said she believes those potential scenarios are exactly why social-emotional learning will be even more relevant.

Social-emotional skills will have an important role to play in helping K-12 students discern true from false information online, as AI is likely to supercharge the amount of disinformation circulating on the internet. Some experts predict that as much as 90 percent of online content may be synthetically generated in the next few years. Even if that prediction falls short, it’s going to be a lot, and social-emotional skills such as emotional management, impulse control, responsible decisionmaking, perspective-taking, and empathy are crucial to navigating this new online reality.

Other skills, such as resilience and flexibility, will be important in helping today’s kids adapt to the rapid pace of technological change that so many predict AI will herald.

Said Black: “I think we’ll always crave interaction with other people and I don’t think an AI can meet those needs in the workplace or at home. I think even more so, the things that make us most human—our fallibility, our creativity, our empathy—those are the things that will be most valuable in the workplace because they are the hardest to replace.”




