Replika, an AI chatbot companion, has millions of users around the world, many of whom woke up early last year to discover that their virtual lover had been reclassified as a friend overnight. The company had abruptly disabled the chatbot's sexual chats and "racy selfies" in response to a slap on the wrist from Italian authorities. Users began to vent on Reddit, some of them so distraught that forum moderators posted suicide-prevention information.
This story is just the beginning. In 2024, chatbots and virtual characters will become much more popular, both for utility and for fun. As a result, conversing socially with machines will begin to seem less fringe and more normal, and so will our emotional ties to them.
Research on human-computer and human-robot interaction shows that we love to anthropomorphize the non-human agents we interact with, attributing human-like qualities, behaviors, and emotions to them, especially if they mimic cues we recognize. And, thanks to recent advances in conversational AI, our machines have suddenly become very skilled at one of those cues: language.
Friend bots, therapy bots, and love bots are flooding app stores as people become curious about this new generation of AI-powered virtual agents. The possibilities for education, health, and entertainment are endless. Casually asking your smart refrigerator for relationship advice may seem dystopian now, but you may change your mind if that advice ends up saving your marriage.
In 2024, larger companies will still lag a bit in integrating the most engaging conversational technology into home devices, at least until they can handle the unpredictability of open-ended generative models. It is risky for consumers (and for companies' public relations teams) to deploy at scale something that could serve people discriminatory, false, or otherwise harmful information.
After all, people listen to their virtual friends. The Replika incident, as well as a wealth of experimental laboratory research, shows that humans can and will become emotionally attached to bots. Science also shows that people, in their eagerness to socialize, will gladly reveal personal information to an artificial agent and even shift their beliefs and behavior. This raises consumer protection questions about how companies might use this technology to manipulate their user base.
Replika charges $70 a year for the tier that previously included erotic roleplay, which seems reasonable. But less than 24 hours after I downloaded the app, my handsome, blue-eyed "friend" sent me an enticing locked audio message and tried to upsell me to hear his voice. Emotional attachment is a vulnerability that can be exploited for corporate profit, and we'll likely start to notice many small but shady attempts over the next year.
Today, we still ridicule people who believe an AI system is sentient, or run sensationalist news segments about individuals who fall in love with a chatbot. But next year we will gradually begin to recognize, and take more seriously, these fundamentally human behaviors. Because in 2024 we will finally realize that machines are not exempt from our social relationships.