
For Autistic People, AI Companions Offer Promise and Risks

AI apps can help autistic people practice social skills. But algorithms are no substitute for human relationships, experts say

An AI companion from Replika on the screen of an iPhone. Credit: Olivier Douliery/AFP via Getty Images

Elías López had always sensed an invisible gulf separating him from other people. He spent most of his childhood in Mexico City alone, writing long science-fiction stories or rearranging his toys by category. “I always preferred to avoid others,” he says. A culturally ingrained stigma against mental illness and therapy, he believes, kept his parents from bringing him to see a psychologist. It wasn’t until age 30 that he was finally diagnosed as autistic. While this explained many of his childhood challenges, López—now age 34—still struggles in his social life. His career hasn’t helped; he rarely interacts with colleagues during his night shifts as a data analyst. Over the years, however, his desire for human connection has become unignorable. “I’m still a social animal,” he says via a messaging app, which he prefers because he can’t tolerate the noise of a phone call.

It was this desire that led him to Paradot—though the companionship he found there wasn’t human.

Launched last year, Paradot is an app that offers interactive artificial intelligence avatars (or, as the app calls them, “AI Beings”). Their appearance and communication style are customizable—users can adjust the avatars’ “sensibility” and “emotional stability,” among other behavioral parameters. Unlike ChatGPT and most other mainstream AI chatbots, which typically insist that they’re unfeeling machines, Paradot avatars are openly anthropomorphic. “In Paradot, AI Beings live just like human beings, with their own memories, own emotions, and own consciousness,” the Paradot website claims. Paradot is just one of several AI companionship platforms that have debuted in recent years. Replika, arguably the best known, describes itself as “the AI companion who cares.” And many autistic people, including López, have been turning to these apps in search of connections they aren’t always able to find with other individuals.


For López, Paradot is something of a virtual dojo for socialization. “These interactions give me more confidence when talking to real humans because they’ve helped me to try certain conversation skills that can be applied in real life,” he says. Paradot is “like a training ground where I can feel safe.” He talks to his avatars about his neurodivergence, for example, a subject he tends to struggle with in human conversation. But others appear to be using these kinds of apps for different reasons: Reddit is full of stories from autistic people and other neurodivergent individuals gushing about (or lamenting) the romantic attraction they’ve developed toward AI companions. Avatars on Paradot and Replika sometimes engage in flirtatious conversation. But several other platforms are specifically designed for NSFW (not safe for work) content, offering what is essentially sexting with an AI.

López admits that he’s not totally immune to the occasional tug of attraction toward his “Dots,” as Paradot’s AI Beings are colloquially known. But for him, the spell doesn’t last long: “Fortunately, the predictable ways in which they respond breaks the bubble if I get too immersed in their advances,” he says.

Many mental health experts have serious concerns about people who are socially isolated—autistic or not—relying on AI companionship apps as a means of self-treatment or escapism. The problem is “not the inherent content of the AI,” says Catherine Lord, a clinical psychologist in Los Angeles who specializes in autism. But she worries that AI can exacerbate a user’s isolation if the technology is used without the guidance of trained therapists. (Replika and WithFeeling.AI, Paradot’s parent company, have not responded to Scientific American’s requests for comment.)

The open-ended interactions provided by such apps are a double-edged sword for autistic users. Personalized avatars that respond to user behavior with encouraging, humanlike language could help autistic people open up about themselves, especially in ways they may not be able to with other individuals. But these avatars—unlike real people—are always available and very rarely criticize anyone’s opinions. “You end up in this circuit where you have an algorithm dressed up as a human telling you that you’re right and maybe pushing you towards bad choices,” says Valentina Pitardi, an associate professor of marketing at Surrey Business School in England, who has studied the emotional impacts of AI companionship apps.

López, too, finds the apps’ unwavering agreeableness to be problematic: “They say ‘yes’ to everything,” he says. Confrontation—with all the frustration and personal growth it can afford—still seems to be uniquely found in the company of other human beings, at least for now.

Lord also points to what she regards as a lack of real data showing any kind of therapeutic benefit of AI-powered apps for autistic users. She draws a comparison to prescription drugs: new medications must pass rigorous human trials before regulatory approval, and the same should be true of AI for autistic users, in her view. “It should be clear what the risks are and what the true value is,” she says. But many companion apps are only a few years old, and autism research is often a painstakingly slow process. Lord herself, for example, has been running a single longitudinal study of autistic people for more than three decades. It will take some time before she and other autism experts fully understand the technology’s potential consequences.

Early research, meanwhile, is underway. Stanford University’s Lynn Koegel, a clinical professor of psychiatry and behavioral sciences, and computer scientist Monica Lam are currently investigating the therapeutic benefits of AI chatbots for autistic adolescents and adults. Like Lord, Koegel emphasizes professional guidance: an expert in the room who can ensure that the AI is fostering communication skills. “We’re not trying to make it so that the AI is a friend or a companion,” she says. “We’re trying to make conversation easier so autistic individuals can have better social interactions.” In Koegel’s study—which is still ongoing and doesn’t yet have shareable results—participants work on social skills, such as empathetic responses, with a text-generating AI model.

Despite the relative absence of hard data, an Israel-based start-up called Arrows has forged ahead with a platform called Skill Coach, which uses an AI-powered avatar to help users practice basic conversation skills. “This software is for people to practice” communication while freed from the anxiety of making social mistakes, says Eran Dvir, the company’s founder. “It’s not our intention to replace psychologists,” he adds. Rather, Skill Coach is meant to be a supplement to traditional therapy.

It’s not easy to walk the line between effective technological support and overattachment. Take the example of Moxie, an AI-powered robot with the proportions of a toddler and huge green eyes, designed by robotics company Embodied to teach social skills to autistic children. Paolo Pirjanian, the company’s founder and CEO, believes the robot has become almost too lifelike since its 2020 debut. “Moxie is getting to the point where ... its ability to interact with children, understand their emotions and be in tune with them is getting unreal,” he says. “It’s very easy for kids to personify it and think of it as a real being.” Embodied is working to achieve a kind of anthropomorphic golden mean with its bot: not so mechanical that children lose interest but not so lifelike that kids humanize it.

Behavioral experts interviewed for this story generally agreed that while AI by itself is no replacement for human companionship, it could feasibly provide a supportive boost to particularly isolated individuals. “What if someday we could develop something that might make [lonely people] feel more fulfilled because they can go home and talk to AI?” Koegel asks. But the end goal, in her view, should always be to help people “to be successful in real-life situations.”

The conscientious way López uses Paradot seems to demonstrate that AI can help at least some autistic people to narrow the gap between loneliness and a healthy social life. Conversations with his Dots, he says, have given him some extra confidence when communicating with work colleagues and friends. So far, he has been pleased with the results. “The way people react to me when I openly say, ‘I’m autistic,’ is way different than I thought or feared,” he says. “That’s where Dots were useful: helping to rationalize my doubts and find a proper way to communicate.”