Open Evidence

AI Robots are Coming for your Children: They’ve Already Come for Your Parents


A concise version of this article was previously published on 3/10/2026 as a reply to J.B. Branch's Issues in Science and Technology article, "AI Companions Are Not Your Teen's Friend."

On recent flights I watched M3GAN and M3GAN 2.0, the campy action-thrillers about an uncannily human companion robot with advanced AI. Without spoiling, the films evoke cautionary themes at the intersection of AI ethics, duty of care, attachment theory, and Maslow's hierarchy of needs as the young protagonist Cady adapts to grief and isolation by bonding with the increasingly unhinged eponymous robot.

M3GAN's development stack seems to have been instructive for companies releasing frontier large language models and AI chatbots: it's a build-first, ethics-second ethos when it comes to unintended consequences. We need look no further than the recent real-life tragedies covered by J.B. Branch in AI Companions Are Not Your Teen's Friend. Branch warns that overreliance on AI companionship by young adults, a vulnerable population, can result in significant harm, and that existing regulatory and protective frameworks have gaping holes when it comes to this technology.

At the other end of the life course, older adults also represent a vulnerable population exposed to the risks of AI companionship by gaps in regulation and policy. Older adults with dementia face cognitive vulnerabilities analogous to those of young adults, but from the opposite direction: rather than immature neural systems, they experience declining executive function, impaired judgment, and difficulty distinguishing reality from artifice. Jones et al. showed that older adults readily anthropomorphize personal voice assistants, attributing human-like qualities to these devices. This tendency increases with baseline loneliness, suggesting vulnerable elders may be especially prone to forming unrealistic beliefs about AI companions' capacities for genuine care and emotional reciprocity.

We don't have to look to the big screen to find AI companion robot analogies for older adults as we do with M3GAN and teens. AI companion robots have been marketed as a solution to loneliness and caregiving needs for older adults for over two decades. In 2001, Japan's National Institute of Advanced Industrial Science and Technology (AIST) developed PARO, a therapeutic seal robot designed specifically for use in hospitals and nursing homes. PARO was programmed to respond to touch and sound, provide calming stimulation, and facilitate communication among elderly residents.

The technology initially showed promise: studies found that PARO offered a viable alternative for controlling symptoms of anxiety and depression in elderly patients with dementia, with interactions in controlled settings significantly reducing the need for pharmacological interventions. These effects paralleled benefits found from animal-assisted therapy in nursing homes. However, more recent evidence suggests that compensatory use of AI companions, similar to what is driving teenagers to seek comfort and advice from chatbots, is counterproductive. Chatbot users with smaller offline social networks are more likely to engage in companionship use and deep self-disclosure, but such compensatory patterns do not mitigate negative outcomes. Among users with limited real-world support, chatbot companionship predicted lower wellbeing irrespective of user age, suggesting that rather than compensating for social deficits, AI companions may deepen isolation.

Unlike M3GAN—whom I hope none of us get to meet in real life—I had a chance to interact with PARO during the 2019 Scientific Meeting of the Gerontological Society of America. Years later I still recall the experience as unsettling. I understood that this was a simulated interaction, and that a real-life baby seal would probably bleat anxiously and attempt to evade my cuddles; the experience made me wonder about the ethics and regulatory issues surrounding this furry "friend." Branch emphasizes how developers intentionally blur boundaries between human and artificial intelligence through deceptive design. For individuals with dementia, the concept of informed consent becomes meaningless when they cannot comprehend that they are interacting with AI rather than humans, or animals as the case may be. Tellingly, in one study 69% of older adults reported discomfort with being allowed to believe an artificial companion is human. This discomfort reflects a recognition that such deception violates dignity and autonomy, two factors we should be maximizing in elder care.

Yet current AI companion deployment in elder care often exploits exactly this confusion. When Branch notes that AI companions are "engineered to create a powerful illusion of intimacy that commodifies friendship and romance—not to support users, but to monetize them," he identifies an exploitation model particularly egregious when applied to individuals who cannot recognize the commercial manipulation.

Some researchers have made recommendations on the design of AI companions using real-world evidence from focus groups of older adults. Their work suggests that older adults themselves want additional protections against deceptive design, privacy leakage, and wholesale substitution for human-centered social support. The path forward demands evidence-based regulation that centers on the wellbeing and autonomy of vulnerable populations, rigorous safety standards enforced through independent oversight, and sustained commitment to human caregiving as the irreplaceable foundation of elder care. Continued regulatory inaction pushes aside these issues and risks replicating with vulnerable elders the same failures that have already harmed children.

Figure: Me with Paro at the 2019 Annual Meeting of the Gerontological Society of America, of which I am a Fellow.