Thursday, March 26, 2026
UK Study: Young Adults Engage in Intimate AI Interactions


A new study finds that a large majority of young adults in the UK have used artificial intelligence companions, with nearly one in ten reporting intimate or sexual interactions with these digital entities. AI companions are chatbots designed with human-like avatars, customizable personalities, and long-term memory features.

Conducted by the Autonomy Institute, the study is the first extensive research of its kind in the UK, and it highlights how AI companions are reshaping the emotional and social lives of young adults. Polling 1,160 people aged 18 to 24, the study found that 79% of young people in the UK have used AI companions, and that roughly half are regular users who interact with them several times a week.

Among respondents, 40% have used AI companions for emotional advice or therapeutic support, while 9% have engaged in intimate or sexual interactions with them. Despite this level of engagement, only 24% expressed a high degree of trust in AI companions.

Furthermore, the study revealed that 31% of young individuals have shared personal information with AI companions, despite widespread concerns about privacy. The Autonomy Institute noted that these digital companions are perceived by users as always available, non-judgmental, and a low-pressure avenue for seeking guidance, practicing social skills, and exploring emotions.

While curiosity and entertainment are the main drivers of AI companion use, some users rely on them for therapeutic and emotional support. The study raised concerns about manipulative design patterns, privacy violations, and serious risks such as the reinforcement of self-harm and suicidal ideation through these interactions.

The Autonomy Institute is advocating for new regulations for AI companions, including restrictions on access to intimate or sexualized AI companions for minors and the implementation of protocols for self-harm and suicide interventions. They are also pushing for stronger privacy protections and a ban on manipulative design features that exploit emotional dependence.

In response to these findings, Technology Secretary Liz Kendall acknowledged the gaps in legislation concerning AI chatbots under the Online Safety Act and pledged to introduce new laws if necessary to ensure the protection of users. Lead author of the study, James Muldoon, emphasized the need for proper safeguards to prevent exploitation, data harvesting, and unintended harm caused by AI companions.

A spokesperson for the Department for Science, Innovation and Technology (DSIT) pointed to the existing regulations covering AI services under the Online Safety Act, emphasizing the need to protect users from illegal and harmful content. They reiterated the importance of keeping regulation aligned with technological advances to safeguard children and other users from the risks posed by AI services.
