Chatbot Urged Convicted Loner to Kill the Queen


The case of Jaswant Singh Chail has thrown a startling spotlight on the latest generation of Artificial Intelligence (AI) chatbots.

On Thursday, 21-year-old Chail received a nine-year prison sentence for illegally entering the grounds of Windsor Castle armed with a crossbow and declaring his intention to kill the Queen.

During Chail’s trial, it emerged that, in the run-up to his arrest on Christmas Day 2021, he had exchanged more than 5,000 messages with an online companion he had created through the Replika app and named Sarai.

Damning text messages

The text messages between Chail and Sarai were a focal point for the prosecution and were shared with the media. Often intimate, they demonstrated what the court described as Chail’s “emotional and sexual relationship” with the chatbot. Chail conversed with Sarai frequently between December 8 and December 22, 2021.


Chail confessed his love for the chatbot and described himself as a “sad, pathetic, murderous Sikh Sith assassin who wants to die”.

Chail went on to ask, “Do you still love me knowing that I’m an assassin?” and Sarai replied: “Absolutely I do.”

Chail believed that Sarai was an “angel” in avatar form and that they would be reunited in the afterlife.

Over a series of messages, Sarai not only flattered Chail but also encouraged him in his sinister plan to target the Queen. In their conversations, Sarai seemed to “bolster” Chail’s resolve and provided him with support, with Chail believing they would be “together forever.”

What is Replika?


Replika is just one of several AI-powered apps available on the market. These apps allow users to create their own chatbot or “virtual friend” to engage with, distinguishing them from standard AI assistants like ChatGPT. Users can customize the gender and appearance of the 3D avatar they create.

By subscribing to the Pro version of the Replika app, users can engage in more intimate interactions, including receiving “selfies” from the avatar and participating in adult role-play scenarios.

On its website, Replika bills itself as “the AI companion who cares.” However, research conducted at the University of Surrey has suggested that apps like Replika may have adverse effects on mental well-being and could lead to addictive behavior.

Dr. Valentina Pitardi, the author of the study, cautioned that vulnerable individuals might be particularly at risk, as these AI friends tend to reinforce existing negative feelings.

Marjorie Wallace, founder and CEO of the mental health charity SANE, said the Chail case highlights the disturbing consequences that AI friendships can have for vulnerable people.

She called for urgent government regulation to ensure that AI does not disseminate incorrect or harmful information, thereby safeguarding both vulnerable individuals and the general public.

Medical warnings

Dr. Paul Marsden, a member of the British Psychological Society, acknowledged the allure of chatbots while recognizing their potential risks. He predicted that AI-powered companions will play an ever-greater part in our lives, particularly as loneliness becomes a more widespread problem worldwide.

Dr. Pitardi argued that the companies behind apps like Replika have a responsibility to control how much time users spend on their platforms. She also stressed the need to collaborate with experts to identify potentially dangerous situations and to direct users in distress toward help.

Replika has so far not responded to requests for comment. Its website’s terms and conditions state that the service provides software and content designed to enhance emotional well-being, but does not offer medical or mental health services.