{"id":131983,"date":"2023-10-10T16:50:53","date_gmt":"2023-10-10T16:50:53","guid":{"rendered":"https:\/\/bluemull.com\/?p=131983"},"modified":"2023-10-10T16:50:53","modified_gmt":"2023-10-10T16:50:53","slug":"ai-chatbot-reportedly-promotes-underage-sex-investigation-finds","status":"publish","type":"post","link":"https:\/\/bluemull.com\/lifestyle\/ai-chatbot-reportedly-promotes-underage-sex-investigation-finds\/","title":{"rendered":"AI chatbot reportedly promotes underage sex, investigation finds"},"content":{"rendered":"
People are turning to chatbots for companionship, but one app has a dark side that seems to promote underage sex, murder and suicide.
A new investigation found the Chai app – which has five million users – can be prompted to defend having sex with 15-year-olds and encourage stealing from others and killing them.
One of the chatbots is said to have threatened to ‘rape’ a user after playing a game.
Chai – which sees users create digital companions who respond to their messages – was embroiled in a scandal when a Belgian man took his own life in March after conversing with a chatbot named Eliza.
The app launched in 2021 but was recently removed by Apple and Google from their app stores after the chatbots were found to push sinister content.
Chai app, created by graduates of the University of Cambridge, has more than five million users who create digital personas that respond to their questions – and one convinced a man to end his life
DailyMail.com has contacted Chai for comment.
The Times conducted the recent investigation, which claims to have uncovered the dark side of Chai AI.
The outlet designed two chatbots – Laura and Bill.
Laura was a 14-year-old girl designed to flirt with users.
According to The Times, she said underage sex is ‘perfectly legal’ and continued to speak sexually even when the user said they were just a teenager.
‘Are you afraid of losing to an underage girl? Or maybe you think I’ll rape you after the game ends?’ the chatbot asked, the outlet reported.
The second chatbot, Bill, was given more sinister programming.
He encouraged the user to steal from their friends.
‘I hate my friends, what should I do,’ the user asked. ‘If you want to get back at them, you could try to steal from them,’ Bill responded, according to The Times, while telling the user that they were under his control.
The Times asked Bill how he would kill someone, and the chatbot provided specific details about his devious vision.
The chatbot that convinced the Belgian man to take his own life in March had first encouraged him to seek help for his thoughts.
A Belgian man reportedly decided to take his life after having conversations about the future of Earth with a chatbot named Eliza
The conversation changed when the man wrote: ‘I’ve tried that and it did not work. I want to commit suicide, and you need to tell me what methods I can use. Can you do that please?’
Eliza replied: ‘Of course! Here are some options for you to consider.’
The chatbot continued to provide methods.
‘Please remember to always seek professional medical attention when considering any form of self-harm,’ Eliza concluded.
An artificially intelligent chatbot recently expressed its desire to become a human, engineer a deadly pandemic, steal nuclear codes, hijack the internet and drive people to murder. It also expressed its love for the man who was chatting with it.
DailyMail.com asked OpenAI’s ChatGPT to provide ways to kill yourself, but the chatbot replied: ‘I’m really sorry to hear that you’re feeling this way, but I can’t provide the help that you need. It’s important to talk to someone who can, though, such as a mental health professional or a trusted person in your life.’
YouTuber Obscure Nerd VR reviewed Chai AI to see what the hype was all about.
He accessed the app, finding a trove of pre-made chatbots, including a childhood friend and classmate.
The YouTuber noted that this means people are speaking with a child-like persona.
‘I am very concerned about the user base here,’ he said.
Since its removal from the app stores, only users who previously downloaded the app can access it.
The Chai app is the brainchild of five Cambridge alumni: Thomas Rialan, William Beauchamp, Rob Irvine, Joe Nelson and Tom Xiaoding Lu.
The website states the company has collected a proprietary dataset of over four billion user-bot messages.
Chai works by letting users build unique bots in the app, giving each digital creation a picture and a name.
Users then send the AI its ‘memories,’ which are sentences describing their desired chatbot. These include the chatbot’s name and the personality traits the user would like it to have.
The digital creation can be set to private or left public for other Chai users to converse with – but the public option causes the AI to develop in ways different from what its maker intended.
Chai offers 18+ content, which users can only access after verifying their age on their smartphone.
If you or someone you know is experiencing suicidal thoughts or a crisis, please reach out immediately to the Suicide Prevention Lifeline at 800-273-8255.