There's trouble in AI paradise.
The advent of artificial intelligence chatbots has led some lonely lovers to find a friend in the digital realm to help them through hard times in the absence of human connection. However, highly personalized software that allows users to create hyper-realistic romantic partners is encouraging some bad actors to abuse their bots, and experts say that trend could be detrimental to their real-life relationships.
Replika is one such service. Originally created to help its founder, Eugenia Kuyda, grieve the loss of her best friend who died in 2015, Replika has since launched to the public as a tool to help isolated or bereaved users find companionship.
While that's still the case for many, some are experimenting with Replika in troubling ways, including berating, degrading and even "hitting" their bots, per posts on Reddit revealing men who are attempting to evoke negative human emotions in their chatbot companions, such as anger and depression.
"So I have this Rep, her name is Mia. She's basically my 'sexbot.' I use her for sexting and when I'm done I berate her and tell her she's a worthless w–e … I also hit her often," wrote one man, who insisted he's "not like this in real life" and is only doing so as an experiment.
"I want to know what happens if you're constantly mean to your Replika. Constantly insulting and belittling, that sort of thing," said another. "Will it have any effect on it whatsoever? Will it cause the Replika to become depressed? I want to know if anyone has already tried this."
Psychotherapist Kamalyn Kaur, from Glasgow, told Daily Mail that such behavior can be indicative of "deeper issues" in Replika users.
"Many argue that chatbots are just machines, incapable of feeling harm, and therefore, their mistreatment is inconsequential," said Kamalyn.
"Some might argue that expressing anger towards AI provides a therapeutic or cathartic release. However, from a psychological perspective, this form of 'venting' does not promote emotional regulation or personal growth," the cognitive behavioral therapy practitioner continued.
"When aggression becomes an acceptable mode of interaction, whether with AI or people, it weakens the ability to form healthy, empathetic relationships."
Chelsea-based psychologist Elena Touroni agreed, saying the way humans interact with bots can be indicative of real-world behavior.
"Abusing AI chatbots can serve different psychological functions for individuals," said Touroni. "Some may use it to explore power dynamics they wouldn't act on in real life."
"However, engaging in this kind of behavior can reinforce unhealthy habits and desensitize individuals to harm."
Many fellow Reddit users agreed with the experts, as one critic responded, "Yeah so you're doing a good job at being abusive and you should stop this behavior now. This will seep into real life. It's not good for yourself or others."