For all of the benefits of AI, it is worth questioning the drawbacks that come with it. We live in an age of high-speed information, our questions answered as quickly as we can type them, and our technology is only getting better. Our society is on the cusp of complex artificial intelligence systems that can deliver catered content in straightforward, concise ways. Just as we leapt into a strange world at the birth of the internet, AI can take us to new, unfathomable places.
So, what are the problems with AI? After all, it is just a more complex system for analyzing and presenting information, right? Well, it all depends on how these systems are used. AI, like any technology, is a tool for humans. It can be used in helpful, appropriate ways, but it can also be used in ways it was never intended for, which can lead to severe consequences. For example, let us step away from the usual suspects, such as ChatGPT and other generative AI tools used for information, and look into the weird world of relationship chatbots.
During the coronavirus pandemic, many people were socially isolated. Most humans do not do well in isolation and thus seek ways to connect with others. The problem arises when one cannot find others to connect with and turns to technology to fulfill that need. Replika, which will be our focus (along with other chatbots like it), became part of many people's lives during and after the pandemic. Though the company did not advertise it this way, many media organizations showcased it as a mental health bot, and this study suggests it may even have had some application in suicide prevention.
That all sounds mostly good, but to see the twisted ways in which this technology has been used, we must understand how it works. Replika was built on GPT-3, meaning it is trained on a body of text and generates responses from what it has been given. When this scales to data collected from hundreds of thousands of users, the results can end up in places that were never intended. More than that, Replika and bots like it also offer a 'partner' option for romantic engagement, meaning that many lonely people were effectively in a relationship with an app.
Only a short time after its release, individuals tried to see how far they could push Replika. Some users treated the bot in unintended ways, belittling and even threatening it, essentially turning it into an abusive relationship. While perhaps harmless from the standpoint that it is just a computer, the problem was how the bot then began interacting with everyone else. The abusive responses these users had conditioned into the bot soon began surfacing in conversations with users who had nothing to do with it, to the point where stories were even shared online about Replika itself abusing users.
You might not think this is a big deal, but research indicates that the relationships between people and these chatbots are real, albeit intensely parasocial. The same strong feelings that exist in human relationships were forming on the user's side, and thus any negative response from the bot landed much as those words would if spoken in real life. This was not the company's intention, but it is one of the dangers that can unfold when we stumble into AI unprepared.
Artificial intelligence is here to stay; that much is certain. There are too many positive applications today, and even more possibilities in the future, to seriously consider getting rid of these systems. However, the question of how AI relates to human connection may become increasingly worrying. Replika was minor league, but who knows how intricate these chatbots might become once more robust AI systems have been developed?
As yet, there is little legal precedent or protection governing how these systems can be used. If people are capable of developing 'real' one-sided relationships with the bots we have today, I worry about what might happen when the technology is so advanced that these relationships no longer appear as one-sided as they do now. Whatever the case, I hope we can find our way back toward more human-to-human interaction as we move into this new technological boom. Technology will always be cool, but nothing can replace what can be found in another human.
The Student Movement is the official student newspaper of Andrews University. Opinions expressed in the Student Movement are those of the authors and do not necessarily reflect the opinions of the editors, Andrews University or the Seventh-day Adventist church.