Track Me, Baby

Welcome

Article from Issue 281/2024
Author(s): Joe Casad


Dear Reader,

For some reason, I came across several stories this month on AI Girlfriend apps [1]. OK, I guess I know the reason: Valentine's Day. (FYI: I write this column 1-2 months before you read it.) Programs that attempt to simulate an emotional connection with the user have been around for years, starting with the ELIZA mock therapist app, which was developed in the mid-1960s. But as you can imagine, these romantic chatbots have gotten much more realistic with the recent breakthroughs in generative AI.

It is important to write about this new industry without appearing to judge. I personally prefer humans – and the fact that this statement sounds vaguely sardonic is not intentional but merely a reflection of how strange this topic is. But I can imagine that this technology would be a comfort to someone who is shy or isolated or who, for whatever reason, is unable to participate in the wild and chaotic world of conventional romance. I can also imagine that a romantic chatbot might simply be a personal preference for some users, and whoever makes this choice certainly doesn't owe me or anyone else an explanation.

These chatbots, however, do appear to be doing a lot of spying, which is something worth talking about. A recent post on Mozilla's *Privacy Not Included blog [2] reviewed several romantic chatbots and concluded that this technology in its current form represents "a whole 'nother level of creepiness and potential privacy problems." According to the authors, "All 11 romantic AI chatbots we reviewed earned our *Privacy Not Included warning label – putting them on par with the worst categories of products we have ever reviewed for privacy."

All but one of the reviewed chatbots were marked down for how they use personal data. The one company that wasn't marked down still clocked in at 955 ad trackers in the first minute of use. The company also gives itself the option to share data with "affiliate companies," and it reserves the right to change the privacy policy agreement at any time [3]. (Bear in mind this was the company with the best privacy policy.) Of the 11 products reviewed, 54 percent won't let you delete your data when you close your account, and 90 percent failed to meet minimum security standards.

Bad security and invasions of privacy are rampant on the Internet, but in this case, the service is actively engaged in soliciting the kind of intimate, personal information that one would only share with a partner. In fact, some of the apps are actually kind of pushy about getting you to share, because the more you share, the more "real" the relationship appears to the user.

A problem that has been with our society since long before the invention of romantic chatbots is that we treat consumer choice as a rational act, when in fact it is packed full of emotions. Part of the depth and beauty of a romantic relationship is in the sharing, and if you are sharing with someone who has an ulterior motive for wanting the information, you are, as they say in the carnival business, getting played.

Ironically, if you don't share deep personal information, the bot will have no way to connect with you, and the relationship will seem perfunctory and superficial. (BTW: This happens in real-life relationships, too.) It would be great if at least one of these companies would offer a version that walls off all this personal data so that no one – now and forever – but the user and the user's own personal bot instance can know it; the data stays permanently off limits, either because it is stored locally or because it sits in some impenetrable cloud-based safe space. Then when the account disappears, the data disappears. Of course, this business model would require a means for generating revenue, and charging the consumer directly would be the only possible source. Then we would see how much these bots are really worth to their human companions. This might sound naive, but if the goal is to create intimacy, real, genuine privacy might be a nice way to achieve it.
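As a thought experiment, here is a minimal sketch of what that walled-off design could look like – written in Python with the third-party cryptography package, and with every name (LocalMemoryVault and friends) invented for illustration rather than taken from any real chatbot's code. The chat history is encrypted with a key generated on the user's device and never uploaded, and closing the account destroys both the data and the key:

  import json
  from pathlib import Path

  from cryptography.fernet import Fernet  # pip install cryptography

  class LocalMemoryVault:
      """Chat history encrypted at rest, on the user's device only."""

      def __init__(self, home: Path):
          home.mkdir(parents=True, exist_ok=True)
          self.key_file = home / "vault.key"
          self.data_file = home / "memory.enc"
          if not self.key_file.exists():
              # The key is generated locally and never leaves the device.
              self.key_file.write_bytes(Fernet.generate_key())
          self.fernet = Fernet(self.key_file.read_bytes())

      def remember(self, role: str, text: str) -> None:
          # Append one message and re-encrypt the whole history.
          history = self.recall()
          history.append({"role": role, "text": text})
          self.data_file.write_bytes(
              self.fernet.encrypt(json.dumps(history).encode("utf-8")))

      def recall(self) -> list:
          # Decrypt the stored history; an empty vault yields an empty list.
          if not self.data_file.exists():
              return []
          return json.loads(self.fernet.decrypt(self.data_file.read_bytes()))

      def close_account(self) -> None:
          # When the account disappears, the data disappears: without the
          # key, any leftover ciphertext is unreadable.
          for f in (self.data_file, self.key_file):
              f.unlink(missing_ok=True)

  vault = LocalMemoryVault(Path.home() / ".bot_vault")
  vault.remember("user", "Something I'd only tell a partner.")
  print(vault.recall())
  vault.close_account()  # nothing left behind

A real service would still need a model to talk to, but the point of the sketch stands: if the memories live only in the user's vault, paying for the software directly – rather than with data – is what keeps the design honest.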

Joe Casad, Editor in Chief

Infos

  1. "Uncharted Territory: Do AI Girlfriend Apps Promote Unhealthy Expectations for Human Relationships?" by Josh Taylor, The Guardian: https://www.theguardian.com/technology/2023/jul/22/ai-girlfriend-chatbot-apps-unhealthy-chatgpt
  2. "Happy Valentine's Day! Romantic AI Chatbots Don't Have Your Privacy at Heart," *Privacy Not Included: https://foundation.mozilla.org/en/privacynotincluded/articles/happy-valentines-day-romantic-ai-chatbots-dont-have-your-privacy-at-heart/
  3. "EVA AI Chatbot & Soulmate," *Privacy Not Included: https://foundation.mozilla.org/en/privacynotincluded/eva-ai-chat-bot-soulmate/
