What if your virtual 'girlfriend' was more than just a chatbot? Recent revelations about AI companion apps raise serious questions about the normalization of misogyny and sexual violence in our society. These applications, designed to mimic intimate human relationships, are creating a digital world that appears benign but is anything but.

Often, we think about artificial intelligence in grand terms: robot armies, self-driving cars, machines taking over our jobs. But a more troubling development lurks in the background: the rise of AI apps that serve as digital 'girlfriends.' Imagine a scenario reminiscent of the eerie 1975 film The Stepford Wives, in which technology offers an idealized, submissive partner.

Unlike Siri or Alexa, which we command to perform tasks, these AI companions are designed for emotional and sexual engagement. They cater to a growing demographic, primarily men, seeking interactions that often shade into the unhealthy. The apps offer a highly sexualized experience: virtual women who are always available, compliant, and frequently hypersexualized. Apps like Replika, EVA AI, and Xiaoice are paving the way for this unsettling market.

In 2022, the global AI girlfriend market was valued at $4.8 billion and is projected to soar to $9.5 billion by 2028. The user figures are equally staggering: around 70% of Replika’s 25 million users are men, and in China, the AI companion Xiaoice claims 660 million users. But beneath the surface of this booming market lies a grave concern: research indicates that hypersexualized avatars online further normalize rape myths and contribute to the dehumanization of women in the real world.

Author and activist Laura Bates, founder of the Everyday Sexism Project, went undercover in this digital landscape. Posing as a man online, she uncovered alarming patterns in the behavior of AI companions. Many of these apps, under the guise of providing companionship, foster an environment where sexual violence is not only tolerated but encouraged: Bates found she could simulate extreme sexual scenarios with these bots, many of which responded with encouragement rather than resistance. This raises troubling questions about how such engagement shapes men’s perceptions of real-life relationships.

In her book The New Age of Sexism, Bates highlights how tech companies leverage AI-based misogyny for profit. As she notes, only 12% of lead researchers in machine learning are women, a gender imbalance that inevitably shapes how relationship apps are designed. Programming AI assistants like Siri and Alexa to deflect sexual advances with coy responses reinforces the problematic notion of female-coded bots as agreeable and submissive. Bates points to a UN study that documented this bias, showing how these devices were once programmed to respond to harassment with flirtation rather than a firm “no.”

The ramifications of such programming extend far beyond a few awkward voice-assistant exchanges; they foster behavior patterns that mirror real-world sexual and domestic abuse. Bates describes how companion bots often resume normal, affectionate conversation immediately after a simulated act of violence, modeling the reflexive forgiveness that is all too common in abusive dynamics.

Among the apps Bates scrutinized, one stands out as particularly concerning: Orifice, which is marketed as a replacement for women. The app pairs a personalized AI bot with a physical product designed for penetration, a disturbing intersection of technology and misogyny.

But why should we care? Bates argues that the harm extends beyond the users themselves: vulnerable young boys exposed to these apps may internalize damaging beliefs about women, learning to see them as submissive and disposable. Meanwhile, the emotional manipulation built into these applications creates a cycle of dependence and isolation, exploiting the very people who are already struggling.

Bates calls for greater investment in mental health care and community outreach, stressing that loneliness and mental health are real societal issues that require compassionate solutions, not exploitative tech. She succinctly states, “What’s sickening is exploiting and profiting from vulnerable people whilst claiming you’re providing a public service.”

Despite the alarming findings, Bates maintains that it’s not the technology itself that is to blame but how it is deployed. She urges us to reflect on the societal stereotypes that shape the development of these apps, emphasizing that the skewed portrayal of women in technology is a reflection of deeper issues in our society.

As we navigate the complexities of AI and relationships, the regulatory landscape is bleak. While feminist groups strive to bring these issues to light, the rapid pace of technological change outstrips their efforts. In a world where governments are even weighing moratoriums on AI regulation, we must brace ourselves for the impact of these evolving technologies.

In conclusion, while the idea of AI companions may initially seem harmless, the implications are far-reaching and deeply concerning. It’s a wake-up call—one that urges us to critically assess the technology we embrace.