This month, OpenAI launched what it calls the GPT Store, a platform where developers can sell custom-built AI apps and tools. The millions of GPTs on offer include games, productivity helpers, graphic design tools, tools for specific writing projects, and more. One thing OpenAI doesn’t want you to find, however, is an AI girlfriend.

People really want AI romance bots, and developers really want to sell them. But rather than embrace the world of digital love, Sam Altman’s company is trying to snuff it out. Or, to be more specific, OpenAI has a no-AI-girlfriend policy—one that the company isn’t really enforcing.

You’re reading Dumbest Tech News of the Week, Gizmodo’s Monday column where we dive into the stupidest things on the internet so you don’t have to. AI girlfriends flooded the GPT Store just hours into its existence, with names including “Korean Girlfriend,” “Judy,” “Your AI girlfriend, Tsu,” and “Virtual Sweetheart.” In an insult to girlfriends everywhere, OpenAI immediately started cracking down.

The GPT Store’s rules prohibit “GPTs dedicated to fostering romantic companionship or performing regulated activities.” It’s hard to say exactly what “regulated activities” means, but the company also has a broad policy against “sexually explicit or suggestive content.” As such, OpenAI spent a few days last week hunting for girlfriends and kicking them out of the store.

Developers then pivoted to using more clandestine language, giving their girlfriend AIs slightly less obvious names like “sweetheart.”

But after the initial banning spree last week, OpenAI’s girlfriend cleanup efforts slowed. As of press time, there are still tons of robot lovers in the GPT Store. “Judy” and “Your AI girlfriend, Tsu” are gone (RIP, virtual ladies). But there are at least six Korean girlfriends to choose from at this point, and “Virtual Sweetheart” is still available for romantic connections. For now, it seems like even OpenAI is unsure about enforcing its ban on artificial love. We’d love to tell you why, but OpenAI didn’t respond to a request for comment.

Altman and his cronies aren’t breaking new ground. The AI girlfriend issue falls into two long traditions: people wanting robots that simulate sex and love, and tech companies forcing investor-friendly puritanism on their users.

The idea of sex robots dates back to ancient Greece, and in the millennia since we’ve been hard at work building them. Sexbots cropped up almost immediately after this generation of AI hit the scene. The Marc Andreessen-backed chatbot platform Character.AI is full of avatars that will get down and dirty with you (or sweet and romantic, if that’s your thing). Replika, an “AI companion,” filled the internet with ads specifically marketing its service as a horny robot machine, but then quickly pivoted and banished sex from its platform, leaving users who’d become dependent on their lovebots hurt and confused. Replika then launched a separate app, Blush, built expressly for flirty purposes.

But the bigger tech companies are less comfortable letting their users get naughty. Basically every mainstream AI text, image, and video generation platform has guardrails built in to block lewd or even mildly romantic requests. Meta added a variety of chatbots to WhatsApp, Facebook Messenger, and Instagram DMs, including one called Carter, which is described as a “practical dating coach.” Carter will talk to you about dating, but Gizmodo found that even this purpose-built dating helper will kink-shame you if your questions take one step off the path of vanilla sex. Though, weirdly enough, Carter was happy to talk about foot stuff.

The best path forward isn’t exactly clear. The world is facing an epidemic of loneliness that does seem connected to the rise of the internet. Our world is filled with tools like food delivery apps and self-checkout machines that actively encourage you to avoid other human beings. It’s easier than ever to spend an entire day completely isolated.

On the other hand, there are plenty of people who will tell you that technology is the answer to the mental health issues caused by technology. Some are more legitimate than others. BetterHelp will connect you with a licensed therapist from the comfort of your own phone, for example. (Just don’t ask about their privacy problems.) Gizmodo’s own Kyle Barr even found that an AI chatbot called Pi gave him temporary relief during a difficult period.

But the AI girlfriend question is a separate issue. Human companionship can be hard to come by, and there are plenty of robots that will love you on command. That may not pass the smell test, but if we’re talking about public health issues, our approach should be evidence-based.

Of course, the people making AI companions generally don’t have psychological training, nor are they necessarily incentivized to have your best interests in mind. A lot of people think an AI lover is inherently a bad thing. Maybe it is, or maybe a robo-girlfriend can offer respite in a cruel world. This stuff is brand new. We have no idea what the effects will be, or under what circumstances they’ll show up. The answer is probably nuanced, as frustrating as that may be.

The AI girlfriend debate is important, but OpenAI isn’t having it. Instead, the company’s going for the least satisfying of both worlds: putting on a front of white-bread corporate moralism and then not even bothering to stick to its sexless guns.
