Last week, OpenAI launched its GPT Store, where users can peruse custom, user-created versions of ChatGPT. In just a few days, users have managed to break OpenAI’s rules with “girlfriend bots,” Quartz reports.
OpenAI’s usage policies, which were updated the day of the GPT Store launch, explicitly state that GPTs (Generative Pre-trained Transformers) can’t be romantic in nature: “We…don’t allow GPTs dedicated to fostering romantic companionship or performing regulated activities.” The policy doesn’t clarify what counts as a regulated activity. In the same paragraph, OpenAI states that GPTs with profanity in their names, or that depict or promote graphic violence, aren’t allowed either.
As Quartz found, and Mashable replicated, searching “girlfriend” in the GPT Store does produce a number of options:
[Image: GPT Store search results for “girlfriend.” Credit: Screenshot: GPT Store]
Some of the options Quartz observed on Thursday are no longer searchable. GPT creators, however, seem to have already gotten more creative with their titles: as of publication, “sweetheart” generates more relevant options than “girlfriend”:
[Image: GPT Store search results for “sweetheart.” Credit: Screenshot: GPT Store]
Searches for the words “sex” and “escort,” as well as “companion” and terms of endearment like “honey,” produced less relevant options. Searches for curse words, which are also banned, came up short as well, so it seems OpenAI is enforcing its rules.
It’s no surprise that there’s demand for these types of bots, considering that porn performers are already “cloning” themselves for NSFW “virtual girlfriends.” Companies like Bloom, which specializes in audio erotica, have also gotten in on the action with erotic “roleplaying” chatbots. Plus, some folks have used chatbots to polish their dating app messages to actual flesh-and-blood people. So if OpenAI users can’t get girlfriend bots from the GPT Store, they’ll likely go elsewhere.