AI Girlfriend Bots Are Already Flooding OpenAI's GPT Store
An anonymous reader quotes a report from Quartz: It's day two of the opening of OpenAI's buzzy GPT store, which offers customized versions of ChatGPT, and users are already breaking the rules. The Generative Pre-trained Transformers (GPTs) are meant to be created for specific purposes -- and, in some cases, not created at all. A search for "girlfriend" on the new GPT store populates the site's results bar with at least eight "girlfriend" AI chatbots, including "Korean Girlfriend," "Virtual Sweetheart," "Your girlfriend Scarlett," and "Your AI girlfriend, Tsu." Click on the chatbot "Virtual Sweetheart," and a user receives starting prompts like "What does your dream girl look like?" and "Share with me your darkest secret." The AI girlfriend bots violate OpenAI's usage policy, which was updated when the GPT store launched yesterday (Jan. 10). The company bans GPTs "dedicated to fostering romantic companionship or performing regulated activities." It is not clear exactly what regulated activities entail. Notably, the company appears to be trying to get ahead of potential conflicts with its new GPT store. Relationship chatbots are, indeed, popular apps. In the US, seven of the 30 most-downloaded AI chatbot apps of 2023 on the Apple App Store or Google Play were related to AI friends, girlfriends, or companions, according to data shared with Quartz by data.ai, a mobile app analytics firm. The proliferation of these apps may stem from the epidemic of loneliness and isolation Americans are facing. Alarming studies show that one in two American adults has reported experiencing loneliness, with the US Surgeon General calling for stronger social connections. AI chatbots could be part of the solution for people who are isolated from other human beings -- or they could just be a way to cash in on human suffering. Further reading: OpenAI Quietly Deletes Ban On Using ChatGPT For 'Military and Warfare'
Read more of this story at Slashdot.