ChatGPT is Leaking Passwords From Private Conversations of Its Users - Report
Dan Goodin, reporting for ArsTechnica: ChatGPT is leaking private conversations that include login credentials and other personal details of unrelated users, screenshots submitted by an Ars reader on Monday indicated. Two of the seven screenshots the reader submitted stood out in particular. Both contained multiple pairs of usernames and passwords that appeared to be connected to a support system used by employees of a pharmacy prescription drug portal. An employee using the AI chatbot seemed to be troubleshooting problems they encountered while using the portal. "THIS is so f-ing insane, horrible, horrible, horrible, i cannot believe how poorly this was built in the first place, and the obstruction that is being put in front of me that prevents it from getting better," the user wrote. "I would fire [redacted name of software] just for this absurdity if it was my choice. This is wrong."

Besides the candid language and the credentials, the leaked conversation includes the name of the app the employee is troubleshooting and the store number where the problem occurred. The entire conversation goes well beyond what's shown in the redacted screenshot above. A link Ars reader Chase Whiteside included showed the chat conversation in its entirety. The URL disclosed additional credential pairs. The results appeared Monday morning shortly after reader Whiteside had used ChatGPT for an unrelated query.
Read more of this story at Slashdot.