
Psst … wanna jailbreak ChatGPT? Thousands of malicious prompts for sale

from The Register (#6J49E)
Turns out it's pretty easy to make the model jump its own guardrails

Criminals are getting increasingly adept at crafting malicious AI prompts to get data out of ChatGPT, according to Kaspersky, which spotted 249 such prompts offered for sale online during 2023...
