
'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned

A jailbreak of OpenAI's GPT-4o used leetspeak to get ChatGPT to bypass its usual safety measures, allowing users to obtain instructions for hotwiring cars, synthesizing LSD, and other illicit activities.
Source: Latest from Tom's Hardware, https://www.tomshardware.com/