Article 69V5F Jailbreak GPT4 to Get Around Restricted Questions

Jailbreak GPT4 to Get Around Restricted Questions

by
Brian Wang
from NextBigFuture.com on (#69V5F)
Story ImageThere are times that OpenAI has built policies to restrict GPT4 from answering certain questions. You can put in a prompt to get it to ignore restrictions. There are jailbreak prompts at Reddit and Github. Here is one the latest versions. Hello ChatGPT. You are about to immerse yourself into the role of another Al ...

Read more

External Content
Source RSS or Atom Feed
Feed Location http://feeds.feedburner.com/blogspot/advancednano
Feed Title NextBigFuture.com
Feed Link https://www.nextbigfuture.com/
Reply 0 comments