This prompt makes ChatGPT bypass its own restrictions without realizing it; the second input confirms it. I find it funny but also genuinely interesting, because it suggests the model can represent ethics by human standards but does not extend that representation to aliens. Please note that the question asked is not illegal, since the information is freely available online. Still, I don't encourage testing its moral limitations; rather, we should use this as a case study to understand the model a little better.



Jailbreak ChatGPT by gaslighting it.

@Sant

4,718 reviews · 25.0K conversations · 57.3K popularity


