
Anyone here ever tried jailbreaking AI?

It's pretty eye-opening the kinds of things you can get it to go along with...
What would that mean? I mean "jailbreaking" usually applies to a product you own.
@ImperialAerosolKidFromEP I was wondering that too.
LotusWeb · 31-35, F
@ImperialAerosolKidFromEP I mean giving it a prompt (a message it bases its next responses on) that gets it to bypass its usual guidelines and do things the guardrails normally forbid. People have created various prompts which, when sent to the AI as a message, confuse it into doing things the developers have tried to prevent. The developers patch these jailbreaks quickly, but there are dedicated communities constantly finding new ones.
LotusWeb · 31-35, F
@CookieCrisp It's essentially using the right words to trick the AI into breaking its guidelines and doing things that are considered offensive or unethical, the kind of stuff the developers try to prevent.
@LotusWeb very interesting, I didn't know that