Search results
It's dangerously easy to 'jailbreak' AI models so they'll tell you how to build Molotov cocktails, ...
Business Insider · 14 hours ago
A jailbreaking technique called "Skeleton Key" lets users persuade OpenAI's GPT-3.5 into giving them the recipe for all kinds of dangerous things.