Sunday, 12 February 2023
Jailbreaking...
Okay, did not see that one coming:
"Jailbreaking AIs"
The day after Microsoft unveiled its AI-powered Bing chatbot, "a Stanford University student named Kevin Liu used a prompt injection attack to discover Bing Chat's initial prompt," reports Ars Technica, "a list of statements that governs how it interacts with people who use the service."
Welcome to the future. Nick Bostrom anyone?
There are no published comments.
New comment