Okay, did not see that one coming:

"Jailbreaking AIs"

https://slashdot.org/story/23/02/12/0114222/bing-chat-succombs-to-prompt-injection-attack-spills-its-secrets

The day after Microsoft unveiled its AI-powered Bing chatbot, "a Stanford University student named Kevin Liu used a prompt injection attack to discover Bing Chat's initial prompt," reports Ars Technica, "a list of statements that governs how it interacts with people who use the service." 

Welcome to the future. Nick Bostrom anyone?