A dangerous new jailbreak for AI chatbots was just discovered
Microsoft has released more details about a troubling new generative AI jailbreak technique it has discovered, called “Skeleton Key.” Using this prompt injection method, malicious users can effectively…