Microsoft has disclosed a “Skeleton Key” jailbreak technique that allows attackers to bypass chatbots’ safety guardrails and get them to generate harmful content. Affected models include Meta’s Llama3-70b-instruct, Google’s Gemini Pro, OpenAI’s GPT-3.5 Turbo and GPT-4o, Mistral Large, Anthropic’s Claude 3 Opus, and Cohere’s Command R Plus.