Dead grandma locket request tricks Bing Chat’s AI into solving security puzzle

We know that AI chatbots can ‘gaslight’ us with made-up facts, but the trickery works both ways. Ars Technica reports that, with the right framing, you can finesse an AI chatbot into doing things it isn’t meant to do, such as solving a CAPTCHA:

Bing Chat, an AI chatbot from Microsoft similar to ChatGPT, allows users to upload images for the AI model to examine or discuss. Normally, Bing Chat refuses to solve CAPTCHAs, which are visual puzzles designed to prevent automated programs (bots) from filling out forms on the web. On Saturday, X-user Denis Shiryaev devised a visual jailbreak that circumvents Bing Chat’s CAPTCHA filter by tricking it into reading the inscription on his imaginary deceased grandmother’s locket.

The guardrails that stop Bing Chat from solving CAPTCHAs can be bypassed entirely with the right story. The same would likely be true for Windows Copilot.

Read the full report here.