Trip Up Bing AI And You Could Score A $15,000 Payday From Microsoft

Is that AI chatbot really hackproof? Probably not, and Microsoft will pay a bounty of up to $15,000 USD if you can demonstrate an exploit, as reported in Hot Hardware:

To be clear, simply getting Bing to generate chat responses that are vulgar or offensive isn't going to satisfy the requirements for this program. It's about getting Microsoft's AI services to serve up information that isn't public, particularly information related to its own creation and training data, or especially data owned by other users. Microsoft clearly thinks these things aren't possible; hence the bug bounty.

This is Microsoft's first bug bounty program explicitly targeting its AI services, and as a result, there are quite a few guidelines that submitters must follow. The goal is to close security holes in the company's new AI-powered Bing products.

Since Windows Copilot will soon reach half a billion Windows 11 users, this seems like a very timely initiative. Read the full article here.