Microsoft details Bing Chat Enterprise’s approach to data protection in the AI era

Do you know where your data lands after you hit ‘Enter’ in an AI chatbot? Microsoft hopes to calm the fears of its biggest customers with ‘Bing Chat Enterprise’. According to OnMSFT.com:

What sets Bing Chat Enterprise apart is its ability to offer commercial data protection. It ensures that chat histories are not retained and that any data used during a session is not used to train the underlying large language model.

Moreover, the service operates outside your tenant boundary, with no access to internal company data. The chat data is temporary and is purged with each new session.

With Bing Chat Enterprise, you can not only generate compelling content in real time but also stay in control of your past and ongoing chats. Because it does not keep conversation histories, there is virtually no risk of your data being used inappropriately.

That sounds like much-needed peace of mind for businesses. Read more here.