Microsoft Keeping Tabs on Your Bing Chats? Could Be!

Chat with Caution!


Hold onto your virtual hats, because Microsoft might be doing more than just facilitating your Bing chats: they could be keeping them. Interesting? Definitely. A little unsettling? Perhaps. If you thought your Bing conversations were just fleeting digital whispers, it's time to think again. Microsoft's updated terms of service are here to shake things up, and it's all centered on AI. Let's dive into what this means and why your casual Bing banter might be more permanent than you thought!

So, here's the deal: those Bing conversations aren't as fleeting as your latest TikTok post. The updated terms, published on July 30 and taking effect on Sept. 30, say that Microsoft is going to keep those chats of yours. Why? Apparently, to keep an eye out for any shady or abusive content.

Bing Chat (Source: Microsoft)

What we don't know is how long Microsoft will keep these chat records. They're being a bit hush-hush about that part. When tech reporters asked for comment, Microsoft played it cool and simply said it regularly updates its terms to keep pace with its technology.

Now, besides holding onto your chat history, these new terms have some other interesting bits. You can't try to reverse-engineer their AI models, algorithms, or the underlying systems, and you can't use their AI to create or train other AIs. Plus, if someone's got beef with your use of the AI, that's on you, buddy.

If this is a bit too much Big Brother for you, there's an out: switch over to Bing Chat Enterprise. Microsoft has promised it won't keep those conversations. So go ahead, chat away, but maybe keep those secrets on the down low for now. Cool? Cool.
