Bing Chat shown on a laptop.
Jacob Roach / Digital Trends

Microsoft has announced Bing Chat Enterprise, a security-focused version of its Bing Chat AI chatbot built specifically for use by company employees.

The announcement comes after a large number of businesses, including Apple, Goldman Sachs, Verizon, and Samsung, implemented wide-reaching bans on the technology. ChatGPT was the main target, but alternatives like Bing Chat and Google Bard were swept up in the bans as well.

The most commonly cited reason for these bans is security and privacy, which is exactly what Bing Chat Enterprise aims to address.

According to a blog post published as part of Microsoft Inspire, the point of Bing Chat Enterprise is to assure organizations that user and business data never leak outside of the company. “What goes in — and comes out — remains protected,” as the blog post states. This also includes chat data, which Microsoft says it cannot see or use to train its models.

Unfortunately, Microsoft didn’t explain in much detail how exactly this differs from the standard version of Bing Chat, only stating that “using AI tools that aren’t built for the enterprise inadvertently puts sensitive business data at risk.” Microsoft says Bing Chat Enterprise still uses web data and provides cited answers with links to its sources. It can be accessed from the same places as standard Bing Chat, including Bing.com, the Microsoft Edge sidebar, and eventually, Windows Copilot.

Bing Chat Enterprise is available in preview today and is included at no extra cost with Microsoft 365. Microsoft says it will also soon be available as a standalone tool for $5 per user per month.
