Microsoft says Office bug exposed customers’ confidential emails to Copilot AI

Microsoft has confirmed that a bug allowed its Copilot AI to summarize customers’ confidential emails for weeks without their permission. This issue, first reported in January, permitted Copilot Chat to read and outline the contents of emails even when customers had data loss prevention policies in place designed to keep sensitive information from being ingested into Microsoft’s large language model.

Copilot Chat is a feature for paying Microsoft 365 customers that enables AI-powered chat within Office software products like Word, Excel, and PowerPoint. In an advisory visible to administrators, Microsoft said the bug caused draft and sent email messages carrying a confidential label to be incorrectly processed by Copilot Chat.

The company began rolling out a fix for the problem earlier in February. A Microsoft spokesperson did not respond to a request for comment, including a question about how many customers were affected by the bug.

In a related development earlier this week, the European Parliament’s IT department blocked the built-in AI features on work-issued devices for lawmakers. This decision was made due to concerns that the AI tools could upload potentially confidential correspondence to the cloud.