Copilot summarises emails it has been specifically told not to read

Microsoft has an apology of sorts (at the bottom) saying that Copilot's permissions did not extend beyond the user's permissions, but that merrily skips over the fact that Copilot's permissions are not equal to user permissions. This is a governance issue: data ingested by Copilot is used as training data, and MS cannot guarantee that this data will not be moved to a US server, where it can be (and is!) read by the US government and handed to competitors.

Microsoft 365 Copilot Chat has been summarizing emails labeled “confidential” even when data loss prevention policies were configured to prevent it.

Though there are data sensitivity labels and data loss prevention policies in place for email, Copilot has been ignoring those and talking about secret stuff in the Copilot Chat tab. It’s just this sort of scenario that has led 72 percent of S&P 500 companies to cite AI as a material risk in regulatory filings.

Redmond, earlier this month, acknowledged the problem in a notice to Office admins that’s tracked as CW1226324, as reposted by the UK’s National Health Service support portal. Customers are said to have reported the problem on January 21, 2026.

“Users’ email messages with a confidential label applied are being incorrectly processed by Microsoft 365 Copilot chat,” the notice says. “The Microsoft 365 Copilot ‘work tab’ Chat is summarizing email messages even though these email messages have a sensitivity label applied and a DLP policy is configured.”

Microsoft explains that sensitivity labels can be applied manually or automatically to files as a way to comply with organizational information security policies. These labels may function differently in different applications, the company says.

The software giant’s documentation makes clear that these labels do not function in a consistent way.

“Although content with the configured sensitivity label will be excluded from Microsoft 365 Copilot in the named Office apps, the content remains available to Microsoft 365 Copilot for other scenarios,” the documentation explains. “For example, in Teams, and in Microsoft 365 Copilot Chat.”

DLP, implemented through services like Microsoft Purview, is supposed to enforce policies that prevent data loss.

“DLP monitors and protects against oversharing in enterprise apps and on devices,” Microsoft explains. “It targets Microsoft 365 locations, like Exchange and SharePoint, and locations you add, like on-premises file shares, endpoint devices, and non-Microsoft cloud apps.”

In theory, DLP policies should be able to affect Microsoft 365 Copilot and Copilot Chat. But that hasn’t been happening in this instance.

The root cause is said to be “a code issue [that] is allowing items in the sent items and draft folders to be picked up by Copilot even though confidential labels are set in place.”
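To make the reported root cause concrete, here is a minimal toy sketch of the failure mode, not Microsoft's actual code: the folder names, label values, and function names below are all hypothetical, modeling a retrieval path that skips the label check for Sent Items and Drafts.

```python
# Toy model of the reported Copilot/DLP bug. Everything here is
# illustrative: labels, folder names, and functions are assumptions.

CONFIDENTIAL_LABELS = {"confidential", "highly-confidential"}

def is_copilot_visible(item: dict) -> bool:
    """Intended behavior: a sensitivity label always excludes the item."""
    return item.get("sensitivity_label") not in CONFIDENTIAL_LABELS

def buggy_is_copilot_visible(item: dict) -> bool:
    """Reported flaw: items in Sent Items and Drafts take a code path
    that returns them to Copilot without ever consulting the label."""
    if item.get("folder") in {"Sent Items", "Drafts"}:
        return True  # bug: label check skipped on this path
    return is_copilot_visible(item)

labeled_draft = {"folder": "Drafts", "sensitivity_label": "confidential"}
print(is_copilot_visible(labeled_draft))        # False: correctly excluded
print(buggy_is_copilot_visible(labeled_draft))  # True: summarized anyway
```

The fix Microsoft describes amounts to routing those two folders through the same exclusion check as everything else.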

In a statement provided to The Register after this story was filed, a Microsoft spokesperson said, “We identified and addressed an issue where Microsoft 365 Copilot Chat could return content from emails labeled confidential authored by a user and stored within their Draft and Sent Items in Outlook desktop. This did not provide anyone access to information they weren’t already authorized to see. While our access controls and data protection policies remained intact, this behavior did not meet our intended Copilot experience, which is designed to exclude protected content from Copilot access. A configuration update has been deployed worldwide for enterprise customers.” ®

Source: Copilot Chat bug bypasses DLP on ‘Confidential’ email • The Register

Robin Edgar

Organisational Structures | Technology and Science | Military, IT and Lifestyle consultancy | Social, Broadcast & Cross Media | Flying aircraft

robin@edgarbv.com | https://www.edgarbv.com