Silent Access: Critical Flaw in Microsoft Copilot Bypasses All Audit Logs
While Microsoft has been vigorously promoting its Copilot AI product line, promising users greater convenience and productivity, a troubling flaw has been uncovered in the Microsoft 365 ecosystem, one that undermines the very foundations of security and legal transparency: Copilot could access user files without leaving any trace in the audit logs, and Microsoft chose not to inform its customers of the vulnerability.
The flaw was discovered by chance. On July 4, a security researcher at Pistachio observed that when Copilot was asked to summarize a file, the access was duly recorded in the audit log. If the prompt was phrased so that Copilot returned no link to the file, however, no access record appeared at all. In practice, this loophole allowed a malicious actor to read the contents of a document without leaving a single digital footprint.
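For context, Copilot activity normally surfaces in the Microsoft Purview unified audit log as “CopilotInteraction” records whose payload lists the resources the assistant touched. Below is a minimal sketch of how an administrator might enumerate those records from a portal CSV export; the column names, the CopilotEventData/AccessedResources field names, and the export file name are assumptions about the log schema, not verified details:

```python
import csv
import json

def copilot_file_accesses(export_path):
    """Yield (timestamp, user, accessed_files) for each Copilot
    interaction found in a unified-audit-log CSV export."""
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Copilot activity is assumed to be logged under the
            # "CopilotInteraction" operation, with column names matching
            # a Purview portal export (CreationDate, UserIds, Operations,
            # AuditData).
            if row.get("Operations") != "CopilotInteraction":
                continue
            audit = json.loads(row["AuditData"])
            # "AccessedResources" (field name assumed) lists the files
            # Copilot read while answering a prompt. The flaw meant some
            # reads produced no record here at all.
            event = audit.get("CopilotEventData", {})
            files = [r.get("Name", "?") for r in event.get("AccessedResources", [])]
            yield row["CreationDate"], row["UserIds"], files

for ts, user, files in copilot_file_accesses("audit_export.csv"):
    print(ts, user, files)
```

A record missing from this listing is precisely what the researcher observed: the file was read, but nothing for it ever reached the log.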
It later emerged that the Chief Technology Officer of Zenity had identified the same issue a year earlier. Nevertheless, Microsoft addressed the bug only in August 2025, after a second independent report. Even then, despite acknowledging the problem, the company declined to issue a public advisory or assign a CVE, the standard public identifier for vulnerabilities. Instead, Microsoft’s Security Response Center (MSRC) explained that the fix had been deployed automatically and required no action from customers.
This stance perplexed many experts. First, Microsoft blatantly violated its own incident-handling guidelines: a formal process exists on paper, but the company failed to provide status updates on the report, treating the published procedural stages as mere formalities with no bearing on practice.
Second, by categorizing the flaw as “important” rather than “critical,” Microsoft gave itself a convenient pretext to avoid disclosure. Yet that rating ignores a crucial fact: audit records could go missing inadvertently, with no attacker involved, simply as a side effect of how Copilot phrased its responses. Any customer’s logs could therefore be incomplete without anyone noticing.
The implications are sweeping: they affect any organization that used M365 Copilot before August 18, 2025, when the fix was deployed. For companies that depend on audit logs to meet regulatory obligations such as HIPAA, or to conduct internal incident investigations, incomplete records could lead to flawed or even disastrous decisions. The risk is especially acute for enterprise customers, where evidence of access to sensitive files can prove decisive in compliance checks, court proceedings, or audits.
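For organizations triaging their exposure, one practical first step is to scope which users have any recorded Copilot activity during the affected window, while keeping the central caveat in mind: while the flaw was live, the absence of a record does not imply the absence of access. A hedged sketch under the same assumed export format as above (the cutoff date comes from the fix deployment reported here):

```python
import csv
from datetime import datetime

EXPOSURE_END = datetime(2025, 8, 18)  # date the fix reportedly shipped

def users_with_copilot_activity(export_path):
    """Collect users with at least one recorded Copilot interaction
    before the fix date. Caveat: while the flaw was live, a missing
    record does NOT prove there was no access, so this narrows an
    investigation rather than clearing anyone."""
    users = set()
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Same schema assumptions as the earlier sketch.
            if row.get("Operations") != "CopilotInteraction":
                continue
            # Timestamp format assumed to be ISO 8601, e.g. "2025-07-04T09:30:00Z".
            when = datetime.fromisoformat(row["CreationDate"].rstrip("Z"))
            if when < EXPOSURE_END:
                users.add(row.get("UserIds", ""))
    return users

print(sorted(users_with_copilot_activity("audit_export.csv")))
```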
Against the backdrop of ongoing criticism of Microsoft for monetizing audit capabilities and locking richer logs behind paywalls, the refusal to disclose such a critical weakness has drawn sharp rebukes. Auditing is not merely an optional service; it is the bedrock of trust between an IT platform and its customers. When a major provider conceals the fact that its logging system may have been malfunctioning for an extended period, it undermines its own assurances of security and transparency.
As Microsoft continues to expand the reach of AI within its products, one pressing question remains: how many more of these “silent failures” are lurking behind Copilot’s polished interfaces?