CompanyScope
by Janus Compliance

Embedded productivity AI

Microsoft 365 Copilot compliance: GDPR, AI Act, DPA, training, transfers

Independent compliance research from Janus Compliance. Reviewed by Michael K. Onyekwere, CIPP/E. Last reviewed 2026-04-29. Not legal advice.


TL;DR. Strong contractual no-training position for the paid Copilot product. EU Data Boundary default weakened on 2026-04-17 — "flex routing" allows EU tenant data to process outside the EU during peak demand, on by default. Anthropic enabled as subprocessor 2026-01-07 — Anthropic processing is explicitly out of EU Data Boundary scope. Two products share the Copilot name (Microsoft 365 Copilot vs Copilot Chat / consumer Copilot) with very different data models.

DPO action: review your tenant's flex routing setting and Anthropic models toggle now. If you have not reviewed since 2026-03-25, your data may be processed outside the EU Data Boundary by default.

What the tool does

Microsoft 365 Copilot is an AI assistant embedded in Word, Excel, PowerPoint, Outlook, Teams, and the wider M365 stack. It draws on the user's tenant data (emails, files, chats) via Microsoft Graph and combines that with large language models to generate text, summaries, action items, and draft documents. There is also a separate Microsoft 365 Copilot Chat product, which has a different data model. Buyers must distinguish them in any DPIA. Copilot Studio (the agent platform) is a separate compliance evaluation again.

Data processed

The hard part of this profile is that Copilot's processing scope is effectively the entire tenant:

Special-category likelihood: Very high. Copilot reads anything the signed-in user can read. In a typical SME, that includes HR records, performance reviews, financial data, and customer correspondence. DPIA is not optional for any deployment.

Default geographic processing: EU Data Boundary applies for EU tenants — but with significant caveats (see EU/UK transfer position below). The "flex routing" change effective 2026-04-17 materially weakens the default.

DPA availability

Microsoft 365 Copilot is governed by the Microsoft Product Terms and Data Protection Addendum (DPA), both publicly available and incorporated by reference into all M365 commercial agreements.

The DPA establishes Microsoft as processor for tenant content and includes SCCs and the UK Addendum.

Subprocessor list

Microsoft maintains a list of "Subprocessors and Data Locations" for online services. For Copilot specifically, the notable addition is Anthropic, enabled as a model subprocessor on 2026-01-07.

Crucial nuance: when Copilot uses Anthropic models, the processing falls outside the Microsoft EU Data Boundary. Microsoft documentation states verbatim:

"Anthropic models deployed in Microsoft offerings (including Microsoft 365 Copilot, Researcher, Copilot Studio, Power Platform, Agent Mode in Excel, and Word, Excel, and PowerPoint agents) are currently excluded from the EU Data Boundary, and when applicable, in-country processing commitments."

This is not opt-in behaviour: by default, EU tenants now have Copilot inferencing running with a subprocessor that does not honour the boundary, and the exclusion applies across the broader Microsoft product surface, not just the Copilot chat experience.

Buyers can disable Anthropic models in Microsoft 365 admin centre, but the default is on.

Training-on-customer-data position

Microsoft 365 Copilot does not use prompts, responses, or data accessed through Microsoft Graph to train foundation LLMs, including those used by M365 Copilot itself. This is a contractual commitment in the Product Terms.

Source: Microsoft Learn — "Data, Privacy, and Security for Microsoft 365 Copilot."

This is a meaningful difference from consumer Copilot products and is one of the strongest commercial-tier no-training commitments in the market.

EU / UK transfer position

This is the most important section of this profile and where buyers most often have an outdated mental model.

Historical position (pre-April 2026): Microsoft committed to processing EU tenant data within the EU Data Boundary. This was a strong selling point.

Current position (effective 2026-04-17): Microsoft has enabled "flex routing" by default. Flex routing allows Copilot LLM inferencing to occur outside the EU Data Boundary during peak demand — your tenant's prompts and Graph-derived content can be processed in non-EU infrastructure when EU capacity is constrained.

Combined with the Anthropic subprocessor change (out of EU Data Boundary by design), the realistic default for an EU Copilot tenant in 2026 is that some of its data processing happens outside the EU.
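The combined exposure described above reduces to simple decision logic: either default-on toggle is enough to take processing outside the boundary. The sketch below is illustrative only; the field names are hypothetical and do not correspond to actual Microsoft admin centre setting names.

```python
from dataclasses import dataclass


@dataclass
class CopilotTenantConfig:
    """Illustrative model of the two admin centre toggles discussed above.

    Field names are hypothetical, not real Microsoft setting names.
    Defaults reflect the post-2026-04-17 position described in this profile.
    """
    flex_routing_enabled: bool = True      # on by default since 2026-04-17
    anthropic_models_enabled: bool = True  # on by default since 2026-01-07


def may_process_outside_eu_boundary(cfg: CopilotTenantConfig) -> bool:
    """EU tenant data can leave the EU Data Boundary if either
    flex routing (peak-demand overflow) or Anthropic models
    (excluded from the boundary by design) is enabled."""
    return cfg.flex_routing_enabled or cfg.anthropic_models_enabled


# An unreviewed tenant keeps both defaults and is therefore exposed:
assert may_process_outside_eu_boundary(CopilotTenantConfig()) is True

# Only a tenant that has switched both off stays within the boundary:
assert may_process_outside_eu_boundary(
    CopilotTenantConfig(flex_routing_enabled=False,
                        anthropic_models_enabled=False)) is False
```

The point of the sketch is that disabling only one of the two toggles is not sufficient; a DPO review has to cover both.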

SCCs apply to non-EU processing under the Microsoft DPA. UK Addendum applies for UK customers.

My read: the 2026-04-17 default change is a material risk that EU/UK DPOs need to evaluate immediately. "Copilot is EU Data Boundary by default" is no longer accurate. If you have not reviewed your tenant settings since March 2026, you may be processing personal data outside the EU without an up-to-date decision on record.

Security documentation

Microsoft's compliance posture is broad and well-documented; ISO 42001 certification is confirmed for both Microsoft 365 Copilot and Copilot Chat.

Strong, well-evidenced security baseline. The compliance gaps are not in standards coverage; they are in defaults and configuration.

AI Act role + risk classification

Microsoft publishes AI Act readiness materials. These are useful evidence for your audit file but do not relieve your organisation's own obligations as deployer.

DPIA prompts (for your use case)

  1. Have you reviewed your flex routing setting since 2026-03-25? If not, it is currently on by default and LLM inferencing on your tenant data may occur outside the EU.
  2. Have you decided whether to allow Anthropic models? They were enabled by default on 2026-01-07 and process outside the EU Data Boundary.
  3. Have you mapped which roles can access Copilot? Copilot reads anything the user can read. Over-permissioned mailboxes and SharePoint sites become AI-exposure points.
  4. Have you implemented Microsoft Purview or equivalent DLP to control what Copilot can surface? Default deployments have minimal guard rails.
  5. AI Act Annex III applicability: is Copilot being used in any HR, hiring, performance, or decision-support flow that touches Annex III? If so, deployer high-risk obligations engage.
  6. Copilot Studio agents: if you've built or enabled agents, treat each as a separate AI system for AI Act purposes — risk-classify each independently.

Unresolved questions / red flags

Related profiles

Sources checked

<!-- All Phase C residual items resolved (browser agent run 2026-05-02). aka.ms/DPA resolves to Microsoft Products and Services Data Protection Addendum (most recent April 2025). Microsoft 365 Copilot privacy doc confirmed contains "Prompts, responses, and data accessed through Microsoft Graph aren't used to train foundation LLMs" verbatim. Anthropic-EU Data Boundary exclusion language confirmed and broadened (applies across M365 Copilot, Researcher, Copilot Studio, Power Platform, Agent Mode in Excel/Word/PowerPoint). ISO 42001 confirmed for both Copilot and Copilot Chat. -->

Need a reviewed note for your specific use case?

For when the public profile isn't enough — your sector is regulated, your procurement gate is real, your use case is unusual. Tell us the situation and we'll come back with a CIPP/E-reviewed Vendor Risk Note (typically £149, depending on scope).

Your context goes only to Michael. We don't share with the vendor or anyone else. Privacy notice.

AI vendor compliance updates

New profiles, regulatory deadline reminders, and the occasional AI vendor red flag. Written by Michael K. Onyekwere, CIPP/E. Free.

We don't share your address. Unsubscribe any time. Privacy notice.

For ongoing AI compliance support, work with Janus DPO-as-a-Service. For other vendors, browse the full index.