# Enterprise Data Protection in Microsoft 365 Copilot
How Microsoft protects business data in Microsoft 365 Copilot: encryption, GDPR, EU Data Boundary, sensitivity labels, copyright and the exceptions for web searches.
Enterprise Data Protection (EDP) describes how Microsoft handles business data when you use Microsoft 365 Copilot and Microsoft 365 Copilot Chat. These services are covered by the Microsoft Data Protection Addendum (DPA) and the Microsoft Product Terms, under which Microsoft acts as a data processor.
## Prompts and responses are business data
Everything users type (prompts) and everything Copilot returns (responses) is protected under the same enterprise terms as email in Exchange and files in SharePoint. This means the same contractual and technical protections apply.
## Data is well protected
Microsoft protects business data with:
- Encryption at rest and in transit
- Strict physical security of data centers
- Strict tenant isolation — data from different organizations is never mixed
## Data stays private
| Aspect | Protection |
|---|---|
| Data usage | Microsoft only uses your data per your instructions |
| GDPR | Fully supported |
| EU Data Boundary | Supported for Copilot processing |
| ISO/IEC 27018 | Certified |
| AI model training | Prompts and responses are not used to train AI models |
This is a crucial difference from free AI tools: with Microsoft 365 Copilot, your business conversations are never used to improve the model.
## Existing security and compliance policies apply to Copilot
Copilot works within your existing Microsoft 365 security framework:
- User permissions and identity — Copilot only accesses data the user themselves can access
- Sensitivity labels — Labels on documents and emails are inherited and respected by Copilot
- Retention policies — Existing retention and deletion policies also apply to Copilot interactions
- Auditing — Copilot activities are logged and searchable
- Tenant and admin settings — Administrator policies are respected
> Note: The exact capabilities depend on your Microsoft license. E5 offers more comprehensive compliance tools than Business Premium.
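The auditing point above can be sketched programmatically. The snippet below builds a request body for a Microsoft Purview audit-log search scoped to Copilot interactions. Note that the Graph endpoint (`POST /v1.0/security/auditLog/queries`) and the record type name `copilotInteraction` are assumptions based on public Purview audit documentation, not something this article specifies — verify both against your tenant before relying on them.

```python
import json
from datetime import datetime, timedelta, timezone

def build_copilot_audit_query(days_back: int = 7) -> dict:
    """Build a request body for a Purview audit-log search.

    Assumption: the Microsoft Graph auditLogQuery resource accepts
    displayName, filterStartDateTime, filterEndDateTime and
    recordTypeFilters; "copilotInteraction" is the assumed record
    type for Copilot activity. Confirm in your tenant's audit schema.
    """
    end = datetime.now(timezone.utc)
    start = end - timedelta(days=days_back)
    return {
        "displayName": f"Copilot interactions, last {days_back} days",
        "filterStartDateTime": start.isoformat(),
        "filterEndDateTime": end.isoformat(),
        "recordTypeFilters": ["copilotInteraction"],
    }

# Build (but do not send) a query covering the last 30 days.
payload = build_copilot_audit_query(30)
print(json.dumps(payload, indent=2))
```

Submitting this body with a bearer token that carries the appropriate audit-log permission would create an asynchronous search whose results can be polled later; only the payload-building step shown here is independent of tenant configuration.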
## Protection against AI and copyright risks
Microsoft offers three layers of protection:
- Protection against harmful content — Built-in filters against unwanted or harmful output
- Detection of protected material — Copilot detects and flags content that may be copyrighted
- Customer Copyright Commitment — Microsoft assumes legal responsibility if Copilot generates copyrighted material, provided you use the built-in safeguards
## Web searches (Bing) — different rules apply
Copilot can use web information to keep answers current. To do this, it sends search queries to Bing. This is an important distinction, because different privacy rules apply:
| Aspect | Copilot itself | Web searches |
|---|---|---|
| Microsoft role | Processor | Controller |
| DPA applies | Yes | No |
| EU Data Boundary | Yes | No |
| HIPAA | Yes | No |
| User/tenant ID shared | Yes (internally) | No (anonymized) |
| Shared with advertisers | No | No |
| Used for AI training | No | No |
### What does this mean in practice?
- Web queries are anonymized — no user or tenant ID is sent to Bing
- Queries are not shared with advertisers
- Queries are not used to train AI models
- However: web queries are not covered by the DPA, EU Data Boundary or HIPAA
For organizations with strict compliance requirements, it's advisable to disable web searches in Copilot via the admin settings.
## Copilot Agents
When using Copilot Agents, an additional consideration applies: always check the privacy statement and terms of the agent itself. Third-party agents may handle data differently than Microsoft 365 Copilot.
## Important: Anthropic models and EU Data Boundary
Microsoft uses Anthropic models (Claude) for certain Copilot features. These models currently fall outside the EU Data Boundary. This is relevant for organizations required to keep all data processing within the EU.
### EU/EFTA tenants (Netherlands)
For EU tenants the following applies:
- Anthropic is disabled by default
- Anthropic falls outside the EU Data Boundary
- An admin must explicitly opt in to enable Anthropic via the Microsoft 365 Admin Center
Unless you or your IT team has deliberately enabled this, you are not using Anthropic. No action is required.
### How do you know if Copilot uses Anthropic (Claude)?
If Anthropic is enabled for your tenant, you will see this explicitly in Copilot Chat:
- Copilot Chat shows a model selector (e.g. GPT vs Claude Sonnet)
- The option is explicitly labeled "Claude" or "Anthropic"
- If you see no model selector, Anthropic is not being used for your tenant
## Summary
Microsoft 365 Copilot treats prompts and responses as full business data with the same security, compliance and privacy protections as the rest of Microsoft 365. The main exception: web searches via Bing have a separate privacy context where Microsoft acts as controller instead of processor.
## Checklist for IT administrators
- [ ] Verify sensitivity labels are correctly configured
- [ ] Review Copilot admin settings in the Microsoft 365 Admin Center
- [ ] Assess whether web searches should be enabled or disabled
- [ ] Check retention policies for Copilot interactions
- [ ] Inform users about the difference between Copilot and free AI tools
- [ ] Check if Anthropic models fit within your compliance framework