
Data Protection & Copilot – How to use AI safely in your company

The most important points at a glance
- Copilot only sees what users already have access to. But old, overly broad permissions suddenly matter.
- Prompts and answers in Microsoft 365 Copilot are not used to train foundation models.
- In many use cases, a Data Protection Impact Assessment (DPIA) is required, especially in HR, customer service, or sensitive sectors.
- Technical homework like permission cleanup, sensitivity labels, and DLP is the key to Data Protection for Copilot in practice.
- AI governance becomes even more important with the EU AI Act and NIS2. Many rules apply step by step from 2025 to 2027.
Introduction
Microsoft Copilot is already on the table in many companies. Makes sense, because if you can create texts, emails, presentations, or analyses faster, you get a real productivity edge. At the same time, Copilot is not a normal add-on like a new chat tool. It plugs directly into your core systems, especially Microsoft 365, and therefore into emails, Teams chats, SharePoint, OneDrive, calendars, and more.
That is where the tension comes from. Copilot is only as good as the data access you give it. And that access almost always touches personal data, trade secrets, or both. In other words, when you introduce Copilot, you automatically introduce a new data processing scenario.
In this article, you get a practical overview of how to set up Data Protection when using Copilot properly. From the biggest risks to concrete steps you can actually implement in your company.
What Copilot is and how it processes data
When we talk about Copilot, we usually mean Microsoft 365 Copilot. It works with prompts, meaning your inputs, and uses Microsoft Graph to access your M365 content. It searches in documents, emails, or meetings for relevant information and builds answers or drafts from it.
Important data protection requirements for Copilot:
- Copilot processes data within the user permission context. If a user does not have access to a document, Copilot cannot use it either.
- Prompts, answers, and Graph data are not used to train the underlying models.
- Data storage and data residency follow M365 rules, including the EU Data Boundary if your tenant is set up for it.
That sounds reassuring. Still, new risks pop up because Copilot brutally exposes old data and permission mistakes.
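You can see what that permission context means in practice before you ever switch Copilot on. The sketch below queries the Microsoft Graph search endpoint with a delegated token for a test user, so the results are limited to content that user can already open, which is the same trust boundary Copilot works within. The token handling and the query string are placeholders you would adapt to your own tenant.

```python
# Minimal sketch: query the Microsoft Graph search endpoint with a *delegated* token.
# Assumption: ACCESS_TOKEN is a delegated token for a test user (e.g. obtained via MSAL).
# The results mirror Copilot's trust boundary: only content that user can already open.
import requests

ACCESS_TOKEN = "<delegated-token-for-a-test-user>"  # placeholder, not a real token

resp = requests.post(
    "https://graph.microsoft.com/v1.0/search/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"requests": [{
        "entityTypes": ["driveItem"],
        "query": {"queryString": "salary band senior engineer"},
    }]},
    timeout=30,
)
resp.raise_for_status()

for container in resp.json()["value"][0]["hitsContainers"]:
    for hit in container.get("hits", []):
        item = hit["resource"]
        # If an old HR spreadsheet shows up here, the permission problem exists
        # with or without Copilot - Copilot just makes it easy to find.
        print(item.get("name"), "-", item.get("webUrl"))
```

If a test user in marketing can already find HR spreadsheets this way, fix the permissions first; rolling out Copilot on top of that only amplifies the problem.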
Key Data Protection risks with Copilot
1. Permission sprawl suddenly becomes a real problem
Many companies have lived for years with historically grown SharePoint areas or Teams. Everything sits there. And often, “just to be safe,” too many people have access. Before Copilot, that rarely hurt, because nobody actively searched thousands of folders. With Copilot, that becomes possible at the push of a button.
Real-world example:
A colleague asks Copilot: “Give me all info on salary bands for senior engineers.” If somewhere an old Excel with personal HR data is shared too widely, Copilot can pull it in as a source. The root cause is not Copilot, but your permission setup.
2. Shadow content and unclassified data
Copilot does not automatically distinguish between “okay to share” and “should not be shared.” If you have no labels or DLP rules, it treats everything the same.
3. Black box feeling and lack of traceability
Data protection also means accountability. You need to be able to explain:
- which data Copilot uses
- for what purpose
- how long logs are stored
- how data subject rights are handled
Many companies struggle with this because AI systems are complex. Microsoft provides documentation and DPIA templates, but you must map them to your own use case.
4. Special categories of personal data
If Copilot is used in HR, health, or finance, data under Article 9 GDPR often comes into play. That raises the risk and makes a DPIA almost unavoidable.
5. Third countries and split responsibilities
Copilot is part of Microsoft 365, where Microsoft usually acts as your processor. As soon as you add web functions or plugins, additional data flows can appear, and you must review those separately.
Regulatory frameworks that apply to Copilot
GDPR
For Copilot, these points matter most:
- legal basis and clear purposes
- data minimisation and access control
- technical and organisational measures (TOMs)
- a reviewed data processing agreement (DPA) with Microsoft
- DPIA under Article 35 GDPR if high risks are likely
nFADP (Switzerland)
The nFADP is similar in substance to the GDPR but puts a strong focus on transparency and data security. If you operate in both the EU and Switzerland, use the stricter standard as your baseline.
ISO 27001
This is about risk management, access, logging, and supplier control. Copilot should be part of your ISMS scope as a new cloud use case.
NIS2
NIS2 requires resilience and supply chain security. AI services like Copilot become part of your critical SaaS landscape. Risk assessment, monitoring, and incident processes have to include Copilot.
EU AI Act
The EU AI Act entered into force in 2024, and its obligations become applicable step by step. For you as a deployer, the key point is that use cases in HR, finance, critical infrastructure, or education may count as high risk. Then you need things like logging, risk management, human oversight, and clean documentation.
Learn more about the new EU AI Act!
Data Protection for Copilot in practice
Step 1. Define and prioritise use cases
Do not start with “Copilot for everyone.” Collect use cases and sort them by risk and value.
Low risk examples:
- marketing texts without real customer data
- summaries of internal, non-sensitive meetings
- standard reports with anonymised info
High risk examples:
- HR, recruiting, performance reviews
- customer support with tickets and contract data
- research, M&A, legal department
The higher the risk, the more you need a DPIA, policies, and technical controls.
Step 2. Run a DPIA before going live
A DPIA is not just paperwork. It forces you to describe Copilot properly:
- purpose and legal basis
- data categories
- affected groups
- risks
- safeguards
Use templates as a starting point, but always adapt them to your processes.
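It also helps to capture each use case as a structured record rather than free text, so the DPIA stays comparable across use cases. A minimal sketch of such a record; the field names and values are our own suggestion, not an official template, and should be adapted to your existing DPIA process.

```python
# Minimal sketch of a structured DPIA record for one Copilot use case.
# Field names are a suggestion, not an official template - adapt to your process.
copilot_dpia_record = {
    "use_case": "Copilot drafts first replies in customer support",
    "purpose": "Faster response times on standard tickets",
    "legal_basis": "Art. 6(1)(f) GDPR - legitimate interest (to be assessed)",
    "data_categories": ["contact data", "contract data", "ticket history"],
    "data_subjects": ["customers", "support agents"],
    "risks": [
        "over-broad access to old tickets",
        "special category data hidden in free-text fields",
    ],
    "safeguards": [
        "permission cleanup before rollout",
        "sensitivity labels and DLP on ticket exports",
        "human review of every outgoing reply",
    ],
}
```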
Step 3. Clean up permissions
This is the single most important lever for data protection when using Copilot.
Checklist:
- audit SharePoint and Teams structures
- shrink overly broad groups
- review external sharing
- exclude sensitive sites from Copilot access if needed
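A practical way to start the audit is to look for content shared via organization-wide or anonymous links, because that is exactly what Copilot will surface first. Here is a minimal sketch against the Microsoft Graph API, assuming a token with Sites.Read.All (or equivalent) and a known site ID; a real audit would also page through results and walk subfolders.

```python
# Minimal sketch: flag files in one document library that are shared via
# organization-wide or anonymous links. Assumptions: ACCESS_TOKEN has
# Sites.Read.All (or equivalent) and SITE_ID points to the site you audit.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<token>"   # placeholder
SITE_ID = "<site-id>"      # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

items = requests.get(
    f"{GRAPH}/sites/{SITE_ID}/drive/root/children", headers=HEADERS, timeout=30
).json().get("value", [])

for item in items:
    perms = requests.get(
        f"{GRAPH}/sites/{SITE_ID}/drive/items/{item['id']}/permissions",
        headers=HEADERS, timeout=30,
    ).json().get("value", [])
    for perm in perms:
        scope = perm.get("link", {}).get("scope")
        if scope in ("organization", "anonymous"):
            # Broadly shared content is exactly what Copilot will surface first.
            print(f"{item['name']}: shared via '{scope}' link")
```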
Step 4. Actively use sensitivity labels and DLP rules
With Microsoft Purview you can label data and set rules, for example:
- “Confidential HR” must not appear in Copilot outputs
- “Customer contract” must not be exported via copy paste
- special categories get blocked automatically
Without labels and DLP, Copilot cannot tell harmless from sensitive.
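Purview evaluates rules like these inside Microsoft 365 for you. Purely as an illustration of the mechanism, not of the Purview API, here is a conceptual sketch of what such a rule does: match a label or a sensitive pattern, then block the action.

```python
# Conceptual sketch only - Purview DLP enforces rules like these inside M365.
# It illustrates the mechanism: match a label or pattern, then block the action.
import re

BLOCKED_LABELS = {"Confidential HR", "Customer contract"}
IBAN_PATTERN = re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b")

def may_leave_tenant(text: str, label: str | None) -> bool:
    """Return False if the content must not be copied into external channels."""
    if label in BLOCKED_LABELS:
        return False
    if IBAN_PATTERN.search(text):
        return False
    return True

print(may_leave_tenant("Offer draft for ACME", None))                # True
print(may_leave_tenant("Salary bands attached", "Confidential HR"))  # False
```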
Step 5. Logging, monitoring, and review routines
Copilot creates logs. You need them for:
- traceability
- security investigations
- EU AI Act duties
- internal audits
At the same time, logs are personal data themselves. So limit them, define access, and set retention periods.
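Retention is easy to state and easy to forget. Below is a minimal sketch of a retention sweep over an exported audit file, assuming records were exported (for example from Purview Audit) as JSON and each record carries a CreationDate timestamp; the file name, field name, and format are assumptions you would adapt to your real export.

```python
# Minimal sketch: drop exported audit records that are past their retention period.
# Assumptions: records were exported into a JSON file and each record carries a
# "CreationDate" ISO timestamp - adapt both to your real export format.
import json
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 180  # example value - set this in your logging policy
cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)

with open("copilot_audit_export.json", encoding="utf-8") as f:
    records = json.load(f)

kept = [
    r for r in records
    if datetime.fromisoformat(r["CreationDate"]).replace(tzinfo=timezone.utc) >= cutoff
]

with open("copilot_audit_export.json", "w", encoding="utf-8") as f:
    json.dump(kept, f, indent=2)

print(f"Kept {len(kept)} of {len(records)} records (retention: {RETENTION_DAYS} days)")
```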
Step 6. Policies and training
Write a Copilot policy that is short and clear, with real do’s and don’ts.
Don’ts examples:
- do not enter health data or salary lists
- do not paste confidential trade secrets into external emails
- do not adopt outputs without review; Copilot drafts, it does not decide
Then train people. Many risks arise not from tech, but from misuse.
Step 7. Pilot first, then scale
Start with a small pilot group across departments plus data protection and IT. Document learnings and expand step by step.
Which measure covers which duty
| Duty / Risk | Measure | Relevant for |
|---|---|---|
| Excess access to old data | permission cleanup, need-to-know principle | GDPR, ISO 27001 |
| Uncontrolled data leakage | labels, DLP, Purview policies | GDPR, nFADP |
| Lack of accountability | DPIA, record of processing activities | GDPR, nFADP |
| High risk AI use cases | use case classification, logging, human oversight | EU AI Act |
| Supply chain risk | vendor review, DPA, cloud risk assessment | NIS2, ISO 27001 |
FAQ about Data Protection for Copilot
What is the biggest data protection risk lever with Copilot?
Not the AI itself, but overly broad permissions and unclassified old data. Copilot makes these weaknesses visible immediately.
Which roles should own Copilot governance?
At least IT, data protection, compliance, HR, and the relevant business teams. An AI steering group with clear ownership prevents chaos.
How long may Copilot logs be stored?
As short as possible, as long as necessary. Define purpose, access, and retention like with any other log data.
Does the EU AI Act apply to my Copilot use?
If you use Copilot professionally, you are a deployer under the AI Act. High risk kicks in for HR, finance, or critical areas. Then extra duties apply.
What are quick first wins before a pilot?
Clean permissions, define sensitivity labels, enable DLP, write a first policy, and train the pilot group.
Conclusion
Copilot is a productivity booster, no doubt. But it is also a data protection turbo, both in a good and a bad way. If your data house is clean, it helps a lot. If not, it exposes every weakness instantly.
So the path to Data Protection when using Copilot is clear: first governance and data hygiene, then pilot, then scale. With a DPIA, permission cleanup, labels, DLP, logging, and training, you are on the safe side. And you also strengthen your ISO 27001 and NIS2 maturity and prepare much better for the EU AI Act.
Important: The content of this article is for informational purposes only and does not constitute legal advice. The information provided here is no substitute for personalized legal advice from a data protection officer or an attorney. We do not guarantee that the information provided is up to date, complete, or accurate. Any actions taken on the basis of the information contained in this article are at your own risk. We recommend that you always consult a data protection officer or an attorney with any legal questions or problems.



