
September 20, 2025
New privacy risks in Microsoft 365 Copilot
Microsoft 365 Copilot promises a lot: AI assistance in Office apps, better productivity, smart suggestions. But recent SURF research shows that the privacy risks are still significant. The Data Protection Impact Assessment (DPIA) has since been updated¹: some of the original high risks have been addressed, while two others remain, now lowered to the medium/orange level.
For Dutch education and research institutions, this is a wake-up call: how do you leverage the benefits of AI software like Copilot without compromising privacy, security and governance? ALTA-ICT helps you find that balance, with concrete steps, insight into regulations and tailored advice.
What does SURF conclude?
- SURF advises caution when using Microsoft 365 Copilot, especially in sectors such as education and research².
- The updated DPIA shows that of the original four high risks, two remain and are now rated "medium/orange":
  - inaccurate (personal) data;
  - the retention period of diagnostic personal data about the use of the service.
- SURF recommends making clear agreements within organizations about the use of AI, adopting an AI user policy, and weighing the privacy risks for each type of use.
- Microsoft has made improvements, but SURF warns that not all risks have been convincingly addressed.
Why is this important for Dutch organizations?
- AVG/compliance: AI applications such as Copilot often collect and process personal data. Inaccuracies or unclear retention periods can lead to violations of the AVG (the Dutch name for the GDPR).
- Reputation and trust: educational institutions are expected to safeguard the privacy of students and employees. A data breach or misuse of AI can quickly damage public trust.
- Regulations in flux: the EU AI Act is being phased in and national legislation and codes of conduct are tightening. Risks that are rated orange today could lead to heavy fines under stricter future rules.
- Dependency and vendor risk: organizations depend on Microsoft as a vendor. Without clear agreements and contractual safeguards, control over AI use is less certain.
How can your organization deploy Copilot more safely?
At ALTA-ICT, we recommend the following practical steps for secure and compliant use of Microsoft 365 Copilot in the Netherlands:
1. DPIA / Impact Analysis
Conduct a Data Protection Impact Assessment specific to Copilot. Map out where personal data is processed, what the retention periods are, and where inaccuracies can arise.
2. Policy & Governance
Establish an internal AI policy: who can use Copilot, for which purposes, which data may and may not be shared, and who is responsible. Define a clear role structure for oversight and compliance.
3. Technical and Organizational Measures
Implement security measures such as encryption, access restrictions, logging & monitoring, and management of diagnostic data (see the sketch after this list).
4. Contractual Arrangements
Define clearly in your agreements with Microsoft which data is stored, for how long, who has access, and what audit rights you have.
5. Training & Awareness
Inform users and administrators about the risks. Train them on what to share with Copilot and what not to, and how to act when data turns out to be incorrect.
6. Periodic Review & Evaluation
Conduct new DPIAs and external audits regularly. Track Microsoft's improvements and adjust policies as needed.
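To make the logging & monitoring of step 3 and the periodic review of step 6 concrete, the sketch below pulls recent audit events from the Office 365 Management Activity API and filters them for Copilot interactions. It is a minimal illustration, not a finished implementation: it assumes an Entra ID app registration with the ActivityFeed.Read application permission (admin-consented), an already-started audit subscription for the Audit.General content type, and that Copilot events are recorded under the operation name "CopilotInteraction". Verify these details for your own tenant before relying on the output.

```python
"""
Minimal sketch: list recent Office 365 audit content via the Management
Activity API and keep only Copilot interaction records.

Assumptions to verify for your own tenant:
  - An Entra ID app registration with the ActivityFeed.Read application
    permission (admin-consented) and a client secret.
  - A subscription for the "Audit.General" content type has already been
    started (POST .../subscriptions/start?contentType=Audit.General).
  - Copilot events carry the operation name "CopilotInteraction"; the exact
    name may differ per tenant or Microsoft release.
"""
import datetime
import requests

TENANT_ID = "<your-tenant-id>"        # placeholder
CLIENT_ID = "<your-app-client-id>"    # placeholder
CLIENT_SECRET = "<your-app-secret>"   # placeholder


def get_token() -> str:
    """Client-credentials flow against Entra ID for the Management Activity API."""
    resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "scope": "https://manage.office.com/.default",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def copilot_events(hours_back: int = 24) -> list[dict]:
    """Return basic details of Copilot interaction events from the audit feed."""
    headers = {"Authorization": f"Bearer {get_token()}"}
    end = datetime.datetime.utcnow()
    start = end - datetime.timedelta(hours=hours_back)

    # 1. List the available content blobs for the Audit.General feed.
    #    (Pagination via the NextPageUri header is omitted in this sketch.)
    listing = requests.get(
        f"https://manage.office.com/api/v1.0/{TENANT_ID}"
        "/activity/feed/subscriptions/content",
        params={
            "contentType": "Audit.General",
            "startTime": start.strftime("%Y-%m-%dT%H:%M:%S"),
            "endTime": end.strftime("%Y-%m-%dT%H:%M:%S"),
        },
        headers=headers,
        timeout=30,
    )
    listing.raise_for_status()

    # 2. Download each blob and keep only Copilot interaction records.
    events = []
    for blob in listing.json():
        records = requests.get(blob["contentUri"], headers=headers, timeout=30)
        records.raise_for_status()
        for rec in records.json():
            if rec.get("Operation") == "CopilotInteraction":  # assumption, see above
                events.append({
                    "time": rec.get("CreationTime"),
                    "user": rec.get("UserId"),
                    "workload": rec.get("Workload"),
                })
    return events


if __name__ == "__main__":
    for event in copilot_events():
        print(event)
```

Such an export can feed the periodic reviews of step 6, for example as evidence in a renewed DPIA or as input for discussions with Microsoft about the retention of diagnostic data.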
Challenges & preconditions
- Some risks are technically difficult to eliminate completely, for example inaccurate data or diagnostic logging.
- Cost and effort: for many institutions this means extra work, governance effort and possibly legal support.
- Vendor dependence: Microsoft must be transparent about improvements; organizations must remain critical and keep their contracts sharp.
- Data localization and international legislation can add further complexity (where is the data stored, and which jurisdiction does it fall under?).
The ALTA-ICT approach
ALTA-ICT offers a turnkey programme for educational and research institutions to use Copilot (and similar AI tools) responsibly:
- ISO 27001 / NEN 7510 certified approach to privacy & security
- Legal advice on the AVG, the AI Act and contracts with vendors
- Templates & toolkits for DPIAs, AI user policies and audits
- Monitoring & audit support, with regular reviews
- Workshops & training for employees, IT and management
Conclusion
SURF is right to keep the privacy risks of Microsoft 365 Copilot rated orange. Copilot offers opportunities, but not without clear preconditions, policies and technical safeguards. Organizations that act proactively can benefit from AI while staying compliant and retaining user trust.
Want to know if your organization is ready for Copilot, without unnecessary risk? Schedule a free consultation with ALTA-ICT. Together, we’ll ensure a safe, responsible AI future.
References
¹20250911-surf-public-update-dpia-on-m365-copilot-for-education-website.pdf
²https://www.surf.nl/nieuws/privacyrisicos-microsoft-365-copilot-naar-oranje
