
Microsoft Copilot: Your Data Privacy Questions Answered


AI is here to stay, and not surprisingly, Microsoft is paving the way in the AI race to enhance efficiency and productivity. Microsoft 365 Copilot is Microsoft’s AI-powered tool that helps you with work tasks by drawing on both internet sources and your organization’s Microsoft 365 content (but only content that the searching user has permission to access). Microsoft 365 Copilot is an add-on to an existing Microsoft business license. To learn more about how Microsoft 365 Copilot works, check out this short video.

A question I often get is, “Does Microsoft 365 Copilot operate the same way as other AI tools by using your data to train its public large language models (LLMs)?” The answer is no, and it’s important to understand how Microsoft does in fact use (and protect) your data.

  • Copilot Accesses Your Organization’s Microsoft 365 Content. When you use Copilot, it only has access to data within your own Microsoft tenant; it cannot access other companies’ data (and other companies cannot access yours). Copilot inherits existing Microsoft 365 permissions, meaning it can only tap into the files and emails that a particular user already has access to. Copilot does not automatically respect or inherit sensitivity labels on documents, however, so content labeled “sensitive” could still be part of a Copilot search and its output.
  • Data Stored. When a user interacts with Microsoft 365 Copilot (in apps such as Word, PowerPoint, Excel, OneNote, Loop, or Whiteboard), Microsoft stores data about those interactions. The stored data includes the user’s prompt and Copilot’s response, along with citations to any information used to ground the response. Microsoft refers to the prompt and its response together as the “content of interactions,” and the record of those interactions is the user’s Copilot activity history. This data is encrypted while stored, is not sold to third parties, and is not used to train any AI models, including the foundation models used by Microsoft 365 Copilot.
  • Existing “Overly Broad” Permissions Are a Problem. If permissions are not carefully restricted, Copilot can provide inadvertent access to confidential files. For example, if five people have access to a Financials folder, all five of them can have data from that folder returned in a Copilot search. This makes it important to evaluate employee permissions and ensure that file and folder access is appropriately restricted. This issue, called “over-permissioning,” is considered one of the largest data security risks for Copilot users, since Copilot could inadvertently expose confidential financial records, HR documents, and other confidential records when users generate summaries or reports containing that sensitive information.
  • Microsoft Data Security Protections. Microsoft 365 Copilot honors the existing privacy, security, and compliance commitments Microsoft makes to Microsoft 365 commercial customers, including the General Data Protection Regulation (GDPR), and it operates in accordance with Microsoft’s general data privacy policy.
  • Your Data Is Not Retained by Large Language Models (LLMs). Copilot does not train on your organization’s data; it only uses your data to ground the responses it generates for you.
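The permission model described above can be illustrated with a short sketch. This is a hypothetical example, not Microsoft’s actual implementation: the file records, the `readers` field, and the function name are invented purely to show the principle that a Copilot-style search only surfaces documents the requesting user can already read.

```python
# Hypothetical sketch: permission-aware search, in the spirit of how
# Copilot inherits existing Microsoft 365 permissions. All names and
# data structures here are invented for illustration only.

def permission_filtered_search(files, user, query):
    """Return only files that match the query AND the user can read."""
    return [
        f["name"]
        for f in files
        if user in f["readers"] and query.lower() in f["name"].lower()
    ]

tenant_files = [
    {"name": "Q3 Financials.xlsx", "readers": {"alice", "bob"}},
    {"name": "HR Salary Bands.docx", "readers": {"alice"}},
    {"name": "Financials FAQ.docx", "readers": {"alice", "bob", "carol"}},
]

# Bob only sees matching files he already has access to:
print(permission_filtered_search(tenant_files, "bob", "financials"))
# -> ['Q3 Financials.xlsx', 'Financials FAQ.docx']

# Carol cannot see Q3 Financials.xlsx even though it matches her query:
print(permission_filtered_search(tenant_files, "carol", "financials"))
# -> ['Financials FAQ.docx']
```

Note that the filter also shows why over-permissioning matters: adding a user to `readers` anywhere silently widens what a Copilot-style search can return to them.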

If you are evaluating the use of Copilot in your organization, I recommend doing a permissions audit to ensure your file permissions reflect need-to-know, least-privilege access. This includes reviewing and appropriately restricting permissions and sharing policies based on user identity and Role-Based Access Control. I also recommend checking your default tenant-level sharing settings and limiting sharing to verified domains or internal use only. It’s important to audit who has access to shared documents, especially those with “anyone with the link” permissions, and to require MFA and Conditional Access for high-value data. All administrators should understand where sensitive data lives within your Microsoft tenant and identify how data flows through your organization.
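One piece of that audit, the “anyone with the link” check, can be sketched as follows. This is a hypothetical example operating on invented records; in practice you would export sharing data from your tenant (for example, via the Microsoft Graph API or the SharePoint admin center) and feed it into a check like this.

```python
# Hypothetical audit check: flag documents shared via "anyone with
# the link". The records below are invented for illustration; a real
# audit would pull this data from your Microsoft 365 tenant.

RISKY_SCOPES = {"anyone"}  # "anyone with the link" sharing

def flag_risky_shares(sharing_records):
    """Return names of documents whose sharing scope is overly broad."""
    return [r["doc"] for r in sharing_records if r["scope"] in RISKY_SCOPES]

records = [
    {"doc": "Board Deck.pptx", "scope": "anyone"},
    {"doc": "Team Notes.docx", "scope": "organization"},
    {"doc": "Vendor Contract.pdf", "scope": "specific_people"},
]

for doc in flag_risky_shares(records):
    print(f"REVIEW: {doc} is shared with anyone who has the link")
# -> REVIEW: Board Deck.pptx is shared with anyone who has the link
```

Checks like this are only a starting point; pairing them with tenant-level sharing restrictions prevents new risky links from being created in the first place.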

Microsoft 365 Copilot is a powerful tool that, if used responsibly and appropriately, can boost efficiency and productivity throughout your organization.

 

-Amie T. Geary, Vice President of Technology, NBM