With access to sensitive data stored across customers’ Microsoft ecosystems, what security risks does Copilot pose? We sat down with our VP of Engineering, Artem Tabalin, to find out.
Microsoft 365 Copilot integrates AI assistance directly into Microsoft’s productivity suite, accessing organisational data across platforms. According to recent research, 67% of enterprise security teams report concerns about AI tools potentially exposing sensitive information (Gartner, 2024). In the UK market, 58% of financial services firms have implemented additional security controls when deploying Copilot (Financial Conduct Authority Tech Survey, 2024), while US healthcare organisations have seen a 43% increase in data classification initiatives before Copilot deployment (HIMSS Analytics, 2023).
In November 2023, Microsoft launched its new AI tool for enterprise users: Microsoft 365 Copilot. With its announcement, Microsoft said, ‘It combines the power of large language models (LLMs) with your data in the Microsoft Graph and the Microsoft 365 apps to turn your words into the most powerful productivity tool on the planet.’
Embedded in the ‘Microsoft 365 apps you use every day — Word, Excel, PowerPoint, Outlook, Teams and more,’ the tool can produce first drafts of blog posts (it didn’t write this one, we promise!), create beautiful presentations, and analyse trends to create data visualisations.
Microsoft Copilot is an AI-powered tool that assists users with everyday tasks like creating documents, summarising content, analysing data or writing code. It’s similar to ChatGPT, but the key difference is its tight integration with Microsoft 365 applications and the wider Microsoft ecosystem, which means Copilot can leverage users’ data, such as documents, emails and calendars, to provide more personalised answers.
Microsoft Copilot is powered by a combination of advanced AI technologies, including large language models (LLMs) and the organisational data surfaced through the Microsoft Graph.
The tool can be used outside of the Microsoft ecosystem, but its key differentiator is how seamlessly Copilot integrates with other Microsoft services. This enables a wide range of capabilities, from real-time recommendations while working on documents, emails, or presentations in Office 365 to converting a Word document into a PowerPoint presentation with key points and visuals.
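Under the hood, that organisational data is exposed through the Microsoft Graph API. As a simplified illustration (the endpoint below is a real Graph endpoint, but token acquisition via Microsoft Entra ID is elided, and this is not a description of Copilot’s internals), a request for a user’s recently used files looks like this:

```python
import urllib.request

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def recent_files_request(access_token: str) -> urllib.request.Request:
    """Build a Microsoft Graph request for the signed-in user's recent files.

    Copilot-style experiences draw on the same Graph data (files, mail,
    calendar events), always scoped by the permissions the token carries.
    """
    return urllib.request.Request(
        f"{GRAPH_BASE}/me/drive/recent",
        headers={"Authorization": f"Bearer {access_token}"},
    )

# Sending the request requires a real token issued by Microsoft Entra ID.
req = recent_files_request("<access token>")
print(req.full_url)  # https://graph.microsoft.com/v1.0/me/drive/recent
```

The important point is the bearer token: whatever the token’s owner can read is exactly what any Graph-backed assistant can read, no more and no less.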
Microsoft Copilot adheres to the existing privacy and security commitments for Microsoft 365 customers. Users’ actual data is not used to train the machine learning models, which means organisational data doesn’t influence the underlying models. Data is encrypted both in transit and at rest, which significantly reduces the risk of unauthorised access.
Copilot complies with the GDPR (General Data Protection Regulation) and the California Consumer Privacy Act (CCPA), guaranteeing that users’ data is processed, stored, and protected in accordance with recognised legal standards. It also complies with the European Union (EU) Data Boundary, ensuring that EU customers’ data doesn’t leave EU boundaries.
First, Copilot follows the existing data permissions and policies set up for an organisation, which means users will only see responses based on data they personally have access to, and data won’t leak between users and groups. Second, Microsoft uses data anonymisation techniques to remove Personally Identifiable Information (PII) from the data used for training its AI models, also ensuring that only the minimum necessary data is processed. Finally, users are provided with robust privacy controls that allow them to manage data (including the ability to delete or export it for personal use), adjust privacy settings, and opt out of certain data processing activities.
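The permission model described above can be pictured as a filter applied before any document ever reaches the model. A minimal sketch, where the data structures are hypothetical stand-ins rather than Microsoft’s implementation:

```python
from dataclasses import dataclass

@dataclass
class Document:
    name: str
    content: str
    allowed_users: set  # simplified stand-in for real ACLs and group membership

def retrievable_for(user: str, documents: list) -> list:
    """Return only the documents the user is already permitted to read.

    A permission-respecting assistant builds its answer context from this
    filtered list, never from the full corpus.
    """
    return [d for d in documents if user in d.allowed_users]

docs = [
    Document("salaries.xlsx", "…", {"hr_manager"}),
    Document("handbook.docx", "…", {"hr_manager", "new_hire"}),
]
print([d.name for d in retrievable_for("new_hire", docs)])  # ['handbook.docx']
```

The corollary, discussed below, is that the filter is only as good as the permissions feeding it: an over-broad ACL silently widens what the assistant can repeat back.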
Indeed, there are implications to be aware of. Imagine a user has access to some sensitive information in the organisation, say a spreadsheet with everyone’s salary information. Even though Copilot follows the existing data permissions and policies, because the user has access to the spreadsheet, Copilot has access to it too. This can lead to sensitive information being exposed, as AI models might include it in their outputs.
Integrating Copilot requires careful access control management to make sure that only authorised users can leverage its capabilities, especially when dealing with sensitive data.
As for compliance, it often requires detailed auditing and reporting capabilities, which can be challenging when AI models process data in opaque ways. That’s why it’s critical to verify that Copilot’s operations meet the relevant regulatory standards.
The first vulnerability is potential data leakage caused by incorrect access controls: when a user has access to sensitive information, Copilot inherits that access, which can lead to unexpected exposure.
Another attack vector, shared by all AI-powered solutions, is the model inversion attack. This is when the model itself is targeted with inputs designed to manipulate its behaviour or extract information from it.
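A common mitigation for extraction-style attacks is to scan model output for sensitive patterns before returning it to the user. The sketch below is generic, not a description of Copilot’s internal safeguards, and the patterns are deliberately toy examples; real deployments use much richer classifiers:

```python
import re

# Hypothetical detectors for a toy output guard.
SENSITIVE_PATTERNS = {
    "uk_ni_number": re.compile(r"\b[A-CEGHJ-PR-TW-Z]{2}\d{6}[A-D]\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def redact(model_output: str) -> str:
    """Replace anything matching a sensitive pattern with a labelled placeholder."""
    for name, pattern in SENSITIVE_PATTERNS.items():
        model_output = pattern.sub(f"[REDACTED {name}]", model_output)
    return model_output

print(redact("Card on file: 4111 1111 1111 1111."))
```

An output guard like this is a last line of defence; it complements, rather than replaces, keeping sensitive data out of the model’s context in the first place.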
Copilot integrates with Microsoft 365 services, which means vulnerabilities in those services and their integrations could also potentially be exploited.
Microsoft offers Data Processing Agreements that outline how data is processed on behalf of customers, ensuring compliance with GDPR requirements. Customers can also choose data residency options, ensuring that data is stored in specific geographic locations to comply with regulations.
For healthcare organisations concerned with HIPAA compliance, Microsoft provides an option to enter into a Business Associate Agreement, which specifies how Protected Health Information (PHI) is handled in compliance with HIPAA.
As mentioned, organisations can configure access controls and permissions, which Copilot respects, to ensure that only authorised personnel can access sensitive data. Data is also encrypted both in transit and at rest to minimise the risks for data processed and generated by Copilot.
As part of a broader security and privacy framework, Microsoft has protocols designed to facilitate incident response and mitigate potential damage, which apply across all products and services, including Copilot. Monitoring and advanced threat detection technologies are also in place across all of its cloud services.
Every incident is investigated to understand its scope and impact, identifying how the breach occurred and which data or systems were affected. All affected customers are then notified and given details about the nature of the breach and the measures taken in response.
When it comes to minimising risks and protecting your data, the best approach to securing sensitive information is to use a Data Loss Prevention (DLP) solution like Metomic, which allows organisations to discover sensitive data across Microsoft 365 services and set up automatic rules that govern how information is shared, minimising the risk of a data breach.
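As a rough illustration of what such an automated rule involves (a generic sketch with made-up patterns, not Metomic’s actual engine), a DLP rule pairs a detector with an action:

```python
import re
from dataclasses import dataclass

@dataclass
class DlpRule:
    """A toy DLP rule: a pattern to detect and an action to apply."""
    name: str
    pattern: re.Pattern
    action: str  # e.g. "notify", "redact", "revoke_share"

RULES = [
    DlpRule("api_key", re.compile(r"\bsk-[A-Za-z0-9]{20,}\b"), "redact"),
    DlpRule("iban", re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"), "notify"),
]

def scan(text: str) -> list:
    """Return the (rule name, action) pairs for every rule that fires on the text."""
    return [(r.name, r.action) for r in RULES if r.pattern.search(text)]

print(scan("deploy key: sk-abcdefghijklmnopqrstuvwx"))  # [('api_key', 'redact')]
```

Run continuously across a document store, a scanner like this gives you an inventory of where sensitive data lives before an AI assistant ever touches it.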
While Microsoft Copilot can be a powerful productivity tool, the security risks are apparent, particularly when it comes to sensitive data.
Using a modern DLP tool can help you identify where sensitive data is stored across your Microsoft ecosystem and minimise it with automated redaction rules.
Metomic can help organisations use SaaS, AI and Cloud tools while keeping their team secure. To find out more about how Metomic can support your data security policy, book a personalised demo or get in touch with one of our team.
Microsoft 365 Copilot adapts its data handling based on regional requirements. For European Union users, data is processed within EU boundaries due to strict GDPR requirements and data sovereignty laws. In the United Kingdom, post-Brexit data protection frameworks apply additional oversight. North American implementations prioritise sector-specific compliance frameworks such as HIPAA for healthcare and GLBA for financial institutions. These regional variations reflect Microsoft’s commitment to meeting local regulatory requirements while maintaining consistent security standards.
Financial services organisations should conduct thorough data classification and risk assessments before implementing Copilot. In European markets, firms must maintain detailed data processing registers compliant with both GDPR and industry-specific regulations. Organisations operating in London’s financial district often implement additional monitoring for market-sensitive information. North American financial institutions must ensure Copilot deployments align with SEC requirements and state-specific banking regulations, particularly regarding customer financial data protection and trading information.
Healthcare providers face unique challenges with Copilot implementation due to strict patient data regulations. Organisations must configure specialised data boundaries to prevent patient information from being processed inappropriately. NHS-affiliated institutions in the UK require additional safeguards aligned with NHS Digital security standards. US healthcare providers must ensure Copilot configurations comply with both HIPAA and state-level healthcare privacy laws. All healthcare implementations should include comprehensive audit trails for any AI interactions with clinical or patient data.
Multinational companies should adopt a region-specific deployment approach for Copilot. This includes establishing a global governance framework with local variations to meet regional requirements, conducting separate risk assessments for each operational region, implementing data residency controls to ensure information remains within appropriate jurisdictions, creating region-specific user training programmes addressing local data protection laws, and deploying monitoring tools that account for different compliance requirements across international operations.
Copilot integration varies based on existing security ecosystems. Organisations with Microsoft Defender deployments benefit from streamlined threat detection and response capabilities. European enterprises typically implement additional layers of data loss prevention specific to GDPR requirements. North American implementations often focus on sector-specific compliance integration. Copilot security operates within existing conditional access policies, leveraging modern authentication protocols and adapting to regional requirements for multi-factor authentication, privileged access management, and security monitoring.