Preparing the security and compliance foundations needed to safely leverage Copilot
Copilot is one of the next big things from Microsoft and will interact with almost every solution Microsoft currently publishes. The current suite of Copilot products includes:
- Windows Copilot – built into the Windows 11 operating system and focused on increasing productivity and creativity within the Windows ecosystem. (Copilot in Windows & Other AI-Powered Features | Microsoft)
- Microsoft 365 Copilot – AI capabilities centered on helping users perform business tasks, collaborate with coworkers, and surface business information within an organization's tenant faster and more efficiently. (Microsoft 365 Copilot – Microsoft Adoption)
- GitHub Copilot – a version of Copilot focused on helping programmers be more productive and write better code. This version of Copilot has the potential to save developers significant coding time. (GitHub Copilot · Your AI pair programmer · GitHub)
- Security Copilot – a solution that works across all of the available telemetry from Microsoft and an organization's tenant, looking for connections between alerts, logs, and other signals to minimize the likelihood of missing indicators of concerning behavior. It also consolidates the multiple incoming feeds of alerts and logs so that they are easier to understand and act on. (Microsoft Security Copilot | Microsoft Security)
For this particular blog, we will focus on Microsoft 365 Copilot (M365 Copilot).
The potential for M365 Copilot to save users time, get them the data they are looking for faster, and even complete tasks or communications automatically or through simple conversational interaction, is huge. There is also an entire world of custom API connections that can extend the capabilities of M365 Copilot to cover Line of Business (LOB) applications or integrate with other third-party solutions. The issue is that for M365 Copilot to do all of this, it needs the access, visibility, and permissions to perform those actions. At times, M365 Copilot may act beyond the access and permissions that the user asking it to perform the work was intended, or believed, to have.
So, what do we do to resolve this problem and concern? (For some organizations, it is potentially a HUGE one.) The unintended consequences are that a user may suddenly view and interact with data they were never meant to access, or have Copilot generate a document that inadvertently publishes confidential information into a file destined for a public website. While there is no check box or silver bullet to fix this, the answer is actually quite simple in theory: Data Governance (DG) and Identity and Access Management (IAM).
Neither of these answers should come as a shock to IT professionals, as they are cornerstone components of any secure system and process. In fact, many security controls are built around these two disciplines. So when it comes to M365 Copilot, what should we be looking at precisely? The answer is not as simple as we might hope, and it can be highly complicated depending on environmental factors such as:
- Where my data is sitting
- How people access that data
- If I am dealing with internal-only users or if external users are involved as well
- What my existing identity governance and IAM processes (including reviews) look like today
- How mature I already am in regard to Data Governance
- How far down the path of Zero Trust I am
For many, the answer to these questions might be, “I don’t know some of this information.” That is okay, but it does mean that any enablement of M365 Copilot should start with a very small group of users to prove use cases and begin to define or improve the answers to the questions listed above. Additionally, for DG, Purview solutions including the following should be reviewed, implemented, and fine-tuned as necessary across all Microsoft 365 solutions (Exchange, SharePoint, OneDrive, Yammer (now Viva Engage), Teams, etc.):
- Data Loss Prevention (DLP) including Endpoint DLP
- Retention Policies
- eDiscovery
- On-premises file share scans
- Labelling (Retention and Sensitivity)
- Data Map
- Data Catalog
- Insider Risk Management
- Communication Compliance
- Process for Data Subject Requests (DSR)
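A useful first step in this data-governance review is finding content shared more broadly than intended, since those are exactly the items Copilot could surface to unintended users. As a minimal sketch (the drive/item IDs, token acquisition, and site selection are assumptions; a real audit would page through sites and drives), Microsoft Graph's drive-item permissions endpoint can flag files exposed through "anyone" or organization-wide sharing links:

```python
# Sketch: flag overshared OneDrive/SharePoint items before enabling Copilot.
# Assumes you already hold a Graph access token with Files.Read.All;
# token acquisition (e.g., via MSAL) is omitted for brevity.
import json
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"

def list_item_permissions(token: str, drive_id: str, item_id: str) -> list[dict]:
    """Call GET /drives/{drive-id}/items/{item-id}/permissions."""
    req = urllib.request.Request(
        f"{GRAPH}/drives/{drive_id}/items/{item_id}/permissions",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("value", [])

def overshared(permissions: list[dict]) -> list[dict]:
    """Return permissions granted via broad sharing links:
    'anonymous' (anyone with the link) or 'organization' (everyone in tenant)."""
    return [
        p for p in permissions
        if p.get("link", {}).get("scope") in ("anonymous", "organization")
    ]

# Example shape of Graph permission objects (trimmed for illustration):
sample = [
    {"id": "1", "roles": ["read"], "link": {"scope": "anonymous", "type": "view"}},
    {"id": "2", "roles": ["write"], "grantedToV2": {"user": {"displayName": "A. User"}}},
]
print([p["id"] for p in overshared(sample)])  # → ['1']
```

Items flagged this way are good candidates for tightening sharing links or applying sensitivity labels before Copilot is switched on.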
In addition, an IAM review should be conducted across file shares, SharePoint, OneDrive, databases (and other structured data sources), devices, third-party applications, SaaS applications, etc. This also includes making sure that any relevant Entra ID (formerly Azure AD) licensing is leveraged to help reduce permissions and access, including but not limited to:
- Privileged Identity Management (PIM)
- Entitlement Management
- Access Reviews
- Azure MFA
- Conditional Access
- Cross Tenant Access/Synchronization
- External Identities
- Permissions Management
- RBAC/ABAC
- Certificate Management
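Part of that IAM review is simply knowing who has access today, and external identities are often the blind spot. As a hedged sketch (token acquisition and the downstream review workflow are assumptions), Microsoft Graph's users endpoint can inventory guest accounts so they can be fed into Access Reviews:

```python
# Sketch: inventory guest (external) identities as part of an IAM review.
# Assumes a Graph access token with User.Read.All; the query is
# GET /users?$filter=userType eq 'Guest' on Microsoft Graph v1.0.
import json
import urllib.parse
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"

def list_guests(token: str) -> list[dict]:
    """Page through all guest users in the tenant."""
    url = f"{GRAPH}/users?" + urllib.parse.urlencode(
        {"$filter": "userType eq 'Guest'", "$select": "displayName,mail,userType"}
    )
    guests: list[dict] = []
    while url:
        req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
        with urllib.request.urlopen(req) as resp:
            page = json.load(resp)
        guests.extend(page.get("value", []))
        url = page.get("@odata.nextLink")  # Graph paginates large result sets
    return guests

def split_by_type(users: list[dict]) -> tuple[list[dict], list[dict]]:
    """Separate members from guests so each group can be reviewed on its own cadence."""
    members = [u for u in users if u.get("userType") == "Member"]
    guests = [u for u in users if u.get("userType") == "Guest"]
    return members, guests

# Example shape of Graph user objects (trimmed for illustration):
sample = [
    {"displayName": "In House", "userType": "Member"},
    {"displayName": "Vendor", "userType": "Guest"},
]
members, guests = split_by_type(sample)
print(len(members), len(guests))  # → 1 1
```

The resulting guest list pairs naturally with Entra Access Reviews and Cross Tenant Access settings from the list above: any guest who no longer needs access is permission Copilot does not need to inherit.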
The bottom line is that there is work that must be done before M365 Copilot is enabled for the environment to make sure that proper due diligence is performed. This will maximize the likelihood that enabling the solution will not compromise any compliance or security controls. The good news is that Spyglass is ready to assist in any way possible, including scheduling an Envisioning Session that helps plan for the enablement of M365 Copilot while also demonstrating its huge potential benefits. Contact us today to discuss further!