Copilot is one of the next big things from Microsoft and will interact with almost every solution that Microsoft currently publishes. The current suite of Copilot products includes:
- GitHub Copilot
- Microsoft 365 Copilot
- Copilot in Windows
- Microsoft Security Copilot
- Copilot in Dynamics 365 and Power Platform
For this particular blog, we will focus on Microsoft 365 Copilot (M365 Copilot).
The potential for M365 Copilot is huge: it can save users time, surface the data they are looking for faster, and even complete tasks or draft communications automatically through simple conversational interaction. There is also an entire world of custom API connections that can extend M365 Copilot to cover Line of Business (LOB) applications or integrate with other third-party solutions. The issue is that for M365 Copilot to do all of this, it needs access, visibility, and permissions to perform the actions, and at times it may perform that work using access or permissions that go beyond what the requesting user intended, or even realized they had.
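To make the access question concrete, here is a minimal sketch (our illustration, not part of M365 Copilot itself) that uses the Microsoft Graph Search API, which always runs in the context of the signed-in user, to preview the content a given user can already reach. Because M365 Copilot grounds its responses in content the user is permitted to see, a query like this is a rough proxy for what could surface in a Copilot answer. It assumes a delegated Graph access token (for example, acquired through MSAL) with scopes such as Files.Read.All and Sites.Read.All; token handling is omitted.

```python
# Illustrative sketch only: approximate what a given user could surface through
# M365 Copilot by querying Microsoft Graph Search as that user.
# Assumptions: `requests` is installed and ACCESS_TOKEN holds a delegated Graph
# token (e.g., acquired with MSAL) that includes Files.Read.All / Sites.Read.All.
import requests

GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"
ACCESS_TOKEN = "<delegated-user-token>"  # placeholder, not a real token


def preview_user_visible_content(query_string: str, size: int = 25) -> list[str]:
    """Return names of SharePoint/OneDrive items the signed-in user can already see."""
    body = {
        "requests": [
            {
                "entityTypes": ["driveItem"],
                "query": {"queryString": query_string},
                "from": 0,
                "size": size,
            }
        ]
    }
    response = requests.post(
        GRAPH_SEARCH_URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json=body,
        timeout=30,
    )
    response.raise_for_status()
    names = []
    for container in response.json()["value"][0]["hitsContainers"]:
        for hit in container.get("hits", []):
            names.append(hit["resource"].get("name", "<unnamed item>"))
    return names


if __name__ == "__main__":
    # Search for terms the user arguably should NOT be able to find.
    for name in preview_user_visible_content("salary OR termination OR acquisition"):
        print(name)
```

If a search like this returns documents the user was never meant to see, M365 Copilot can potentially surface them as well, and that oversharing is exactly what the governance work described below is meant to catch before a broad rollout.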
The unintended consequences are real: a user may suddenly view and interact with data they were never meant to access, or Copilot may generate a document that inadvertently pulls confidential information into content destined for a public website. So, what do we do to resolve this problem and concern? (This is potentially a HUGE problem for some organizations.) While there is no check box or silver bullet, the answer is actually quite simple in theory: Data Governance (DG) and Identity and Access Management (IAM).
Neither of these answers should come as a shock to IT professionals; they are cornerstone components of any secured system and process, and many security controls are built on top of one or both of these ideas. So, when it comes to M365 Copilot, what exactly should we be looking at? The answer is not as simple as we might hope and can be highly complicated depending on environmental configuration questions such as:
- How is access to SharePoint sites, Teams, and OneDrive content granted today (direct permissions, Microsoft 365 Groups, sharing links)?
- Is external or anonymous sharing enabled, and for which sites or users?
- Are sensitivity labels, retention policies, and Data Loss Prevention (DLP) policies in place and enforced?
- Do we know where sensitive or regulated data lives across the tenant?
- When were permissions, group memberships, and guest accounts last reviewed?
For many, the answer to these questions might be, “I don’t know some of this information.” That is OK, but it does mean that any initial enablement of M365 Copilot should be limited to a very small group of users to prove out use cases and to begin defining or improving the answers to the configuration questions listed above. Additionally, for DG, Purview solutions including the following should be reviewed, implemented, and fine-tuned as necessary across all Microsoft 365 workloads (Exchange, SharePoint, OneDrive, Yammer (now Viva Engage), Teams, etc.):
- Microsoft Purview Information Protection (sensitivity labels)
- Microsoft Purview Data Loss Prevention (DLP)
- Microsoft Purview Data Lifecycle Management (retention policies and labels)
- Microsoft Purview Insider Risk Management
- Microsoft Purview Audit and eDiscovery
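To illustrate why that classification work matters, the deliberately crude sketch below does by hand, against a local folder, a tiny piece of what Purview sensitive information types and DLP policies do automatically at tenant scale: flag content that looks sensitive so it can be labeled or protected before Copilot can surface it. The folder path and patterns are placeholder examples of ours, and this is in no way a substitute for Purview.

```python
# Deliberately crude illustration of what Purview sensitive information types and
# DLP policies do automatically at tenant scale: scan content for patterns that
# look sensitive so it can be labeled or protected before Copilot can surface it.
# This local script is NOT a substitute for Purview; paths and patterns are examples.
import re
from pathlib import Path

SENSITIVE_PATTERNS = {
    "US SSN-like": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "Credit card-like": re.compile(r"\b(?:\d[ -]?){15,16}\b"),
    "Keyword": re.compile(r"\b(confidential|internal only|do not distribute)\b", re.I),
}


def scan_for_sensitive_content(root: str) -> dict[str, list[str]]:
    """Return {file_path: [pattern names matched]} for text files under `root`."""
    findings: dict[str, list[str]] = {}
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        matched = [name for name, pattern in SENSITIVE_PATTERNS.items() if pattern.search(text)]
        if matched:
            findings[str(path)] = matched
    return findings


if __name__ == "__main__":
    for file_path, matches in scan_for_sensitive_content("./exported_files").items():
        print(f"{file_path}: {', '.join(matches)}")
```

Files flagged by even a simple pass like this are good candidates to label and protect first.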
In addition, an IAM review should be conducted across file shares, SharePoint, OneDrive, databases (and other structured data sources), devices, third-party applications, SaaS applications, etc. This also includes making sure that any relevant Microsoft Entra ID (formerly Azure AD) licensing is leveraged to help reduce permissions and access, including but not limited to:
- Conditional Access policies
- Privileged Identity Management (PIM)
- Access Reviews
- Entitlement Management
- Identity Protection
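As a starting point for that review, here is a minimal sketch that checks whether two of the controls listed above, Conditional Access policies and Access Reviews, are actually in use in the tenant by calling Microsoft Graph. It assumes a Graph access token with Policy.Read.All and AccessReview.Read.All permissions and, for brevity, ignores paging (@odata.nextLink).

```python
# Minimal sketch: inventory whether Conditional Access policies and Access Reviews
# are in place, using Microsoft Graph.
# Assumptions: ACCESS_TOKEN is a Graph token with Policy.Read.All and
# AccessReview.Read.All; pagination (@odata.nextLink) is ignored for brevity.
import requests

GRAPH_BASE = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<graph-token>"  # placeholder, not a real token
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}


def list_conditional_access_policies() -> list[dict]:
    """List Conditional Access policies defined in the tenant."""
    resp = requests.get(f"{GRAPH_BASE}/identity/conditionalAccess/policies", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json().get("value", [])


def list_access_review_definitions() -> list[dict]:
    """List access review schedule definitions in the tenant."""
    resp = requests.get(f"{GRAPH_BASE}/identityGovernance/accessReviews/definitions", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json().get("value", [])


if __name__ == "__main__":
    policies = list_conditional_access_policies()
    reviews = list_access_review_definitions()
    print(f"Conditional Access policies: {len(policies)}")
    for policy in policies:
        print(f"  - {policy.get('displayName')} (state: {policy.get('state')})")
    print(f"Access review definitions: {len(reviews)}")
    for review in reviews:
        print(f"  - {review.get('displayName')} (status: {review.get('status')})")
```

Empty results from either call are a strong hint that those controls have not yet been put to work and should be addressed before M365 Copilot is broadly enabled.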
The bottom line is that work must be done before M365 Copilot is enabled in your environment to make sure proper due diligence is performed. This maximizes the likelihood that enabling the solution will not compromise any compliance or security controls. The good news is that Spyglass is ready to assist in any way possible, including scheduling an Envisioning Session that helps plan for M365 Copilot enablement while also exploring its huge potential benefits. Contact us today to discuss further!