The Cyber Security Risks of Copilot

Copilot is the new AI assistant that promises to unlock greater efficiency and productivity in the workplace. However, businesses must be aware of the risks before implementing this tool. Read on to discover the cyber security risks of Copilot.
13.02.24 Charles Griffiths

The average organisation has more than $28 million (£22.2 million) of data exposed to breach risk in Software-as-a-Service (SaaS) applications.

That’s data stored in applications and services like Asana, HubSpot, Slack, Salesforce – and, yes, Microsoft. In the average organisation using Microsoft 365, 1 in 10 sensitive files are open to all employees.

It’s here that Microsoft’s new product, Copilot, presents a potential risk to businesses. Because it can potentially access every file an organisation has created and stored in Microsoft 365, it’s crucial that business leaders rolling out Copilot do so without compromising data security.

How Copilot Works

Copilot is an AI assistant that takes tedious daily workloads off a worker’s shoulders so they can focus on creative problem-solving.

What makes this AI more powerful and useful than a standalone Large Language Model (LLM) like ChatGPT is that Copilot can access everything a user has worked on in 365. Documents, presentations, meeting notes – if a 365 user has access to it, Copilot can use it. Copilot then passes that context to an LLM to generate more relevant content.

For instance, if you wanted to create a market analysis report, the workflow would look something like this:

  • Open a new Word document and input a prompt.
  • Copilot gathers the relevant data from your Microsoft Graph (all 365 content you have permissions for, like meetings and files), the web and other services.
  • The information is sent to an LLM, which generates an output.
  • The output is sent back to Copilot for post-processing (including responsible AI checks and security, compliance and privacy reviews).
  • Copilot returns the output to you for review and assessment.

(For a full breakdown of how Copilot works, see this infographic. A simplified sketch of the retrieval step follows below.)
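
To make the grounding step concrete, here is a minimal Python sketch of the kind of Microsoft Graph query such retrieval relies on. Copilot’s internal pipeline isn’t public, so this is purely illustrative – the access token is a placeholder – but /search/query is a real Graph v1.0 route, and it demonstrates the key point: Graph only returns items the signed-in user already has permission to see.

```python
# Illustrative sketch of the Graph retrieval that underpins Copilot's
# grounding step. Not Copilot's actual code; the token is a placeholder.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<delegated-access-token>"  # assumption: acquired via MSAL for a user


def search_user_content(query: str) -> list[dict]:
    """Search OneDrive/SharePoint files the signed-in user can access."""
    body = {
        "requests": [{
            "entityTypes": ["driveItem"],      # files only, for simplicity
            "query": {"queryString": query},
        }]
    }
    resp = requests.post(
        f"{GRAPH}/search/query",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=body,
        timeout=30,
    )
    resp.raise_for_status()
    containers = resp.json()["value"][0]["hitsContainers"]
    return [hit["resource"] for c in containers for hit in c.get("hits", [])]


# Permission trimming happens server-side: two users running the same query
# can get different results. That is exactly why over-broad permissions
# become an AI problem once Copilot is switched on.
for item in search_user_content("market analysis"):
    print(item.get("name"))
```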


The Risks

Integrating any new technology into a business carries data security risks – remember the concerns around sensitive data being used in ChatGPT?

Copilot is no different. It can use everything a 365 user has permission to view, so businesses need to take extra care that users can only access the content they need for their jobs. What’s worrying is that more than 50% of identities are ‘super admins’, meaning they have access to all permissions and resources in an organisation.

Lax security controls could have costly consequences. Workers may accidentally generate content containing sensitive information that they shouldn’t have access to, which could then be shared with a wider group.

Alternatively, Copilot could inadvertently write content for one client containing sensitive information from another client. If that’s shared, the business faces a potential data or regulation breach. GDPR is an ever-present burden in modern business, with heavy fines for non-compliance.

What You Need to Manage

Permissions

Controlling what data users have access to is crucial for maintaining security when rolling out Copilot. The risks of sensitive data being accidentally shared are too significant to ignore.

Before implementing Copilot, businesses should review permissions for everyone in their organisation to ensure that staff can only access data appropriate for their jobs.
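
As a concrete starting point, a short audit script can surface obviously over-shared files before Copilot is switched on. This is a minimal sketch, assuming a Microsoft Graph access token with Files.Read.All: the endpoints are standard Graph v1.0 routes, but a real review would also need pagination, group memberships and admin role assignments.

```python
# Minimal pre-rollout audit sketch: flag files in a drive's root that are
# shared org-wide or anonymously. The token is a placeholder assumption.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <access-token>"}  # needs Files.Read.All


def flag_overshared(drive_id: str) -> None:
    """Print any file whose sharing link exposes it beyond named users."""
    items = requests.get(
        f"{GRAPH}/drives/{drive_id}/root/children",
        headers=HEADERS, timeout=30,
    ).json().get("value", [])
    for item in items:
        perms = requests.get(
            f"{GRAPH}/drives/{drive_id}/items/{item['id']}/permissions",
            headers=HEADERS, timeout=30,
        ).json().get("value", [])
        for perm in perms:
            # Sharing links carry a 'scope': 'anonymous', 'organization'
            # or 'users'. The first two are the risky ones for Copilot.
            scope = perm.get("link", {}).get("scope")
            if scope in ("anonymous", "organization"):
                print(f"REVIEW: '{item['name']}' shared with scope '{scope}'")
```

Anything a sweep like this flags is content Copilot could draw into an answer for any user in the business, so it makes a sensible first queue for remediation.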

Editing

AI-generated content is better than ever. However, as mentioned above, Copilot may pull the wrong information into its content – whether that’s the ‘hallucinations’ still prevalent in LLMs or sensitive data that shouldn’t be shared.

Copilot allows everyone to generate content at the press of a button. It’s the responsibility of businesses to ensure that editing standards are maintained and that checks and balances are in place to review and approve changes.

Labels

Sensitivity labels are a Microsoft Purview feature used to classify and protect content based on its sensitivity level. They allow businesses to enforce protection actions like encryption, access restrictions and visual markings based on how the data is classified.

However, content generated by Copilot doesn’t inherit the sensitivity labels of its source files. With AI making it easier to generate more content, there’s a greater risk of new files being mislabelled – and, therefore, accessible to the wrong people.
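
One practical mitigation is to sweep newly created or Copilot-edited files for missing labels. The sketch below uses the extractSensitivityLabels action on a Microsoft Graph driveItem (a metered Graph API that not every tenant will have enabled); the placeholder token, IDs and the ‘unlabelled means review’ rule are illustrative assumptions, not a complete labelling policy.

```python
# Sketch: spot files carrying no sensitivity label, so Copilot-derived
# copies can be reviewed and classified before they circulate widely.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <access-token>"}  # placeholder token


def extract_labels(drive_id: str, item_id: str) -> list[dict]:
    """Return the sensitivity labels currently applied to a file."""
    resp = requests.post(
        f"{GRAPH}/drives/{drive_id}/items/{item_id}/extractSensitivityLabels",
        headers=HEADERS, timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("labels", [])


def needs_review(drive_id: str, item_id: str) -> bool:
    # Illustrative policy: any unlabelled file goes into a manual
    # classification queue before it can be shared more widely.
    return len(extract_labels(drive_id, item_id)) == 0
```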

Get Copilot-Ready

Copilot’s powerful feature set makes it a great addition to any workforce. But setting up Copilot securely can feel daunting.

AAG’s comprehensive support helps you get the most out of Copilot. An initial consultation and readiness assessment ensures that the new services can be accessed securely, while customised training helps your team understand Copilot features and its applications in their workflows. We’ll even run regular updates based on your usage to keep your Copilot services optimised.

Contact us today to get your Copilot readiness assessment.
