So, what’s Copilot?
As observers of Microsoft will know, the company partnered some time ago with OpenAI, maker of the AI tour de force ChatGPT. Copilot represents the integration of ChatGPT-like technology into Microsoft’s productivity suite: the applications we use every day, like Word, Outlook, Teams and, yes, that bête noire for many, PowerPoint.
Copilot combines large language models (LLMs) with your personal and business data in Microsoft Graph and the Microsoft 365 apps, providing a natural language interface capable of genuinely useful work. For example, ask for a PowerPoint presentation based on the sales data in a Word document, and it builds it for you. Having recently spent six hours wrestling with PowerPoint, I can see the value here immediately!
That’s the power of Copilot. As part of Microsoft 365, it works across your calendar, emails, chats, documents, meetings, and contacts. Ask it things like ‘Share the most recent go-to-market campaign’, and it generates a status update based on recent meetings, emails and chat threads. This helps reduce the 57% of our time currently spent communicating, and make better use of the 43% spent creating (more detail in the Microsoft Work Trend Index).
Along with these benefits come new governance and structural challenges – some of which we can’t yet anticipate.
AI – the ultimate nosy parker
This leads us to important questions around governance and information access. Copilot will do exactly what is asked of it, with no moral or ethical compass. It’s the ultimate nosy parker, and it immediately ends the comfort of ‘security through obscurity’.
Let’s take the CEO’s payslip as a trite, but potentially very real, example. Ask Copilot how much the CEO is paid and, using your permissions, it won’t rest until it has gone through every source available to you, then present all the evidence it finds. And among the digital paperwork you have access to, it may well discover this information.
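The ‘using your permissions’ behaviour can be sketched in a few lines of Python. This is a toy model, not Copilot’s actual retrieval pipeline: the users, groups, documents and ACLs are all invented for illustration, with group membership standing in for Microsoft 365 permissions.

```python
# Toy model of a permission-scoped assistant search. All names and data
# are hypothetical; real Copilot queries Microsoft Graph with the
# requesting user's actual permissions.

documents = {
    "q3-sales.docx":   {"acl": {"alice", "bob"}, "text": "Q3 sales up 12%"},
    "ceo-payslip.pdf": {"acl": {"payroll"},      "text": "CEO salary: $850,000"},
}

# Group membership stands in for Microsoft 365 groups/roles.
group_members = {"payroll": {"carol"}}

def readable_by(user: str, acl: set[str]) -> bool:
    """True if the user, or any group they belong to, is on the ACL."""
    groups = {g for g, members in group_members.items() if user in members}
    return user in acl or bool(acl & groups)

def assistant_search(user: str, query: str) -> list[str]:
    """Return the text of every document the user can read that matches."""
    return [
        doc["text"]
        for name, doc in documents.items()
        if readable_by(user, doc["acl"]) and query.lower() in doc["text"].lower()
    ]

# Bob was mistakenly left in the payroll group after an onboarding mix-up...
group_members["payroll"].add("bob")
assistant_search("bob", "salary")   # ...and the payslip surfaces for Bob
```

The point of the sketch: the assistant never breaks a rule – it faithfully honours every ACL – yet one stray group membership is enough to surface the payslip.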
This is a serious risk with multiple implications. These include onboarding and offboarding issues, where a person retains permissions from one position that they should not carry into another. The common practice of ‘templated’ provisioning of employees in similar roles also needs scrutiny.
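The permission-drift risk lends itself to a simple automated check. The sketch below compares a user’s actual permissions against the template for their current role and flags leftovers from past roles; the role names, permission names and user are all hypothetical.

```python
# Hypothetical audit: flag permissions a user holds that their current
# role's template does not grant. Role and permission names are invented.

ROLE_TEMPLATES = {
    "sales-rep":  {"crm-read", "sales-share"},
    "hr-advisor": {"crm-read", "hr-files", "payroll-read"},
}

def audit(user_permissions: set[str], current_role: str) -> set[str]:
    """Permissions the user holds beyond what their current role grants."""
    return user_permissions - ROLE_TEMPLATES[current_role]

# Dana moved from HR to sales, but offboarding from HR never happened.
dana_permissions = {"crm-read", "sales-share", "payroll-read"}
audit(dana_permissions, "sales-rep")   # flags the leftover payroll access
```

Run regularly, a check like this turns a silent over-permission into a visible audit finding – exactly the rigour Copilot now demands.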
These are just ‘top of mind’ issues with Copilot. Be assured, though, that they are the entrance to a potentially deep rabbit hole. What is certain is that in the Copilot age, a more rigorous, nuanced approach to security and governance is necessary.
Data governance principles still apply
The good news, of course, is that data governance is far from new, conceptually. Existing principles must and do apply, including the ‘pillars’, which are:
- Data quality
- Data security and privacy
- Data architecture and integration
- Metadata management
- Data lifecycle management
- Regulatory compliance
- Data stewardship
- Data literacy
What changes is rigour
With a relentless (AI) nosy parker, audits are likely to be required more regularly. The classic Kiwi ‘she’ll be right’ approach won’t work. Vigilance with data is vital as these issues emerge (and they will emerge in practice, as more Copilot deployments expose them).
Fortunately, the tools are there for better data management and improved governance – including things like Data Loss Prevention and Data Protection. In fact, like Copilot, these tools are built into Microsoft 365. However, to be fully effective, they depend on an effective and appropriate policy first, followed by configuration and deployment.
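To make the ‘policy first, then configuration’ idea concrete, here is a toy DLP-style content check in Python. It is purely illustrative: real Microsoft 365 DLP policies are defined and enforced in the platform (via Microsoft Purview), not hand-coded, and the patterns below are invented examples of what a policy might target.

```python
import re

# Toy DLP-style scan: flag content matching sensitive patterns before it
# is shared or indexed. Patterns are illustrative, not real policy rules.

SENSITIVE_PATTERNS = {
    "id-number":     re.compile(r"\b\d{3}-\d{3}-\d{3}\b"),        # illustrative format
    "credit-card":   re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
    "salary-figure": re.compile(r"\bsalary:?\s*\$[\d,]+", re.IGNORECASE),
}

def dlp_scan(text: str) -> list[str]:
    """Return the names of every sensitive pattern found in the text."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(text)]

dlp_scan("Confidential: CEO salary: $850,000")   # flags 'salary-figure'
```

The design point carries over to the real tools: the scan is only as good as the patterns (the policy) behind it – which is why policy must come before configuration and deployment.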
The bottom line?
It does look like truly helpful AI is on the way. Making the most of it, though, requires careful consideration of how it is used, and of the governance and data security implications it exposes.