We’ve come a long way, eh?

From ChatGPT being used for poems to Microsoft holding a 27% stake in OpenAI, and now a variety of Copilots across a range of Microsoft applications, the generative AI tool we once mocked and wrote off is here to stay.

Okay, so there’s still plenty of mocking when it comes to output. But many have become accustomed to writing lengthy prompts and implementing guardrails. What’s more, we’re not just using Copilot (or any GPT-type platform) as a simple search-based app anymore. Instead, we’re integrating with line-of-business systems, creating autonomous workflows, and getting creative to drive genuine value from this powerful tool.

That said, challenges and pitfalls remain. These, along with success stories, era-defining stats, and opinions from people who matter, are what you’ll find inside this State of Microsoft Copilot report.


Untangling the Names

Microsoft hasn’t made it easy to talk about Copilot without everyone getting confused. So before we get into the good, the bad, and the “why is this button here?”, we need to pin down what we actually mean by Microsoft 365 Copilot.

Some people will point to the chat-style Copilot app on the left rail. Others mean the little sparkle icon that pops up in Word, Excel, Outlook, PowerPoint, or Teams. A few still think of “GitHub Copilot” and wonder why Visual Studio isn’t launching.

At a high level, “Microsoft 365 Copilot” is the paid AI assistant that runs inside the Microsoft 365 stack. It has rights to work with your content in OneDrive, SharePoint, Exchange, Teams, and the rest of the Graph, subject to your permissions and licenses. When most IT teams say “we’re rolling out Copilot”, this is what they mean rather than the free, consumer-flavored Copilot you get in a browser. In practice, you’ll meet it in three main guises:

  • The Copilot app (a standalone chat experience in the browser, in Windows, on mobile, and pinned into Teams).
  • In-app Copilot (the “sparkles” you see in Word, Excel, PowerPoint, Outlook, Teams, and others).
  • Other branded Copilots (Sales, Finance, Security, Viva, etc.) that sit on top of the same idea but with extra domain-specific features.

A note on naming conventions before we begin: There’s a whole conversation to be had regarding product names. For some, it’s still hard to figure out the difference between Copilot, Microsoft 365 Copilot, and a plethora of agent-type Copilots. If you’re at this stage of your Copilot journey, I suggest reading How To Choose The Right Microsoft Copilot: Chat vs M365 Copilot vs Copilot Agents. For the sake of simplicity in this report, I will refer to the many different Copilots simply as Copilot unless the specific type isn’t clear from context.


Recent Feature Releases Worth Calling Out

In the State of Microsoft Teams report, I included a lengthy section covering what was new in the past 18 months. I will refrain from doing so in this report, mostly due to the relative newness of Copilot itself, but I feel it’s important to highlight some game-changers and features lots of users and stakeholders have been waiting for.

 Note: To keep track of the constant changes and feature releases in Copilot (and the rest of Microsoft 365), I recommend signing up to ChangePilot.


Latest feature updates in Microsoft 365 Copilot

Microsoft has started to close the gap between the Copilot you see in marketing materials and the one people actually use every day in production tenants. Recent releases and early 2026 rollouts highlighted in Microsoft’s own blogs and message center posts cluster around a few key themes:

  • Deeper inbox and calendar intelligence in Outlook: Copilot is gaining richer reasoning over your inbox and calendar, allowing it to better summarize threads, understand priorities, and automate basic meeting logistics instead of just paraphrasing messages.
  • Agent‑style behavior in the core Office apps: Microsoft is extending agent‑like capabilities into Word, Excel, and PowerPoint, so Copilot can perform multi‑step tasks using a mix of your files and web data, rather than acting like a one‑shot prompt/response tool.
  • Richer Copilot Chat experiences: Updates to Copilot Chat include improved grounding over enterprise data, more control over content sources, and better persistence of chat history so follow‑up questions feel more like an ongoing conversation than a series of disconnected prompts.

All this is arriving alongside broader AI and security enhancements in Microsoft 365, meaning Copilot is increasingly tied to improvements in data classification, compliance, and admin‑side management rather than existing as a “nice to have” add‑on.

Copilot roadmap

The pace of innovation within the Copilot sphere is fast. Unlike communications technology of old, we’re not talking about a set-and-forget system. Rather, it’s an evolving beast that continues to gain intelligence and is impossible to tame.

Looking ahead, Microsoft describes Copilot as moving from assistant to AI productivity platform inside Microsoft 365, with 2026 focused on three big directions:

  • Broader availability across the suite: Copilot experiences continue to expand across Word, Excel, Outlook, Teams, SharePoint, and mobile, with UI entry points and voice input becoming enabled by default for more tenants rather than gated behind manual configuration.
  • Mature agents and automation: Agent capabilities are being promoted from preview features to mainstream, allowing organizations to build repeatable, governed workflows on top of Copilot rather than relying purely on ad‑hoc prompting by individual users.
  • Stronger management and governance controls: New admin and security capabilities are landing alongside the AI features, giving IT more levers to control where Copilot can access data, how it behaves with sensitive content, and how usage is monitored over time.

In other words, Copilot is no longer what cynics have deemed “a chat box with sparkles”. It’s becoming part of the fabric of Microsoft 365, with a roadmap that ties feature releases directly to licensing, governance, and long‑term platform strategy.

The Good: Where Copilot Actually Delivers Value and Productivity

Onto the analysis, which I’ve tried to keep unbiased and factual. (There is a section of one-liners below which may or may not be a little more fun.)

Everyday wins: the boring but brilliant stuff

Copilot is at its best when it’s doing work you never wanted to do in the first place. The unglamorous, admin-shaped tasks that pile up and make you feel behind before 10am.

  • Email triage: Outlook is where many people feel Copilot’s impact first. Long threads become short summaries. You can ask “What do I actually need to do here?” and get a list of actions instead of scrolling through 27 replies. That’s the difference between dreading your inbox and being able to clear the decks in a few minutes.
  • Meeting recaps: In Teams, Copilot turns “I missed that call” into “I can catch up in five minutes.” It pulls out decisions, actions, owners, and open questions from recordings and transcripts. You don’t need someone frantically typing notes; people can be present in the meeting knowing they’ll get a decent recap.
  • “First draft” work: In Word, PowerPoint, and the Copilot app, it’s the blank-page problem that disappears. Copilot will happily create a messy first pass of a proposal, report, email, or slide deck from a short prompt or existing document. It won’t be perfect, but editing is almost always easier than starting from zero.
  • Summarizing chaos: Whether it’s a project channel in Teams, a folder full of docs, or a painful email chain, Copilot is good at turning “too much information” into something you can actually act on. Summaries, bullet-point updates, and quick explanations of “what changed since last week” are where it earns its keep.

None of this makes headlines, but it’s where people quietly start to miss Copilot when it’s not there. And, let’s be honest, this is where the most genuine Copilot adoption and usage will always be.

Where Copilot is Quietly Excellent

There’s a layer of Copilot that doesn’t get talked about as much in demos, but matters a lot in real tenants: how well it understands your context.

  • Search that understands work, not just keywords: Because it can see your emails, meetings, files, and chats (within your permissions), Copilot can answer questions like “Show me the latest version of the proposal we sent to Contoso” or “What did we say about timelines in the last steering meeting?” You’re not hunting through SharePoint libraries; you’re asking in plain language.
  • Context stitching: Copilot is genuinely useful when you ask it to combine sources. “Draft a project update using the notes from last week’s meeting and the risk log in Excel” is the sort of task that used to mean an hour of copying, pasting, and rephrasing. Now it becomes a single prompt and a few rounds of refinement.
  • Working across apps: The fact that you can start in the Copilot app, pull in an email thread, reference a Word document, and then push the result straight into a Teams channel or Outlook draft makes it feel like one surface rather than six separate tools. That’s what people mean when they say Copilot “sits across” their work.

When it’s doing this kind of cross-app, context-aware work, Copilot starts to feel less like a chatbot and more like a genuine assistant.

What “good” looks like in a normal tenant

In a normal, slightly messy organization (not a perfect demo environment), the success stories tend to look like this:

  • A customer success manager starts Monday by asking Copilot in Outlook to summarize their key accounts’ emails from last week and pull out any risks or escalations. They get a quick view of who needs attention before their first coffee.
  • A project manager misses a project stand-up but still ships a solid update to stakeholders by asking Copilot in Teams to summarize the meeting, then using that summary to draft an email and a slide for the next steering committee.
  • A sales lead uses the Copilot app to prepare for a renewal call: “Summarize the last three meetings with this customer and highlight any open actions or pricing discussions.” No hunting through calendars, OneNote, and chat history.
  • An operations lead asks Copilot in Word to turn a scrappy process document and a few email threads into a more formal SOP, then iterates on clarity and tone instead of starting from scratch.
  • An exec assistant uses Copilot across Outlook and Teams to pull together a weekly briefing for the leadership team: key decisions, upcoming deadlines, and any red flags from key projects.

None of these rely on exotic customization or perfect data hygiene. They work because Copilot is embedded where people already spend their time, and because “good enough, quickly” is often far more valuable than “perfect, eventually.”

The Bad: Copilot May Be Using Confidential Data

Let’s not bury the lede here. In Feb 2026, TechCrunch ran the headline, Microsoft says Office bug exposed customers’ confidential emails to Copilot AI.

Zack Whittaker, Security Editor, wrote, “Microsoft has confirmed that a bug allowed its Copilot AI to summarize customers’ confidential emails for weeks without permission. The bug, first reported by Bleeping Computer, allowed Copilot Chat to read and outline the contents of emails since January, even if customers had data loss prevention policies to prevent ingesting their sensitive information into Microsoft’s large language model.”

While this is a red flag for IT and security staff, it strengthens the case for planned implementations and security reviews conducted by an accredited Microsoft partner.

Note: This report’s sponsor, Cloud Revolution, was a nominated finalist for Microsoft Partner of the Year from 2022 – 2025, winning the overall award in 2023. You deserve to get the absolute most from Copilot, so choose wisely when implementing.

Some industry analysts were in uproar about this, suggesting E5 licenses (a somewhat contentious subject in itself, as businesses still fail to extract full value from E5) should be refunded because the security elements had failed. On the other hand, some commentators posited that this was more of a non-issue, citing that “Copilot data still stays in the customer tenant and is not used to train underlying AI foundation models”.

Copilot adoption can easily stall

The main barrier to adoption and return on investment is expecting Copilot to fix broken processes. Roll it into a messy tenant and people either get poor results or give up after one bad experience.

  • Copilot Prerequisites: You need the right technical, licensing, and data foundations before scaling. If identity, permissions, and content locations are chaotic, Copilot will surface the wrong things or nothing useful at all.
  • Copilot Security: You must understand how Copilot respects and exposes existing permissions. It rarely grants new access; it simply reveals oversharing and old content you never tidied up.
  • Copilot Gotchas: You should expect behavioral and change‑management traps. People will write vague prompts, leaders will assume “on” means “adopted”, and clever agents may go unused without training, champions, and feedback loops.

Once you’ve taken care of these items, as highlighted in this post in the Microsoft community group on LinkedIn, you must roll out extremely methodically.

A big-bang rollout will not work with Copilot.


Hallucinations are still a thing

Even when adopted, Copilot isn’t free from hallucinations; the same goes for all AI engines (ChatGPT, Gemini, and Perplexity are all still guilty). While mature AI users have grown used to these and will start a prompt with “Don’t hallucinate” or other clever avoidance tactics, you can’t expect Sue in Accounts to start working this way.

Here’s a synthesized version of a blog post by Shelf.io that explains how to prevent Copilot from hallucinating:

  • Use high‑quality data: Give Copilot accurate, current, and trustworthy information so it has a reliable base to work from.
  • Understand Copilot’s data limits: Avoid prompts that require real‑time, niche, or highly specialized knowledge outside its scope.
  • Provide trusted data sources: Direct Copilot to reputable authorities (like official reports or standards bodies) to anchor its answers.
  • Stay within Copilot’s strengths: Focus on summarization, rewriting, and general guidance rather than expert‑level legal, medical, or regulatory detail.
  • Test prompt variations: Try alternative phrasings to see which produces the most accurate and consistent output.
  • Keep prompts simple and clear: Use direct, single‑focus instructions to reduce ambiguity and incorrect assumptions.
  • Use few‑shot prompting: Provide examples of the format or style you want so Copilot can follow the pattern.
  • Provide only relevant information: Include just the essential context so Copilot doesn’t get distracted by unnecessary details.
  • Avoid high‑risk prompt types: Steer clear of tasks involving complex calculations, niche analysis, or subjective judgments that increase error rates.
  • Assign roles: Tell Copilot what perspective to adopt (for example, “Act as a cybersecurity analyst”) to narrow its reasoning path.
  • State what you don’t want: Explicitly instruct Copilot to avoid assumptions, projections, or irrelevant content.
  • Ask it to double‑check: Request that Copilot review and refine its own output for clarity and accuracy.
  • Verify important topics: Fact‑check critical or sensitive information with reliable human‑validated sources.
  • Watch for repeated hallucinations: If Copilot keeps struggling with certain topics, adjust your prompts or expectations and provide clearer context.
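Several of these tips (assign a role, provide trusted context, use few-shot examples, state what you don’t want, ask for a self-check) come down to prompt structure. To be clear, Copilot offers no prompt-building API to end users; the function below is a purely hypothetical sketch of what a well-structured prompt contains:

```python
def build_grounded_prompt(role, task, context, examples, avoid):
    """Assemble a prompt applying the tips above: role assignment,
    grounding context, few-shot examples, explicit exclusions,
    and a closing self-check instruction. Illustrative only."""
    lines = [f"Act as {role}.", "", f"Task: {task}"]
    if context:  # provide only relevant information
        lines += ["", "Use ONLY the context below; do not draw on outside knowledge:", context]
    for example_in, example_out in examples:  # few-shot: show the desired pattern
        lines += ["", f"Example input: {example_in}", f"Example output: {example_out}"]
    for item in avoid:  # state what you don't want
        lines.append(f"Do not include {item}.")
    lines += ["", "Before answering, double-check your output against the context for accuracy."]
    return "\n".join(lines)

prompt = build_grounded_prompt(
    role="a financial controller",
    task="Summarize the Q3 variance report in five bullet points.",
    context="(paste the relevant report excerpt here)",
    examples=[("Q2 variance report", "- Revenue up 4% vs. budget ...")],
    avoid=["projections", "assumptions not supported by the report"],
)
print(prompt)
```

The same structure works typed straight into the Copilot chat box; the point is that each section closes off one class of hallucination.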

The issue here for adoption is that people don’t want to spend time learning how to write a hallucination-free prompt. As an IT department, you’re selling a time saver and productivity enabler, so why does writing a prompt take so long, only for the output to possibly end up wrong?

Do humans have too high expectations of artificial intelligence?

The Million Dollar Question (Quite Literally in Some Cases)

The license sticker shock vs perceived value

For many organizations, Copilot is the single biggest per‑user uplift they’ve added to Microsoft 365 in years. It often increases the cost of a license by a hefty percentage, especially at the lower end of the stack, which is why finance leaders reach for the red pen long before the first prompt is written. 

On paper, the business case looks compelling: independent studies and Microsoft‑commissioned analysis talk about triple‑digit Copilot ROI over three years, modest reductions in operating costs, and measurable gains in revenue and time to market. In practice, those numbers depend heavily on getting the foundations, targeting, and change management right.
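One way to ground those headline claims for your own tenant is a quick break-even calculation. All figures below are hypothetical placeholders, not Microsoft pricing or any study’s results; substitute your actual license uplift and loaded labor cost:

```python
# Hypothetical figures only: substitute your own numbers.
copilot_uplift_per_user_month = 30.0  # assumed license uplift, USD
loaded_hourly_cost = 50.0             # assumed fully loaded cost per employee hour, USD

# Hours a user must save each month for the license to pay for itself
break_even_hours = copilot_uplift_per_user_month / loaded_hourly_cost
print(f"Break-even: {break_even_hours:.1f} hours saved per user per month")

# Contrast a power user with a light user
for label, hours_saved in [("power user", 4.0), ("light user", 0.5)]:
    monthly_value = hours_saved * loaded_hourly_cost
    roi = (monthly_value - copilot_uplift_per_user_month) / copilot_uplift_per_user_month
    print(f"{label}: ${monthly_value:.0f}/month value, ROI {roi:+.0%}")
```

At these assumed numbers, break-even is well under an hour saved per user per month, yet a light user saving half an hour is still underwater: the lumpiness problem in a nutshell.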

This also opens up a wider conversation about the ROI of the E5 license. Copilot might be one element, but what about the often-unused security protocols, Purview, and compliance controls?

Take stock of what you’ve got, what you’re using, and what you’re missing out on by using this E5 ROI calculator below.

Where the real ROI actually comes from

The value rarely comes from “everyone saves an hour a day” spreadsheets. It comes from a smaller set of clearly defined workflows where time saved or quality improvement can be tracked. (Think proposal generation, project reporting, service desk responses.)

Organizations seeing real returns tend to do three things well:

  • Start with high‑value roles and teams rather than blanket‑licensing everyone.
  • Measure outcomes (faster cycle times, more proposals sent, fewer manual reports) instead of vanity metrics like number of prompts.
  • Continuously refine use cases and training based on what users actually do in Word, Excel, Outlook, and Teams, not just what was in the launch deck.

Why Copilot ROI feels so uneven

The uncomfortable truth is that Copilot ROI is lumpy. 

Power users and knowledge workers who live in documents, slides, spreadsheets, and meetings will feel the benefit quickly. Frontline staff in locked‑down environments may hardly notice it’s there. 

Example: A sales leader who suddenly gets three extra good proposals out the door each week will not question the license fee, whereas a back‑office worker who uses Copilot twice a month absolutely will.

The organizations that avoid buyer’s remorse are the ones that accept this unevenness, design their license strategy around it, and are willing to remove or reassign licenses where they’re clearly not delivering value.

What The Experts Say About Copilot: One-Liners

I asked the Microsoft and wider unified comms ecosystem for a one-liner about the current state of Microsoft Copilot. It made for some interesting reading…

  • Chad McGreanor, CEO at Cloud Revolution: The real business risk with Copilot is no longer implementation, but hesitation.
  • Patrick Watson, Research Director at Cavell: There’s a real GTM challenge for communication providers when Copilot’s promise is AI-powered productivity, but the customer conversation is still stuck on why Teams Phone costs what it does.
  • Senior Copilot Cloud Solution Architect at Microsoft: Microsoft Copilot is steadily improving—now better aligning with its marketing thanks to new OpenAI models, Claude integration, native Graph grounding, flexible LLM options, and Agent mode in Office apps—though lingering bugs and slow fixes remain valid concerns.
  • David Danto, Principal Analyst at TalkingPointz: Microsoft Copilot is a brilliantly engineered product still searching for consistent direction amid unnecessarily constant reinvention.
  • David Maldow, Founder and CEO at Let’s Do Video: Adopting Copilot is like navigating the Hogwarts staircases: you step onto what looks like the current product, but the floor moves, the name changes, and suddenly you’re standing on a licensing model that didn’t exist when you lifted your foot.
  • Dom Black, Growth Director and Principal Analyst at Cavell: Copilot licensing and other AI solutions are becoming a core value add service for MSPs, telcos and the reseller community. The question is, do they have the skills and personnel to sell AI services?
  • Kevin Kieller, Co-Founder and Lead Analyst at enable UC: Copilot Confusion is real. Seemingly without a unifying vision, Copilot was hastily added to Microsoft Office apps in different ways surfacing divergent capabilities, resulting in poor reliability, complex licensing, and dismal paid adoption.
  • Tim Banting, Founder and Principal Analyst of So What, Now What: Where there’s mystery there is margin!

Microsoft’s Copilot Strategy: Coherent Vision or Feature Sprawl?

On the surface, Microsoft’s Copilot story is simple: one AI assistant, everywhere you work. In reality, it can feel like a tangle of brand names, entry points, and overlapping features that change faster than most organizations can keep up.

How Copilot looks on paper versus how it (sometimes) feels in reality, perspective by perspective:

  • Name / concept. On paper: one Copilot layer that sits on top of Microsoft 365. In reality: multiple “Copilots”: Windows Copilot, Microsoft 365 Copilot, Sales, Security, Viva flavors.
  • Where it shows up. On paper: embedded wherever you already work: Word, Excel, Outlook, Teams, PowerPoint, the Copilot app. In reality: Copilot buttons scattered across apps, plus a separate Copilot app on the left rail, all behaving slightly differently.
  • Core promise. On paper: uses your organizational data and Microsoft Graph to automate real work, not just answer prompts. In reality: feels like lots of disconnected features and previews rather than one joined‑up assistant.
  • Extensibility story. On paper: agents, plugins, and connectors turn it into a platform for workflows and automation. In reality: extra jargon that blurs the picture: “Is this a plugin, a connector, an agent, or just another feature someone turned on?”
  • Governance and security. On paper: one AI layer, one security model, one place for admins to manage policies, access, and controls. In reality: constant new entry points to review, control, explain, and justify to security and compliance teams.
  • Dependency on foundations. On paper: works brilliantly if licenses, data, and governance are in good shape. In reality: highlights every messy permission, overshared site, and half‑finished project the moment it’s switched on.
  • Overall impression. On paper: a coherent vision, with Copilot as the AI fabric across Microsoft 365, centrally managed and governed. In reality: feature sprawl, a set of things being done to users and admins rather than a single, deliberate choice they’ve made and understood.

Where both stories can be true

Under the hood, the strategy is increasingly coherent: one AI fabric across Microsoft 365, wired into the Graph, with governance and compliance on top. 

At the edges, it still feels like feature sprawl: too many entry points, too little clarity on “which Copilot are we talking about?”, and a constant risk of over‑promising based on whatever was in last week’s keynote.

What this means for you

For customers, the job is to impose your own coherence on top of Microsoft’s:

  • Decide which Copilot experiences you actually care about. 
  • Map them to real jobs to be done. 
  • License intentionally. 
  • Turn off what you don’t need. 
  • Communicate clearly which Copilot your users are getting, where they’ll see it, and what it’s for.

The State of Microsoft 365 Copilot in 2026

So is Microsoft’s Copilot strategy coherent vision or feature sprawl? 

The honest answer: it’s a coherent vision wrapped in messy execution.

That’s survivable. 

The organizations that win with Copilot aren’t the ones waiting for Microsoft to tidy everything up. They’re the ones who accept the chaos, pick their battles, and design a practical Copilot story that makes sense for their own users.