Microsoft 365 Copilot – Office 365 for IT Pros (https://office365itpros.com)

Outlook’s New Summarize Option for Email Attachments
https://office365itpros.com/2025/06/23/summarize-attachment-outlook/ (23 June 2025)

Summarize Attachment Feature is an Example of New Features Needed to Maintain Customer Interest

Introducing a new technology is hard. The great expectations created at the initial launch soon meet the hard reality of deployment, and things don't get better until the technology has had time to bake. This is as true for Microsoft 365 Copilot as for any other major technology. I see people questioning whether the $30/user/month really delivers any benefits, with real concern over whether people use the time purportedly saved by Copilot interventions to do anything more valuable than drinking more coffee.

News that the U.S. Better Business Bureau forced Microsoft to change some of the claims it makes about how Microsoft 365 Copilot affects user productivity doesn't help the case for AI-based assistance. Lukewarm or mildly enthusiastic (but independent) reports about Copilot usage in organizations, like the recent UK Government report based on a three-month trial involving 20,000 employees, don't bolster the case much either.

All Microsoft can do is continue to push out updates and new AI-based features to keep customer interest while Copilot matures to become more useful in day-to-day activities. The result is a flood of new Copilot-related features, not all of which seem valuable except in specific cases. I don’t know whether AI-informed People Skills will become popular (some HR professionals that I know like People Skills a lot). Those in the Power Platform world (now with 56 million monthly active users according to data made available at Microsoft’s FY25 Q3 results) see lots of changes to make Copilot agents more productive. I do like the ability to upload documents to agents for the agents to reason over.

Summarizing Attachments

All of which brings me to the update described in message center notification MC1073094 (13 May 2025, Microsoft 365 Roadmap item 475249). It’s an example of a recent Copilot enhancement to help users process “classic” email attachments faster. Even though cloudy attachments are preferable in many respects, many people still send files instead of links.

Copilot has been able to summarize cloudy attachments for email for quite a while. Now, when a message with one or more classic file attachments arrives, users with a Microsoft 365 Copilot license see a new summarize option for Office and PDF attachments. The feature is available in the new Outlook for Windows, OWA, Outlook mobile, and Outlook for Mac, but not in Outlook classic. Microsoft is rolling out the update now, with estimated completion by late August 2025.

Figure 1 shows the general idea. A Word file is attached to a message. Clicking the summarize option from the drop-down menu beside the attachment causes Copilot to create and display the summary for the file inside the Summary by Copilot panel (or card). If a message has multiple file attachments, the summarize option must be invoked separately for each attachment.

Figure 1: The summarize option for a file attachment for a message opened in OWA

Copilot cannot process encrypted attachments (files protected by sensitivity labels or another encryption mechanism).

No Archived Messages

My archive mailbox is full of attachments from long-forgotten projects, including files related to some legal cases that I was involved with. I was curious to see what sense Copilot might extract from some of the PDFs and Word documents from those cases. Oddly, Outlook won't summarize attachments for messages stored in an archive mailbox. To generate a summary for these files, you must download and open the Office files in a desktop or web app and use the Copilot options available in the app.

Thinking about why this might be so, I guess the logic is that attachments for archived messages probably aren’t of very high interest, and if someone goes to the trouble of finding an archived message, they have a purpose for doing so and won’t mind opening attachments to view content. On the other hand, I could be overthinking things and Microsoft simply designed the feature to work only with messages from the primary mailbox.

The Value of Small Changes

Over my many years of work, I cannot say how many emails I have received with file attachments. Being able to see a quick summary of an attachment is a good example of how AI can be effective. The feature works well because the AI has just one file to process, so it’s unlikely that hallucinations or other issues will occur. You might disagree with points made in the summary, but having the summary is a timesaver and a great starting point for understanding whether a file contains anything important.

Another example of a small but interesting change is the ability to create a meeting from an Outlook email thread (MC1090693, 9 June 2025, Microsoft 365 roadmap item 494154). The idea is that Copilot scans an email thread to determine the topic for a meeting and its participants and creates a meeting invitation ready to go. This kind of thing doesn't strictly need AI because existing Graph APIs can do the work, but Copilot integrates the work into a new Schedule with Copilot option (only for email threads with sufficient data to base a meeting upon). According to the roadmap item, this feature is for the mobile clients, but I bet it will be available in the new Outlook and OWA too.
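As an aside, here's a rough sketch (not how Copilot implements the feature) of how existing Graph APIs could assemble a meeting from a thread using the Microsoft Graph PowerShell SDK. The conversation identifier, mailbox, and meeting times are placeholder values:

Connect-MgGraph -Scopes "Mail.Read","Calendars.ReadWrite"

# Fetch the messages in the thread (the conversation id is a placeholder)
$ConversationId = "<conversation id>"
$Mailbox = "Lotte.Vetler@office365itpros.com"
$Messages = Get-MgUserMessage -UserId $Mailbox -Filter "conversationId eq '$ConversationId'" -All

# Build the attendee list from the unique senders and recipients in the thread
$Addresses = $Messages | ForEach-Object {
    $_.Sender.EmailAddress.Address
    $_.ToRecipients.EmailAddress.Address
} | Sort-Object -Unique
$Attendees = $Addresses | ForEach-Object {
    @{ EmailAddress = @{ Address = $_ }; Type = "required" }
}

# Create a meeting using the thread subject as the topic (times are placeholders)
New-MgUserEvent -UserId $Mailbox -Subject $Messages[0].Subject -Attendees $Attendees `
    -Start @{ DateTime = "2025-07-01T10:00:00"; TimeZone = "UTC" } `
    -End @{ DateTime = "2025-07-01T10:30:00"; TimeZone = "UTC" }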

In the overall scheme of Copilot, delivering Outlook features to make small tasks easier is not important. However, changes that reduce friction for users are important and collectively a bunch of changes like this might just be enough to convince an organization that they really can’t live without Copilot.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Using a Copilot Agent in SharePoint to Interact with Office 365 for IT Pros
https://office365itpros.com/2025/06/16/copilot-studio-agent-knowledge/ (16 June 2025)

Use Office 365 for IT Pros PDF Files as Knowledge Sources for Copilot

The announcement in message center notification MC1078671 (20 May 2025) that Copilot Studio can deploy agents to SharePoint Online sites (in Copilot Studio terms, SharePoint Online is a channel) gave me an idea. SharePoint has supported agents since October 2024, but those agents are limited to reasoning over the information contained in a site. Copilot Studio can create more flexible and powerful agents that can consume different forms of knowledge, including external web sites and files. Uploaded files are stored in the Dataverse, or the mysterious SharePoint Embedded containers that appeared in tenants recently.

My idea is to use the Office 365 for IT Pros eBook as a source for a Copilot agent. Our subscribers can download updated book files every month in PDF and EPUB format. Copilot can consume text files, including PDFs, as knowledge sources (message center notification MC1058260, last updated 9 June 2025, Microsoft 365 roadmap item 489214). If you have Microsoft 365 Copilot licenses, it seems logical to create an agent that uses the PDFs for the Office 365 for IT Pros and Automating Microsoft 365 with PowerShell eBooks as knowledge sources.

You could even expand the set of knowledge sources to https://office365itpros.com and https://practical365.com to include articles written by our author team. Once the agent is configured, it can be published to a SharePoint Online site for users to interrogate. Sounds good? Let’s explore what you need to do to make the idea come alive.

Adding Files to a Copilot Agent

During an investigation of the various ways to create Copilot agents, I created an agent in Copilot Studio called the Microsoft 365 Knowledge Agent. The agent already reasoned over office365itpros.com and practical365.com. I uploaded the PDF files for the two books to the agent so that the agent now reasons over the two websites and two PDF files (Figure 1). You might notice that I have disabled the options for the AI to use its LLMs and to search public websites when composing answers. That’s because I want the agent to limit its responses to the set of defined knowledge sources.

Figure 1: Adding files as knowledge sources for the agent

The upload dialog says that files cannot be “labeled Confidential or Highly Confidential or contain passwords.” This might reflect old information because Microsoft has support for files protected by sensitivity labels in preview. The implementation seems very like the support for sensitivity labels in BizChat in that a user cannot access a file protected by a label if the label doesn't grant them access to the content. I also assume that Copilot Studio will eventually support the DLP policy for Microsoft 365 Copilot to stop confidential files with specific labels from being used as knowledge sources.

It can take some time for Copilot Studio to process uploaded files to prepare their content for reasoning, depending on their size. Office 365 for IT Pros is a 1,280-page 27 MB eBook, so it took several hours before Copilot Studio declared the file to be ready. You can upload a maximum of 500 files as knowledge sources for an agent.

Updating the Copilot Agent Instructions

Next, I adjusted the instructions for the agent. Here’s what I used:

  • Respond to requests using information from specific curated websites and the files uploaded as knowledge sources.
  • Ensure the information is accurate and relevant to the topic.
  • Provide well-structured and engaging content.
  • Avoid using information from unverified sources.
  • Maintain a professional and informative tone.
  • Be responsive and prompt in handling requests.
  • Focus on topics related to Microsoft 365 and Entra ID technology.
  • Write in a professional, clear, and concise manner.
  • Output PowerShell code formatted for easy copying and use by readers.
  • Ensure the PowerShell code is accurate and functional.
  • Do not guess when answering, and do not invent new PowerShell cmdlets that don't exist. Always check that a cmdlet exists before using it in an answer.

Coming up with good instructions for an agent is an art form. I’m sure that these can be improved, but they work.

Publish the Copilot Agent to SharePoint Online

The next task is to publish the agent. To publish the agent to a SharePoint Online site, I selected SharePoint as the target channel (Figure 2) and then selected the site that I wanted to host the agent. I suspect that Copilot Studio caches site information because it wasn’t possible for search to find a new site for several hours after the site’s creation. Publishing to a site creates an .agent file in the default document library in the site.

Figure 2: Selecting SharePoint as the publication channel for the Copilot agent

An agent can only be deployed to a single site. If you make a mistake and deploy the agent to the wrong site, you’ll need to undeploy and remove the site from the agent configuration and then deploy the agent to the correct site.

Out of the box, the only person who can use the agent at this point is the publisher. To make the agent available to all site members, a site administrator needs to mark the agent as approved. The agent then shows up in the list of agents accessed through the Copilot button in the menu bar. Any user with a Microsoft 365 Copilot license can use the agent as part of their license. Access for other users must be paid for on a pay-as-you-go basis.

Using the Copilot Agent in SharePoint

Interacting with the agent to ask questions from the knowledge contained in Office 365 for IT Pros is just like any other Copilot interaction. Compose a prompt and submit it to the agent, which contemplates the request and responds based on the knowledge available to it (Figure 3).

Figure 3: Using the agent in a SharePoint site

SharePoint Online is not the only publication channel available to an agent. I also connected the agent to Microsoft 365 and Teams. Figure 4 shows how to chat with the agent in Teams.

Figure 4: Copilot agent interacting in Teams chat

The Only Downside is Our Monthly Updates

We know that Office 365 for IT Pros is a big eBook. Sometimes it’s hard to find the precise information that you’re looking for using the standard search facilities. Being able to have an agent reason over the books (and optionally, associated web sites) is an excellent way to have AI do the heavy lifting of finding and extracting knowledge in a very accessible way. The only downside is that you need to update the agent with the new files and republish to the target channels after we release our monthly updates. But that’s not a difficult task – and I am sure that a way will be found to automate the step in the future.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft Launches the Copilot Interaction Export API
https://office365itpros.com/2025/05/30/aiinteractionhistory-api/ (30 May 2025)

aiInteractionHistory Graph API Available in June 2025

Microsoft 365 message center notification MC1080682 (22 May 2025, Microsoft 365 Roadmap item 491631) announces that the new Microsoft 365 Copilot Interaction Export API (aka, the aiInteractionHistory API) will roll out in June 2025. This is the same API that I covered in a Practical365.com article last December and the documentation still says that the API is available through the Graph beta endpoint. Perhaps the intention is to move the API to the V1.0 (production) endpoint when it’s officially released.

I don't see much change in how the API works or the retrieved data since I last looked at it. A welcome change is that it is now possible to fetch a maximum of 100 records per request rather than ten. Fetching ten interaction records at a time made the API very slow. Although faster than before, the API is still slow, especially for an API designed to allow third-party apps and ISVs “to export Copilot user interaction data for processing in their security and compliance (S+C) applications.”

Other audit APIs support fetching up to a thousand records at a time. Maybe a V1.0 version of the API will support a higher value. Details of how the API works and an example script can be found in the original article.
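For a flavor of the mechanics, here's a minimal sketch of paging through interaction records with the Microsoft Graph PowerShell SDK. The endpoint shape is taken from the beta documentation and might change when the API reaches V1.0; the user identifier is a placeholder:

# Page through interaction records for a user, 100 records at a time
$UserId = "<user account identifier>"
$Uri = ("https://graph.microsoft.com/beta/copilot/users/{0}/interactionHistory/getAllEnabledInteractions?`$top=100" -f $UserId)
[array]$Records = @()
Do {
    $Data = Invoke-MgGraphRequest -Method GET -Uri $Uri
    $Records += $Data.Value
    $Uri = $Data.'@odata.nextLink'   # null when no more pages remain
} While ($Uri)
Write-Host ("Fetched {0} interaction records" -f $Records.Count)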

Licenses and Permissions

The AiEnterpriseInteraction.Read.All Graph permission needed to access interaction data is not available as a delegated permission, meaning that the only way to access the data is through an app (including app-only interactive Microsoft Graph PowerShell SDK sessions). Weirdly, accounts used to run apps using the API to fetch interaction records must have a Microsoft 365 Copilot license.

What the aiInteractionHistory API Captures

According to Microsoft, the API “captures the user prompts and Copilot responses in Copilot private interactions chat and provides insights into the resources Copilot has accessed to generate the response.” This statement does not mean that the data lays bare the details of Copilot interactions. Some of the information needs to be mined and interpreted to make sense. For instance, here are the details of an interaction record:

Name                           Value
----                           -----
locale                         en-us
body                           {[content, [AutoGenerated]undefined<attachment id="fd3a9044-309c-4ec9-a568-676f1d521f24"></attachment><attachment id="01TAGX3U2ESA5P3HBQZFHKK2DHN…
from                           {[@odata.type, #microsoft.graph.chatMessageFromIdentitySet], [user, System.Collections.Hashtable], [application, ], [device, ]}
appClass                       IPM.SkypeTeams.Message.Copilot.Word
attachments                    {02 Managing Identities.docx, unknown-file-name}
contexts                       {02 Managing Identities.docx, unknown-file-name}
createdDateTime                25/04/2025 09:27:05
conversationType               appchat
interactionType                userPrompt
mentions                       {}
links                          {}
sessionId                      19:t67NyrXsxDyC8qGGCtSQZYjC3TV1lYvq3IkjzpXquUc1@thread.v2
id                             1745573225046
requestId                      GTbr3lBouCMpcP7L1qVv8Q.20.1.1.1.4
etag                           1745573225046

The appClass property tells us what Copilot app the interaction is for. In this case, it’s Copilot for Word. The attachments property tells us if any reference files are used. One is mentioned here, and given that the body property mentions AutoGenerated, we can conclude that this interaction occurred when Copilot for Word generated an automatic summary for a document.

The interactionType tells us that this record is for a user prompt. Responses from Copilot have aiResponse in the interactionType property. User prompts that aren’t for automatic summaries have the text of the prompt in the body property. For example:

Name                           Value
----                           -----
content                        What functionality isn't available with a Microsoft 365 retention policy
contentType                    text

aiInteractionHistory API requests require the identifier for a user account, and the response contains the records for that user. Details of the user are in the from property, but you'll have to navigate to from.user.id to see the identifier for the user. A DisplayName property is available in the from structure but doesn't hold the display name of the user.

Assuming that a third-party application wanted to retrieve the AI interaction history records and process them for its own purposes, it's obvious from this brief discussion that the application has some work to do to interpret the raw data and make it useful for compliance investigations or other purposes. The script published with the December article referenced above shows how to approach the task, which is like the parsing of audit records to extract useful content. Figure 1 shows the kind of data that can be extracted from the aiInteractionHistory API records.

Figure 1: Data extracted using the aiInteractionHistory API
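As a hedged sketch of that interpretation work, this snippet reduces raw records (shaped like the example shown earlier, and assumed to be in $Records from the previous sketch) to the fields an investigator is likely to care about:

# Reduce raw interaction records to a simple report
$Report = foreach ($Record in $Records) {
    [PSCustomObject]@{
        Timestamp = $Record.createdDateTime
        App       = $Record.appClass
        Type      = $Record.interactionType
        UserId    = $Record.from.user.id
        Text      = $Record.body.content
    }
}
$Report | Sort-Object Timestamp | Format-Table -AutoSize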

The Many Sources of Information About Copilot Interactions

It’s hard to know how useful the aiInteractionHistory API will turn out to be. Other sources of information can be mined to discover how people use Copilot, including usage data, audit records, and the compliance records held in user mailboxes. I guess it all depends on what you’re looking for.


The Case of the Mysterious SharePoint Embedded Containers
https://office365itpros.com/2025/05/28/sharepoint-embedded-containers-km/ (28 May 2025)

Oddly Named SharePoint Embedded Containers Show Up for Copilot Studio

Microsoft 365 tenant administrators can be swamped with message center notifications, reports about service health issues, and automated email generated by Entra ID and other workloads. Other more important things usually get in the way and often no great harm is done. Right now, there are 830 notifications in the message center for my tenant, and probably only 20% of the notifications are what I consider important. For instance, knowing that a new channel update is available for the Office apps isn’t usually a critical event.

In any case, some gems do appear, and it’s important that tenant administrators keep an eye on what’s happening. Let’s discuss an example involving SharePoint Embedded and Copilot Studio to illustrate the point.

The Set of SharePoint Embedded Containers with GUID Names

At first glance, message center notification MC1058260 (last updated 12 May 2025, Microsoft 365 roadmap item 489214), titled “Microsoft 365 Copilot: Admin controls and user file uploads for agent knowledge sources,” didn't seem too worrying. Given Microsoft's current preoccupation with AI, it's unsurprising that a flood of notifications describing various Copilot enhancements appears weekly. As I don't use Copilot Studio much, it was easy to assume that the development wouldn't impact my tenant.

When investigating how Loop workspaces connect to Teams standard channels, I noticed that a bunch of strange containers for the Declarative Agent app had appeared in SharePoint Embedded (Figure 1). Some process had created these containers in three batches on 27 April (3:25am), 8 May (1:53am), and 15 May (2:21pm). All the containers appeared to be empty. The only clue was the application name, indicating that the containers are related to some form of agents.

Figure 1: Some of the mysterious SharePoint Embedded Containers created for Copilot agents

Agents process information from knowledge sources like SharePoint Online sites. MC1058260 explains that users will soon be able to upload up to 20 documents for agents to use as knowledge sources, and when this happens, the uploaded files are stored in “tenant-owned Microsoft SharePoint Embedded (SPE) containers.” MC1058260 goes on to note that “As part of this rollout, we will pre-provision a limited set of SPE containers in your tenant.” The mystery is solved because these containers are the pre-provisioned containers mentioned by MC1058260. I assume that Microsoft creates the containers to make it faster for users to upload documents (because they don’t have to wait for an agent to create a container).

Adding Files as Knowledge Sources for Agents

My tenant ended up with 80 pre-provisioned containers (so far – I have no idea if more provisioning cycles will happen in the future). As far as I can tell, the provisioning operation didn’t generate any audit records. At least, audit log searches for the creation times for the containers turn up nothing of interest.

My tenant doesn’t have 80 agents in use (the number is more like 8), so I assume that the pre-provisioned containers are a pool that agents can use. To test the theory, I edited an agent that I created with Copilot Studio a couple of months ago and added the source Word document for the Automating Microsoft 365 with PowerShell eBook as a knowledge source (Figure 2).

Figure 2: Adding a file as a knowledge source for a Copilot agent

What I expected to happen is an allocation of one of the pre-provisioned containers to the agent and an update to the container name to change it from the GUID used by the pre-provisioning routine to the name of the agent. Updates don’t happen quickly in the SharePoint admin center and site and containers data is usually at least two days behind real time, so I was prepared to wait. However, no change showed up over the next few days.

The Mysterious SharePoint Embedded Containers Disappear

And then, Microsoft hid the pre-provisioned containers. I had chatted to some Microsoft contacts and complained about the mysterious containers, so I guess they acted. In any case, there’s now no trace of the containers and I can’t find out if the updated agent took over a container. And as I don’t know the application identifier for the Declarative Agent app, I can’t use the Get-SPOContainer cmdlet to retrieve any details like the storage consumption (or name) to check if anything had changed in the set of containers.
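For the record, if the owning application identifier were known, the check would be simple. This sketch assumes the cmdlet's OwningApplicationId parameter and uses a placeholder GUID rather than the real Declarative Agent app identifier:

# List SharePoint Embedded containers owned by an application
Connect-SPOService -Url "https://office365itpros-admin.sharepoint.com"
Get-SPOContainer -OwningApplicationId "00000000-0000-0000-0000-000000000000" | Format-Table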

It’s probably best that Microsoft hides these containers when they are newly created and empty. However, once a container is used by an agent, I think it should show up in the set of active containers displayed in the SharePoint admin center, if only because the storage consumed by the container is charged against the tenant SharePoint Online storage quota. It’s the kind of detail that Microsoft needs to deliver for tenant-wide agent management.

The mystery is solved, and I learned how to add a file as a knowledge source for an agent. Keep an eye on the notifications posted to the message center. You might even learn something too!


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Why Copilot Access to “Restricted” Passwords Isn’t as Big an Issue as Uploading Files to ChatGPT
https://office365itpros.com/2025/05/20/microsoft-365-copilot-pen-test2/ (20 May 2025)

Unless You Consider Excel Passwords to be Real Passwords

I see that some web sites have picked up the penetration test story about using Microsoft 365 Copilot to extract sensitive information from SharePoint. The May 14 Forbes.com story is an example. The headline of “New Warning — Microsoft Copilot AI Can Access Restricted Passwords” is highly misleading.


Unfortunately, tech journalists and others can rush to comment without thinking an issue through, and that's what I fear has happened in many of the remarks I see in places like LinkedIn discussions. People assume that a much greater problem exists; if they would only think things through, they'd see the holes in the case being presented.

Understanding the Assumptions made by the Penetration Test

As I pointed out in a May 12 article, the penetration test was interesting (and did demonstrate just how weak Excel passwords are). However, the story depends on three major assumptions:

  • Compromise: The attacker has control of an Entra ID account with a Microsoft 365 Copilot license. In other words, the target tenant is compromised. In terms of closing off holes for attackers to exploit, preventing access is the biggest problem in the scenario. All user accounts should be protected with strong multifactor authentication like the Microsoft authenticator app, passkeys, or FIDO-2 keys. SMS is not sufficient, and basic authentication (just passwords) is just madness.
  • Poor tenant management: Once inside a tenant and using a compromised account, Microsoft 365 Copilot will do what the attacker asks it to do, including finding sensitive information like a file containing passwords. However, Copilot cannot find information that is unavailable to the signed-in user. If the tenant’s SharePoint Online deployment is badly managed without well-planned and well-managed access controls, then Copilot will happily find anything that the user’s access allows it to uncover. This is not a problem for Copilot: it is a failure of tenant management that builds on the first failure to protect user accounts appropriately.
  • Failure to deploy available tools: Even in the best-managed SharePoint Online deployment, users can make mistakes when configuring access. Users can also follow poor practice, such as storing important files in OneDrive for Business rather than SharePoint Online. But tenants with Microsoft 365 Copilot licenses can mitigate user error with tools available to them, such as Restricted Content Discovery (RCD) and the DLP policy for Microsoft 365 Copilot. The latter requires the tenant to deploy sensitivity labels too, but that's part of the effort required to protect confidential and sensitive information.

I’m sure any attacker would love to find an easily-compromised tenant where they can gain control over accounts that have access to both badly managed SharePoint Online sites that hold sensitive information and Microsoft 365 Copilot to help the attackers find that information. Badly-managed and easily-compromised Microsoft 365 tenants do exist, but it is my earnest hope that companies who invest in Microsoft 365 Copilot have the common sense to manage their tenants properly.

Uploading SharePoint and OneDrive Files to ChatGPT

Personally speaking, I'm much more concerned about users uploading sensitive or confidential information to OpenAI for ChatGPT to process. The latest advice from OpenAI describes how the process works for their Deep Research product. Users might like this feature because they can have their documents processed by AI. However, tenant administrators and anyone concerned with security or compliance might have a different perspective.

I covered the topic of uploading SharePoint and OneDrive files to ChatGPT on March 26 and explained that the process depends on an enterprise Entra ID app (with app id e0476654-c1d5-430b-ab80-70cbd947616a) to gain access to user files. Deep Research is different and its connector for SharePoint and OneDrive is in preview, but the basic principle is the same: a Graph-based app uploads files for ChatGPT to process. If that app is blocked (see my article to find out how) or denied access to the Graph permission needed to access files, the upload process doesn’t work.
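As a sketch of the blocking technique, disabling the service principal for the enterprise app (using the app id quoted above) stops the upload process. Update-MgServicePrincipal does the work:

# Disable the enterprise app used by the ChatGPT connector
Connect-MgGraph -Scopes "Application.ReadWrite.All"
$Sp = Get-MgServicePrincipal -Filter "appId eq 'e0476654-c1d5-430b-ab80-70cbd947616a'"
if ($Sp) {
    Update-MgServicePrincipal -ServicePrincipalId $Sp.Id -AccountEnabled:$false
    Write-Host ("Blocked app {0}" -f $Sp.DisplayName)
} else {
    Write-Host "App not found - the connector hasn't been consented to in this tenant"
}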

Set Your Priorities

I suggest that it’s more important to block uploading of files from a tenant to a third-party AI service where you don’t know how the files are managed or retained. It certainly seems like a more pressing need than worrying about the potential of an attacker using Microsoft 365 Copilot to run riot over SharePoint, even if a penetration test company says that this can happen (purely as a public service, and not at all to publicize their company).

At least, that’s assuming user accounts are protected with strong multifactor authentication…


Microsoft 365 Copilot Gets Viva Insights Service Plans
https://office365itpros.com/2025/05/19/microsoft-365-copilot-license-sp/ (19 May 2025)

Two Workplace Analytics Service Plans to Enable Viva Insights

Microsoft message center notification MC1009917 (last updated 25 April 2025, Microsoft 365 roadmap item 471002) announced the inclusion of Viva Insights in the Microsoft 365 Copilot license. The mechanism used is the addition of two “Workplace Analytics” service plans to join the existing eight service plans (Table 1) that make up the Copilot license. More information is available in the documentation for the Copilot features made available by these service plans.

Service Plan | Service Plan SKU | Service Plan Part Number
Microsoft Copilot with Graph-grounded chat (Biz Chat) | 3f30311c-6b1e-48a4-ab79-725b469da960 | M365_COPILOT_BUSINESS_CHAT
Microsoft 365 Copilot in Productivity Apps | a62f8878-de10-42f3-b68f-6149a25ceb97 | M365_COPILOT_APPS
Microsoft 365 Copilot in Microsoft Teams | b95945de-b3bd-46db-8437-f2beb6ea2347 | M365_COPILOT_TEAMS
Power Platform Connectors in Microsoft 365 Copilot | 89f1c4c8-0878-40f7-804d-869c9128ab5d | M365_COPILOT_CONNECTORS
Graph Connectors in Microsoft 365 Copilot | 82d30987-df9b-4486-b146-198b21d164c7 | GRAPH_CONNECTORS_COPILOT
Copilot Studio in Copilot for Microsoft 365 | fe6c28b3-d468-44ea-bbd0-a10a5167435c | COPILOT_STUDIO_IN_COPILOT_FOR_M365
Intelligent Search (Semantic search and dataverse search) | 931e4a88-a67f-48b5-814f-16a5f1e6028d | M365_COPILOT_INTELLIGENT_SEARCH
Microsoft 365 Copilot for SharePoint | 0aedf20c-091d-420b-aadf-30c042609612 | M365_COPILOT_SHAREPOINT
Workplace Analytics (backend) | ff7b261f-d98b-415b-827c-42a3fdf015af | WORKPLACE_ANALYTICS_INSIGHTS_BACKEND
Workplace Analytics (user) | b622badb-1b45-48d5-920f-4b27a2c0996c | WORKPLACE_ANALYTICS_INSIGHTS_USER

Table 1: Microsoft 365 Copilot Service Plans

The last update from Microsoft said that updates to add the Viva Insights service plans completed in mid-April 2025.

Viva Insights and Microsoft 365 Copilot

According to Microsoft, access to Workplace Analytics means that “IT admins and analysts can tailor advanced prebuilt Copilot reports with their business data or create custom reports with organizational attributes, expanded Microsoft 365 Copilot usage metrics, and more granular controls.” The data is exposed in Viva Insights (web), the Viva Insights Teams app (Figure 1), and the Viva Insights mobile apps.

Figure 1: Copilot Dashboard in the Viva Insights Teams app

Everyone running a Copilot deployment is intimately aware of the need to track and understand how people use AI in different apps. The API behind the Copilot usage report in the Microsoft 365 admin center delivers sparse information. It’s possible to enhance the usage report data with audit data and use the result to track down people who don’t make use of expensive licenses, but that requires custom code. Hence the insights reported in the Copilot Dashboard in Viva Insights.

A note in the announcement says that access to the Copilot Dashboard now requires a minimum of 50 Viva Insights (Copilot) licenses. As is obvious from Figure 1, my tenant has fewer than 50 licenses but can still use Viva Insights because it's not a new tenant.

What Service Plans Do

As you’re probably aware, a license (product, or SKU) is something that Microsoft sells to customers. A service plan enables or disables specific functionality within a license. For example, the Copilot license includes the Copilot Studio in Copilot for Microsoft 365 service plan, which in turn allows users to create agents in Copilot Studio. If you don’t want people to be able to access Copilot Studio, you can disable the service plan.

Disabling a service plan can be done by updating a user’s licenses through the Microsoft 365 admin center. Options are available to do this through User Accounts or License Details (Figure 2).

Figure 2: Amending service plans for a user’s Microsoft 365 Copilot license

If you use group-based licensing, you can amend the options for the Copilot license to remove service plans. However, this affects every user in the group, so you might end up with one group to assign “full” Copilot licenses and another to assign “restricted” licenses.

Be Careful When Disabling Copilot Service Plans

One potential issue with some Copilot service plans is that you’re never quite sure what removing a service plan will do. Removing the Microsoft 365 Copilot in Productivity Apps service plan seems straightforward because it disables the Copilot options in the Office desktop apps (all platforms). But disabling the Intelligent Search service plan will mess up any app that uses Copilot to search.

Blocking Copilot Studio is problematic. Removing the service plan only removes the ability of a user to sign in to use Copilot Studio. They can still sign in for a 60-day trial, just like anyone else with an email address who doesn’t have a Copilot Studio license.

Disabling Copilot Service Plans with PowerShell

Disabling service plans through a GUI can rapidly become tiresome. I wrote a PowerShell script (downloadable from GitHub) to demonstrate how to use the Set-MgUserLicense cmdlet from the Microsoft Graph PowerShell SDK to disable a Copilot service plan. Another variation on removing service plans is explained here.
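Before looking at the full script, here's a condensed sketch of the core operation: disabling the Copilot Studio service plan (the GUID from Table 1) while leaving the rest of the license intact. The SKU part number is assumed to be Microsoft_365_Copilot, so verify it with Get-MgSubscribedSku before relying on this:

Connect-MgGraph -Scopes "User.ReadWrite.All"

# Find the Copilot SKU and the Copilot Studio service plan to disable
$CopilotSku = Get-MgSubscribedSku | Where-Object {$_.SkuPartNumber -eq "Microsoft_365_Copilot"}
$CopilotStudioPlan = "fe6c28b3-d468-44ea-bbd0-a10a5167435c"

# Reassign the license with the service plan disabled (a production script
# should merge this with any plans already disabled for the user)
$LicenseUpdate = @(@{ SkuId = $CopilotSku.SkuId; DisabledPlans = @($CopilotStudioPlan) })
Set-MgUserLicense -UserId "Lotte.Vetler@office365itpros.com" -AddLicenses $LicenseUpdate -RemoveLicenses @()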

The script checks for group-based license assignment for Copilot licenses and if found, creates an array of excluded accounts that it won’t process. It then scans for accounts with a Microsoft 365 Copilot license and if the account isn’t excluded, runs Set-MgUserLicense to disable the Copilot Studio service plan. It’s just an example of using PowerShell to automate a license management operation and is easily amended to process any of the Copilot service plans. Enjoy!!


Stay updated with developments across the Microsoft 365 ecosystem by subscribing to the Office 365 for IT Pros eBook. We do the research to make sure that our readers understand the technology. The Office 365 book package includes the Automating Microsoft 365 with PowerShell eBook.

Penetration Test Asks Questions About Copilot Access to SharePoint Online
https://office365itpros.com/2025/05/12/copilot-for-microsoft-365-pentest/ (12 May 2025)

Can Attackers Use Copilot for Microsoft 365 to Help Find Information?

An article by a UK-based security penetration test company titled “Exploiting Copilot AI for SharePoint” drew my attention, so I looked to see what weaknesses the testing had found. I was disappointed. Although the article makes some good points, it doesn't reveal anything new about the potential issues that can arise due to poor protection of information stored in SharePoint Online sites. Let's discuss the points raised in the article.

A Compromised Account

Copilot for Microsoft 365 always works as a signed-in user. Before an attacker can use Copilot for Microsoft 365, they must be able to sign into a licensed user's account. In other words, that account is compromised. That's bad for a tenant because any compromise can lead to data loss or other damage, and it's probably indicative of other problems that attackers can exploit without going near Copilot.

Organizations should protect themselves with strong multifactor authentication (MFA). That message seems to be slowly getting through, and you’d imagine that any tenant willing to invest in Copilot is also willing to protect themselves by insisting that all accounts are protected by MFA.

Seeking Sensitive Information

The authors make a good point that people often store sensitive information in SharePoint Online. Attackers like to search for information about passwords, private keys, and sensitive documents. Copilot undoubtedly makes it much easier for attackers to search, but I don’t think that the default site agents create any vulnerability because these agents are constrained to searching within the sites they belong to.

Custom agents might be more problematic, but that depends on the information accessed by the agents. It also depends on the attacker being able to run the custom agents from the compromised account. The big thing to remember here is that Copilot can only access data available to the account being used. Custom agents in the hands of an attacker can't automagically get to some hidden data. Anyway, organizations should monitor the creation of agents and have some method to approve the use of those agents.

Accessing Password Data

The penetration team reported that they had found an interesting file (an encrypted spreadsheet) that appeared to contain passwords. SharePoint blocked access to the file because “all methods of opening the file in the browser had been restricted.” This sounds like SharePoint's block download policy was in operation for the site. However, Copilot was able to fetch and display the passwords stored in the file.

It’s likely that the spreadsheet was “encrypted” using the default Excel protection applied when a user adds a password to a spreadsheet. However, the encryption is no match for Microsoft Search, which can index the information in the file, and that’s what Copilot for Microsoft 365 Chat was able to display (Figure 1).

Figure 1: Copilot for Microsoft 365 reveals some passwords stored in a password-protected Excel worksheet

Excel's encryption is very poor protection in the era of AI. Sensitivity labels should be used to secure access to sensitive information, specifically labels that do not allow Copilot to extract and display information from files found through Microsoft Search. Even better, use the DLP policy for Microsoft 365 Copilot to completely hide sensitive files from Copilot so that not even the file metadata is indexed.

Alternatively, use Restricted Content Discovery (RCD) to hide complete sites from casual browsing by attackers (or anyone else looking for “interesting” information). Apart from RCD, Microsoft makes other SharePoint Advanced Management (SAM) features available to Microsoft 365 Copilot tenants. There's no excuse for failing to use the access control and reporting features to secure sensitive sites.
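Enabling RCD for a site is a one-line operation in SharePoint Online PowerShell. This sketch assumes the RestrictContentOrgWideSearch parameter currently used for RCD and an illustrative site URL:

# Hide a sensitive site from tenant-wide search and Copilot (RCD)
Connect-SPOService -Url "https://office365itpros-admin.sharepoint.com"
Set-SPOSite -Identity "https://office365itpros.sharepoint.com/sites/Finance" -RestrictContentOrgWideSearch $true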

Copilot for Microsoft 365 is a Superb Seeker

Copilot for Microsoft 365 is superb at finding information stored in SharePoint Online and OneDrive for Business. With good prompting, an attacker with access to a compromised account can retrieve data faster than ever before, and unlike previous methods of trawling through SharePoint files, Copilot access doesn’t leave breadcrumbs like entries in the last files accessed list.

Copilot access can be constrained by making sure that suitable permissions are in place for documents, deploying the DLP policy for Microsoft 365 Copilot, and limiting access to confidential sites through Restricted Content Discovery. The DLP policy and RCD are recent Copilot control mechanisms that I don’t think the authors of the penetration test report considered (even though they refer to blocking agents with RCD). But available mechanisms are worthless unless implemented, and the real value of reports like this is to prompt administrators to use available tools, including MFA to reduce the likelihood of a compromised account.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

How to Enhance Copilot Usage Data
https://office365itpros.com/2025/05/09/copilot-usage-data-accounts/ (9 May 2025)

Combine Copilot Usage Data with User Account Details to Gain Better Insight for Deployments

Discussing the usage data that’s available for Microsoft 365 Copilot (in the Microsoft 365 admin center and via a Graph API), a colleague remarked that it would be much easier to leverage the usage data if it contained the department and job title for each user. The usage data available for any workload is sparse and needs to be enhanced to be more useful.

Knowing what data sources exist within Microsoft 365 and how to combine sources with PowerShell or whatever other method you choose is becoming a valuable skill for tenant administrators. I've been down this path before to discuss combining usage data with audit data to figure out which user accounts aren't using expensive Copilot licenses. Another example is combining Entra ID account information with MFA registration methods to generate a comprehensive view of user authentication settings.

Scripting a Solution

In this instance, the solution is very straightforward. Use a Graph API call (complete with pagination) to download the latest Copilot usage data. Then find the set of user accounts with a Microsoft 365 Copilot license and loop through the set to match each user account with its usage data. Report what's found (Figure 1).

Figure 1: Copilot usage data combined with user account details
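The first step, downloading the usage data, might look like this sketch. The URI shown is the beta Copilot usage report endpoint, so check the documentation for the current path:

# Page through the Copilot usage detail report for the last 90 days
$Uri = "https://graph.microsoft.com/beta/reports/getMicrosoft365CopilotUsageUserDetail(period='D90')?`$format=application/json"
[array]$UsageData = @()
Do {
    $Data = Invoke-MgGraphRequest -Method GET -Uri $Uri
    $UsageData += $Data.Value
    $Uri = $Data.'@odata.nextLink'
} While ($Uri)

# Index usage records by user principal name for fast matching against accounts
$Lookup = @{}
$UsageData | ForEach-Object { $Lookup[$_.userPrincipalName] = $_ }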

Obfuscated Data and Graph Reports

The thing that most people trip over is matching usage data with user accounts. This is impossible if your tenant obfuscates (anonymizes) usage data. This facility has been available since late 2020 and if the obfuscation setting is on in the Microsoft 365 admin center, all usage data, including the data used by the admin center and Graph API requests is “de-identified” by replacing information like user principal names and display names with a system-generated string.

It's therefore important to check the setting and reverse it if necessary for the duration of the script to make sure that you can download “real” user information. If you don't, there's no way of matching a value like FE7CC8C15246EDCCA289C9A4022762F7 with a user principal name like Lotte.Vetler@office365itpros.com.
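Checking and reversing the setting can be scripted with the Graph adminReportSettings resource (the ReportSettings.ReadWrite.All permission is assumed):

# Check if usage data is obfuscated and turn the setting off temporarily
$Settings = Invoke-MgGraphRequest -Method GET -Uri "https://graph.microsoft.com/v1.0/admin/reportSettings"
if ($Settings.displayConcealedNames -eq $true) {
    Invoke-MgGraphRequest -Method PATCH -Uri "https://graph.microsoft.com/v1.0/admin/reportSettings" `
        -Body @{ displayConcealedNames = $false }
    # Remember to set displayConcealedNames back to $true after the report runs
}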

Fortunately, I had a lot of code to repurpose, so the script wasn’t difficult to write. You can download the complete script from the Office 365 for IT Pros GitHub repository.

Finding Areas for Focus

Getting back to the original question, I assume the idea of including job titles and departments with Copilot usage data is to figure out where to deploy assistance to help people understand how to use Copilot in different apps. You could do something like this to find the departments with Copilot users who have no activity in the report period (90 days).

# Assumes $Report holds the records created by the script for users with no Copilot activity
$GroupedReport = $Report | Group-Object -Property Department | ForEach-Object {
    [PSCustomObject]@{
        Department = $_.Name
        UserCount  = $_.Group.Count
    }
}

$GroupedReport | Sort-Object -Property Department | Format-Table -AutoSize

Department               UserCount
----------               ---------
Analysis and Strategy            3
Business Development             1
Core Operations                 57
Editorial                        1
Group HQ                         1
Information Technology           3
Marketing                       22
Planning & Action                1
Project Management               1
Research and Development         1

With this kind of output, the team driving Copilot adoption and use for the organization would be wise to spend some time with the Core Operations and Marketing departments to ask why so many of their users don’t appear to be using Copilot.

As noted above, understanding how to use PowerShell to mix and match data sources to answer questions is a valuable skill. There’s lots of data available in a Microsoft 365 tenant. That data is there to be used!


Need some assistance to write and manage PowerShell scripts for Microsoft 365? Get a copy of the Automating Microsoft 365 with PowerShell eBook, available standalone or as part of the Office 365 for IT Pros eBook bundle.

How Microsoft 365 Copilot Tenants Benefit from SharePoint Advanced Management
https://office365itpros.com/2025/05/06/sharepoint-advanced-management-2/ (6 May 2025)

Ignite Announcement About SAM for Copilot Customers Misinterpreted by Many

At the Ignite 2024 conference, Microsoft announced that “Microsoft 365 Copilot will now include built-in content governance controls and insights provided by SharePoint Advanced Management.” At the time, the assumption (still broadly believed) was that Microsoft would provide customers with Microsoft 365 Copilot licenses with SharePoint Advanced Management (SAM) licenses. Maybe even a single SAM license would be sufficient to license SAM technology alongside Copilot. That's not the case.

If you've been waiting for a SAM license to appear in your tenant, you'll be disappointed and won't see SAM listed in the set of tenant subscriptions. Don't be swayed by the banner in the SharePoint Online admin center announcing that your SharePoint Advanced Management subscription is enabled (Figure 1). It's not. Access to SAM features is granted through a check performed in code for the presence of Copilot. The necessary update is now broadly available to customers.

Figure 1: SharePoint Advanced Management options in the SharePoint admin center

SAM Features for Microsoft 365 Copilot Customers

The facts are laid out in the SAM documentation. Customers with eligible Copilot licenses can use some, but not all, SAM functionality without a SAM license. Here’s the list:

  • Site Lifecycle Policy
    • Inactive SharePoint sites policy
    • Site Ownership Policy
  • Data Access Governance (DAG) Insights
    • “Everyone Except External Users” (EEEU) insights
    • Sharing Links and Sensitivity Labels
    • PowerShell: Permission state report for SharePoint and OneDrive Sites, and Files
    • Sharing links report
  • Site Access Review
  • Restricted Content Discovery (RCD – enabled via PowerShell)
  • Restricted Access Control (RAC) for SharePoint and OneDrive for Business.
  • Recent Admin Actions and Change History
  • Block Download Policy
    • SharePoint and OneDrive sites
    • Teams recordings

There’s some good stuff here, particularly Restricted Content Discovery (RCD), the Site Lifecycle Policy to manage inactive sites, and the Block download policy. Every tenant with Microsoft 365 Copilot should consider enabling RCD to block Copilot access to sites containing sensitive Office and PDF files and sites containing old and obsolete material (the digital rot or debris that clutters up so many tenants).

The problem with Copilot reusing sensitive material in its responses is obvious. The issue with Copilot reusing old, obsolete, and potentially misleading content in its responses is equally problematic, especially if human checks don’t catch errors in responses. Copilot doesn’t know when a Word document written ten years ago is outdated and inaccurate. All Copilot sees is words that can be processed and reused.

When SAM is Needed

All of which brings me to a point where a SAM license is required. In my case, I wanted to test the “extend SharePoint protections with a default sensitivity label” feature. The idea here is to make sure that unlabeled files receive protection when downloaded by applying a sensitivity label with equivalent rights to those enjoyed by site users. Defining a default sensitivity label for a document library already requires an Office 365 E5 license or equivalent. Why this slight extension wanders into the need to have SAM is another example of bizarre Microsoft licensing.

The documentation notes that Copilot can’t currently open files with sensitivity labels applied in this manner. This means that Copilot cannot extract the protected content to use in its responses because it doesn’t have the right to do so. However, Copilot can search the metadata of labeled files and show that metadata to those who perform searches. Restricted Content Discovery is the right way to block Copilot access to files.

Anyway, without a SAM license, I can’t test. Do I want to pay Microsoft for a license for the privilege of testing their software? I don’t think so.

Copilot in Word for iOS

In closing, I attempted to use a new feature in Word for iOS (and Android) to dictate some notes for this article for Copilot to reason over and produce a draft. The feature is covered in MC1060866 (23 April 2025) and deployment has begun, which I guess is why I could use it. The dictation part worked, even if some of my words were misunderstood (Figure 2). But any attempt to have Copilot do some magic failed utterly. I guess AI can't help me…

Figure 2: Dictating text in Word for iOS for Copilot to process


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365.

Microsoft Extends DLP Policy for Copilot to Office Apps
https://office365itpros.com/2025/05/05/dlp-policy-for-copilot2/ (5 May 2025)

Same DLP Policy for Copilot Used to Block BizChat

On May 1, Microsoft announced that the public preview of the DLP policy for Microsoft 365 Copilot is effective for the Office apps (MC1059677, 21 April 2025, Microsoft 365 roadmap item 423483). The new functionality is an extension of the DLP policy introduced in March 2025. At that time, the policy only covered Microsoft 365 Copilot Chat (BizChat). Its extension to cover the Office apps (desktop and web) is logical, even if the implementation is different. We’ll get to what those differences are shortly.

How the DLP Policy for Copilot Works

As a quick refresher, the DLP policy for Copilot works by checking if a file is assigned a specific sensitivity label. If true, the Copilot functionality built into the app is limited and the content of the file cannot be used in Copilot responses, such as creating a document summary.

Apps are responsible for checking if a DLP policy is active within the tenant and what sensitivity labels are associated with the policy, so the announcement marks the inclusion of the necessary code in the Office apps to check for the DLP policy. I tested with Microsoft 365 Enterprise Apps version 2504 (build 18730.20122).

Like any other DLP policy, the policy can have multiple rules. In this case, rules for the DLP policy for Copilot block access for a sensitivity label, so if you want to block access for multiple sensitivity labels, the DLP policy has a separate rule for each label. If you created the DLP policy for Copilot to use with BizChat, you don’t need to do anything to extend the policy to cover the Office apps.
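To see what a tenant's policy covers, the rules can be listed with Security and Compliance PowerShell. The policy name below is illustrative, so use whatever name was given when the policy was created:

# List the rules (one per blocked sensitivity label) in the Copilot DLP policy
Connect-IPPSSession
Get-DlpComplianceRule -Policy "DLP Policy for Microsoft 365 Copilot" |
    Format-Table Name, ContentContainsSensitiveInformation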

Using the DLP Policy for Copilot in Word

As an example, I created a Word document and tested that all the Copilot functionality worked as expected. I saved the document and reopened it to force Copilot to generate the automatic summary.

I then applied one of the sensitivity labels covered by a rule in the DLP policy for Copilot and tried out some of the Copilot features. As you can see from Figure 1, the automatic summary was not removed (but the summary cannot be updated), and asking Copilot to explicitly summarize the document fails because “your organization’s policy doesn’t allow it.” However, it looks like Copilot can query the content of the document to answer questions in chat.

Figure 1: Copilot in Word with a DLP policy for Copilot block in effect

In their announcement, Microsoft says that “Copilot actions like summarizing or auto-generating content directly in the canvas are blocked.” They also say that chatting with Copilot is also blocked, but as you can see in Figure 1, Copilot answered a predefined question (“What is the purpose of DLP for M365 Copilot”) quite happily. On the other hand, if you go to the Message Copilot section and input the same question, Copilot refuses to answer. The block on chat worked in the web app but not always in the desktop version of Word (but this is preview software, so some bugs are expected).

Finally, Copilot cannot reference a file protected by one of the sensitivity labels covered by the DLP policy (an action that forces Copilot to extract the content of the referenced document).

Maybe Just Turn Copilot Off

I’ve used Copilot for nearly two years, and I was initially confused by the effect the DLP policy for Copilot has on the Office apps. To me, it would be simpler and more understandable to disable Copilot completely for documents within the scope of the DLP policy. I would remove the Copilot button from the menu bar and make sure that no UI elements that expose a Copilot feature, like the automatic summary, appear. Right now, the UI is a confusing mishmash of features that work and features that don’t, and it needs to be cleaned up.


Learn how to exploit the data available to Microsoft 365 tenant administrators through the Office 365 for IT Pros eBook. We love figuring out how things work.

Copilot’s Solution to Fix Grammar and Spellings https://office365itpros.com/2025/04/25/fix-grammar-and-spelling-errors/?utm_source=rss&utm_medium=rss&utm_campaign=fix-grammar-and-spelling-errors https://office365itpros.com/2025/04/25/fix-grammar-and-spelling-errors/#respond Fri, 25 Apr 2025 07:00:00 +0000 https://office365itpros.com/?p=69025

Fix Grammar and Spelling Problems with a Single Click

I am indebted to Paul Robichaux for pointing out a notable example of where Copilot for Word could help authors rewrite text to be clearer and more concise. The offending article is in the Microsoft Graph documentation for a topic called “enhanced personalization control overview.” It’s part of a set of controls for Microsoft 365 features such as name pronunciation and user pronouns in an area that’s been under development for several years. The text is turgid, unclear, and difficult to understand (Figure 1).

Fix grammar and spelling errors.
Figure 1: This text is perfectly clear, concise, and highly readable

The Copilot Rewrite

I asked Copilot for Word to make the original text more concise and understandable. Copilot produced this text:

Microsoft 365 Copilot customizes itself for each user, assisting with daily tasks based on their work knowledge. Both tenant administrators and users can manage this personalization using advanced control features.

Discover how Microsoft 365 Copilot enhances personalization, ensures privacy, and provides control through improved settings.

The corrected text is still not good. Human intervention is necessary to explain how Copilot “can become personalized to each user” (original text) or how Copilot “customizes itself for each user” (revision), and so on. It is a stark warning of the danger of accepting AI-generated text without question. Not only can customer communications end up containing absolute rubbish, but incorrect and misleading AI-generated text can also end up stored in a file and reused ad nauseam by Copilot when it generates responses to future user prompts.

You might wonder why the Microsoft writers did not ask Copilot to refine their text. Well, I do not know if they did or not, but it could be that because the text describes a brand-new feature, Copilot could not find anything better to say in the Graph or in its LLMs. Remember, generative text depends on what has gone before. Copilot can rehash material it knows about, but it cannot write material about a new topic from scratch.

The Copilot Promise to Fix Grammar and Spelling Errors

Which brings me neatly to message center notification MC1060868 (23 April 2025, Microsoft 365 roadmap item 483954), which promises a new Copilot “fix grammar and spellings” feature that will address all grammar and spelling problems found in text with a single click. General availability of the feature is due in late April 2025 with deployment scheduled to complete worldwide by mid-June 2025.

Microsoft doesn’t say what languages are supported, but I assume that the feature will appear in all the languages supported by Copilot. MC1060868 contains no detail about which Copilot apps will benefit. Copilot for Word is an obvious target, and I assume that Copilot for Outlook will also receive help to tidy up email communications. As to the other apps, I guess we will see after the feature arrives.

It is a logical progression to have a single-pass process to find and remedy common errors in documents. Word has options to check for spelling and grammar errors as users type text into documents. The difference here is that Word suggests and nudges people when it detects potential errors, whereas Copilot will go ahead and rewrite text to remove errors. It is then up to the user to decide whether to keep or discard the Copilot rewrite. Overall, Copilot’s one-click solution is a more proactive approach to helping people generate better text.

But is it Possible to Fix Grammar and Spelling with One Click?

That is, if everything works. The history of software designed to help people write better text is littered with dead ends. Does anyone pay much attention to the recommendations of Microsoft Editor? Why do people continue to subscribe to services like Grammarly when Microsoft offers spelling and grammar checking in its products? Perhaps we are heading to a new golden age of beautiful text created by humans and enhanced by AI. Maybe, and I am sure the prospect will be welcomed by those who write the Graph documentation. But I am not holding my breath.


Make sure that you’re not surprised about changes that appear inside Microsoft 365 applications by subscribing to the Office 365 for IT Pros eBook. Our monthly updates make sure that our subscribers stay informed.

How SharePoint Online Restricted Content Discovery Works https://office365itpros.com/2025/04/02/restricted-content-discovery-works/?utm_source=rss&utm_medium=rss&utm_campaign=restricted-content-discovery-works https://office365itpros.com/2025/04/02/restricted-content-discovery-works/#comments Wed, 02 Apr 2025 07:00:00 +0000 https://office365itpros.com/?p=68682

Restricted Content Discovery Hides SharePoint Content from Copilot and Agents

The problem of poor permission management has surfaced from time to time in the history of SharePoint. The Office Delve app caused the last big upheaval within Microsoft 365 when it demonstrated an uncanny ability to surface sensitive documents to user view. Of course, Delve was never the problem. The issue is due to careless permission assignment, usually at site level.

When Microsoft launched Copilot in March 2023, it soon became apparent that Copilot is even better than Delve at finding and reusing documents, including files that an organization would prefer to remain restricted. Microsoft’s short-term answer was Restricted SharePoint Search, a horrible but expedient solution based on an allow list that restricts enterprise search to approved sites. Copilot always works as the signed-in user, so the limits applied to users also apply to Copilot, stopping the AI from using material stored in unapproved sites in its responses.

Restricted Content Discovery (RCD) is the latest solution to control unfettered access to confidential information stored in SharePoint Online sites. RCD is part of the SharePoint Advanced Management (SAM) suite. Microsoft is making SAM available to tenants with Microsoft 365 Copilot licenses via a code update that’s slowly deploying.

How Restricted Content Discovery Works

Restricted Content Discovery works by adding a flag to files stored in designated SharePoint Online sites. Administrators mark a site for RCD through the SharePoint admin center or PowerShell; Figure 1 shows the “restrict content from Microsoft 365 Copilot” option in the admin center. When a site is selected for RCD, SharePoint sets a site-level property that causes index updates for every file in the site. Although RCD is applied on a per-site basis, SharePoint indexing happens at the file level, so a fan-out process must find and reindex every file in a site before RCD becomes effective for that site.

The time required to update the index for a site is highly dependent on the number of items in the site. Microsoft says that “for sites with more than 500,000 items, the Restricted Content Discovery update could take more than a week to fully process and reflect in search and Copilot.”

Setting the Restricted Content Discovery flag for a SharePoint Online site.
Figure 1: Setting the Restricted Content Discovery flag for a SharePoint Online site

The indexing update does not remove items from the tenant index. If it did, items would be unavailable for eDiscovery searches, auto-label policies for retention and sensitivity labels, and other solutions. Instead, the flag set on files instructs Copilot to ignore those files when it consults the Graph to find matching content to help ground user prompts. The same approach is used by the Data Loss Prevention (DLP) policy to block Copilot access to files assigned specific sensitivity labels.

The block applies anywhere Copilot for Microsoft 365 can use SharePoint Online files, including Copilot agents. It doesn’t affect how site-level search works, nor does it interfere with other Purview solutions like eDiscovery, content searches, or DLP. However, content from sites enabled for RCD doesn’t appear in enterprise-level searches.

RCD Management with PowerShell

PowerShell can be used to manage RCD for sites. Make sure that you use a recent version of the SharePoint Online management module (I used Microsoft.Online.SharePoint.PowerShell version 16.0.25715.12000). For example, to enable RCD for a site, run the Set-SPOSite cmdlet to set the RestrictContentOrgWideSearch property to $true.

Set-SPOSite -Identity https://office365itpros.sharepoint.com/sites/rabilling -RestrictContentOrgWideSearch $true

To remove RCD from a site, set the value for RestrictContentOrgWideSearch to $false:

Set-SPOSite -Identity https://office365itpros.sharepoint.com/sites/rabilling -RestrictContentOrgWideSearch $false
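Enabling RCD site by site soon becomes tedious if many sites must be covered, so a simple loop helps with bulk updates. This sketch assumes a CSV file (C:\Temp\RCDSites.csv) with a URL column listing the target sites:

# Enable Restricted Content Discovery for every site listed in a CSV file
# The CSV path and its URL column are assumptions for this example
$Sites = Import-Csv -Path "C:\Temp\RCDSites.csv"
ForEach ($Site in $Sites) {
    Set-SPOSite -Identity $Site.URL -RestrictContentOrgWideSearch $true
    Write-Host ("Restricted Content Discovery enabled for {0}" -f $Site.URL)
}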

Much the same reindexing process must occur when RCD is disabled for a site after being enabled: files only become available to Copilot again after the reindexing completes.

To generate a list of sites with RCD enabled, run the Start-SPORestrictedContentDiscoverabilityReport command to create a job on a queue for processing. The Get-SPORestrictedContentDiscoverabilityReport cmdlet reports the status of the job, which eventually reaches “Completed.”

Start-SPORestrictedContentDiscoverabilityReport

Generating the report will take some time. Are you sure you want to proceed?
Continue with this operation?
[Y] Yes  [N] No  [?] Help (default is "Y"): y

RunspaceId           : 1d839c7e-c0bf-4c11-be94-20179f2335e2
Id                   : 02aa91ea-5e12-43de-91a1-a58275d3b201
CreatedDateTimeInUtc : 03/31/2025 16:09:52
Status               : NotStarted

Get-SPORestrictedContentDiscoverabilityReport

RunspaceId           : 1d839c7e-c0bf-4c11-be94-20179f2335e2
Id                   : 02aa91ea-5e12-43de-91a1-a58275d3b201
CreatedDateTimeInUtc : 03/31/2025 17:03:52
Status               : Completed

To download the RCD insights report, run the Get-SPORestrictedContentDiscoverabilityReport cmdlet and pass the GUID (id) for the report. This value is shown in the Get-SPORestrictedContentDiscoverabilityReport output:

Get-SPORestrictedContentDiscoverabilityReport -Action Download -ReportId 02aa91ea-5e12-43de-91a1-a58275d3b201
Report RestrictedContentDiscoverabilityReport_1743437651407.csv downloaded successfully

Microsoft documentation says that “the downloaded report is located on the path where the command was run.” This is incorrect. The file ends up in whatever folder the PowerShell session starts up in. In my case, I ran the job when positioned in c:\temp and the file ended up in c:\windows\system32. The easy fix here is to use a PowerShell profile to define the folder where PowerShell starts up.
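For anyone who hasn’t created a profile before, the fix takes a couple of lines. This snippet assumes that C:\Temp is the preferred working folder:

# Create the PowerShell profile if it doesn't exist, then set a default working folder
If (-not (Test-Path -Path $PROFILE)) {
    New-Item -ItemType File -Path $PROFILE -Force | Out-Null
}
Add-Content -Path $PROFILE -Value 'Set-Location -Path C:\Temp'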

The contents of the “insights” report aren’t too exciting (Figure 2) and could easily be generated by looping through sites with PowerShell to find those with the flag set (see the sketch after Figure 2).

Restricted Content Discovery is enabled for these sites
Figure 2: Restricted Content Discovery is enabled for these sites
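Here’s a rough sketch of that loop. It fetches each site individually because some properties are only returned when a site is requested directly; whether RestrictContentOrgWideSearch is populated might depend on the module version:

# Find sites with Restricted Content Discovery enabled without running the report job
[array]$RCDSites = Get-SPOSite -Limit All | ForEach-Object {
    $Site = Get-SPOSite -Identity $_.Url
    If ($Site.RestrictContentOrgWideSearch) { $Site }
}
$RCDSites | Format-Table Url, Title -AutoSize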

Restricted Content Discovery for All

It’s a reasonable guess that any Microsoft 365 tenant that’s interested in Copilot has some sensitive information stored in SharePoint Online sites. If you’re in this situation, you should consider RCD as the front-line method to prevent that information from leaking out through Copilot. I’d also deploy the DLP policy to restrict Copilot access as a backup. Between the two lines of defense, it’s unlikely that inadvertent disclosure of confidential data will happen, and that’s a good thing.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Copilot in Outlook Gets a Revamp https://office365itpros.com/2025/03/21/copilot-for-outlook-ui/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-for-outlook-ui https://office365itpros.com/2025/03/21/copilot-for-outlook-ui/#respond Fri, 21 Mar 2025 07:00:00 +0000 https://office365itpros.com/?p=68524

Tweaks to Copilot for Outlook Make the Functionality More Accessible

On Tuesday, I reported that I thought the new Facilitator agent in Teams chat is a good example of AI performing a task well. It’s evidence of how the initial rush of deploying AI everywhere to anything that could have a Copilot label applied is moderating into better implementations.

Message center notification MC892651 (last updated 18 March 2025, Microsoft 365 roadmap item 397092) could be regarded as being in the same category. In this case, the UI for Copilot interactions in Outlook has what Microsoft terms “major design improvements” for the new Outlook on Windows and Mac desktops, OWA, and Outlook mobile clients. Outlook classic remains unaltered.

Perhaps because it involves major improvements or a wide range of clients, the deployment of the update has been delayed. Microsoft originally intended to have full deployment done by late February 2025. That date is now late April 2025. When this happens, it normally means that Microsoft had to halt the deployment to fix some problems.

No New Functionality in Revamped UI

According to Microsoft, the revamped UI doesn’t include any new functionality. I never saw the ‘rewrite like a poem’ option before (which might have improved some of my email enormously), so the fact that the new layout and navigation make this option accessible (Figure 1) is proof that the overhaul works.

The revamped Copilot for Outlook UI in the new Outlook for Windows.
Figure 1: The revamped Copilot for Outlook UI in the new Outlook for Windows

Of course, things work differently on mobile devices, but the changes seem to make things better there too (Figure 2).

Copilot for Outlook mobile.
Figure 2: Copilot for Outlook mobile

By comparison, the Copilot options in Outlook classic are a tad austere (Figure 3), just like the options in the other clients before the change. The changes made in the other clients prove once again that good design is important when it comes to making technology accessible to users.

Copilot options in Outlook classic.
Figure 3: Copilot options in Outlook classic

UI Great, Text Awful

I like the UI changes and think they improve how Copilot for Outlook works. However, the changes do nothing to improve the quality of the written text generated by Copilot, which remains bland and overly effusive for my taste. I guess that’s my personal approach to email shining through because I favor brief, to-the-point messages over lengthy missives.

The late Mark Hurd (CEO of HP at the time) once advised me to always put the most important information in a message into the first paragraph so that recipients could quickly review items in their inbox without needing to read long messages on mobile devices (Blackberries and iPAQs then). Technology has moved on, but the advice is still true, especially as so many different forms of mobile devices are now in use. Maybe Copilot for Outlook needs a “rewrite in one brief paragraph” option.

More Change to Come

Although it sometimes seems much longer, we’re still only two years into the Copilot era. We’ll see more changes like this as Microsoft refines and enhances how Copilot is integrated into apps. Now that they’ve given Outlook a nice new UI, perhaps they’ll do the same for Excel and PowerPoint to make it easier to use Copilot in those apps. Or maybe that’s just me moaning because I’m not as proficient as I should be with those apps.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Use Data Loss Prevention to Stop Microsoft 365 Copilot Chat from Processing Documents in Its Responses https://office365itpros.com/2025/03/20/dlp-policy-for-microsoft-365-copilot/?utm_source=rss&utm_medium=rss&utm_campaign=dlp-policy-for-microsoft-365-copilot https://office365itpros.com/2025/03/20/dlp-policy-for-microsoft-365-copilot/#comments Thu, 20 Mar 2025 07:00:00 +0000 https://office365itpros.com/?p=68504

DLP Policy for Microsoft 365 Copilot to Restrict Access to Sensitive Documents

Ever since the introduction of Microsoft 365 Copilot in March 2023, organizations have struggled to stop the AI consuming confidential or sensitive documents in its responses. Some of the early tools, like Restricted SharePoint Search, were blunt instruments hurried out as responses to customer requests. Microsoft’s current best answer is SharePoint Restricted Content Discovery (RCD), a feature normally licensed through SharePoint Advanced Management (SAM). All tenants with Microsoft 365 Copilot licenses are due to receive access to RCD and the deployment process is ongoing.

Microsoft says that the key use case for RCD is to “prevent accidental discovery of [files stored in] high-risk sites.” RCD works by limiting the ability of end users to search selected sites. By excluding sites from search, RCD prevents Copilot Chat (and agents based on Copilot Chat) from using the files stored in those sites in its responses. It’s still possible for Copilot to use information from a sensitive document if the user has the file opened in an app like Word. At this point, the sensitive content is open in memory and available for Copilot to process.

Blocking files from user access doesn’t stop system functions like eDiscovery working.

Update April 21: MC1059677 announces the extension of DLP protection to Copilot in Office apps (Word, PowerPoint, Outlook, and Excel).

Blocking Access to Individual Files

RCD is a good way to cast a protective net across multiple sites. But what about protecting individual files that might be in sites that aren’t covered by RCD? Until now, the answer has been to use sensitivity labels to stop Copilot Chat from using sensitive files to generate its responses. Although sensitivity labels can stop Copilot from using the content of protected files, they cannot prevent Copilot from finding references to protected files through a metadata search.

Creating a DLP Policy for Microsoft 365 Copilot

A solution to that problem might be coming in the form of a new type of Data Loss Prevention (DLP) policy. The feature is described in message center notification MC937930 (last updated 6 February 2025, Microsoft 365 Roadmap ID 423483). DLP policies are usually used to block external sharing of confidential information, like Teams meeting recordings. Blocking files for internal consumption is a new step.

Essentially, tenants can create a DLP policy to check for specific sensitivity labels and block Copilot Chat (and agent) access to files with those labels. The functionality is now in preview and is scheduled for general availability in June 2025 (complete worldwide by the end of July 2025). Some gaps are always expected in preview code, and the gaps right now include alerts, incident reports, policy simulation, and audit records. In other words, it’s very hard to know when a DLP policy match happens to block access. But testing indicates that the DLP policy works.

The DLP policy for Microsoft 365 Copilot is a special form of policy in that the policy only covers Copilot and no other type of data (Figure 1).

Creating a DLP policy for Microsoft 365 Copilot.
Figure 1: Creating a DLP policy for Microsoft 365 Copilot

The rules used in a DLP policy for Microsoft 365 Copilot are simple. The policy checks if a file has a specific sensitivity label, and if the sensitivity label is found, DLP executes the action to “prevent Copilot from processing content” (Figure 2). A rule can check for the presence of one or more sensitivity labels. In some respects, it might be easier to create a separate rule for each label.

Creating a DLP rule for Microsoft 365 Copilot.
Figure 2: Creating a DLP rule for Microsoft 365 Copilot
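Because alerts and audit records aren’t available in the preview, it’s worth confirming that the rules in the policy are what you expect. The Get-DlpComplianceRule cmdlet from Security & Compliance PowerShell shows the conditions; the policy name used here is an assumption:

# Review the rules defined in the DLP policy for Microsoft 365 Copilot
Connect-IPPSSession
Get-DlpComplianceRule -Policy "DLP Policy for Microsoft 365 Copilot" |
    Format-List Name, Disabled, ContentContainsSensitiveInformation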

Testing the DLP Policy for Microsoft 365 Copilot

To test the new DLP policy, I created several documents referring to regulations governing cryptocurrency in Iceland (a topic selected at random because I knew that my tenant was unlikely to store any files relating to the topic). I used Copilot for Word to generate the text for each file and added a reference to a mythical regulation to the text of each document to give Copilot an easy target to find. The first check asked Copilot Chat to find documents relating to cryptocurrency in Iceland with special relevance to the regulation. The sensitivity labels assigned to the documents were not covered by a DLP policy for Microsoft 365 Copilot, and Copilot found all the documents (Figure 3).

Copilot finds confidential documents without sensitivity labels monitored by a DLP policy.
Figure 3: Copilot finds confidential documents without sensitivity labels monitored by a DLP policy

After applying sensitivity labels covered by the DLP policy for Microsoft 365 Copilot to two of the three documents, the search was rerun and Copilot found only one document (Figure 4).

The DLP policy for Microsoft 365 Copilot blocks files protected by specific sensitivity labels.
Figure 4: The DLP policy for Microsoft 365 Copilot blocks files protected by specific sensitivity labels

I don’t claim this to be a full test. However, it’s the only way to check preview software that doesn’t generate audit records or other traces to show when DLP policy matches force the defined actions to execute.

New DLP Policy Shows Promise

I’ll look forward to retesting the DLP policy for Microsoft 365 Copilot after the software reaches GA and the full array of auditing and reporting options is available. Auto-label policies can only apply sensitivity labels to Office files and PDFs, and I suspect that this limitation won’t be lifted. That’s a pity because it stops the DLP policy from controlling access to items like the .MP4 files used for Teams meeting recordings (transcripts).

The nice thing is that no trace of a sensitive document shows up in Microsoft 365 Copilot Chat. Unlike basic sensitivity label protection, which allows Copilot Chat to show metadata found in its searches, the DLP policy is silent. And that’s just the way you’d want it to be when dealing with sensitive data.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Why Microsoft 365 Copilot Works for Some and Not for Others https://office365itpros.com/2025/02/20/make-copilot-useful/?utm_source=rss&utm_medium=rss&utm_campaign=make-copilot-useful https://office365itpros.com/2025/02/20/make-copilot-useful/#comments Thu, 20 Feb 2025 07:00:00 +0000 https://office365itpros.com/?p=68101

I Can’t Wait for Agentic Experiences to Make Copilot Useful

We’re all on a journey to understand how to use artificial intelligence effectively to improve systems, lives, and human existence. If you pay for the necessary licenses, Copilot is everywhere within the Microsoft 365 ecosystem, both as helpers deployed in desktop apps like Word, Teams, and PowerPoint, and as custom agents that tenants can develop and deploy, albeit without the necessary tools to manage the potentially thousands of agents created by citizen developers.

According to Microsoft CEO Satya Nadella, Microsoft wants to make it as simple for people to create agents as it is to create an Excel worksheet, which might mean the creation of the “highly customized agentic experiences” referred to in Microsoft 365 message center notification MC985480 (January 22). I don’t quite know what that phrase means, and the clarifying text that said it “means you can design unique prompts, connect to any LLM, and integrate these custom agents with Microsoft 365 Copilot” wasn’t much help either. When I asked Copilot, it struggled with the concept too (Figure 1). In any case, I’m sure that we’ll all be happy in our highly customized agentic world when it arrives.

Copilot attempts to define highly customized agentic experiences.
Figure 1: Copilot attempts to define highly customized agentic experiences

Why Today’s AI Falls Short of its Hype

All of which brings me to a thoughtful article in the Tomorrow’s Blueprint blog entitled “Why Others Think AI Is a Miracle But You Think It’s Useless.” The author is Microsoft product manager Abram Jackson, now deeply involved in the development of Microsoft 365 Copilot. The core of the article is an assertion that:

“Today’s AI falls short of its hype for many due to three big reasons:

  • It often doesn’t have the data it needs to work with
  • Defining tasks precisely is very difficult
  • There’s little AI can do other than give you text or images.”

Abram knows much more about AI than I do. I reckon that he has captured the problems faced by many organizations as they consider how to extract value from a potentially massive investment in Copilot licenses.

Without access to data, Copilot can do nothing. The magic of Microsoft 365 Copilot, if some exists, is the Microsoft Graph, or access to the documents, emails, and Teams messages stored within Microsoft 365. Yet the legacy of some older Microsoft decisions around collaboration strategy forced organizations to restrict SharePoint Search to stop Copilot revealing information to anyone who asked. As it turns out, it is hard to stop Copilot using data because even document metadata can reveal secrets.

I like the way Abram discusses the issue of defining tasks. Math works because the answer is either right or wrong. Copilot works very well when given well-defined tasks to do, like summarizing a meeting transcript or extracting tasks for people to consider. The same goes for scanning an email thread or summarizing a Word document. Generating text is less satisfactory unless the user is very precise in their prompt and grounds Copilot with some suitable input, like documents to work from. The promise of early demos where Copilot generated project reports and other material in the blink of an eye is never attained where loose prompting gives the AI free rein to indulge itself.

How People Need to Use AI

The summary is that to extract value from AI (and Microsoft 365 Copilot in particular), users must:

  • Understand if a task is valuable and not prone to hallucinations. Asking Copilot for Word to scan a document and decide if it is well-structured and how to make improvements is valuable for many people who aren’t natural writers. Asking Copilot for Word to generate the initial document introduces the possibility of hallucinations.

  • Work to define the task precisely. Asking Copilot to do something very precisely with clear boundaries and guidelines will generate much better results than dashing off a quick prompt. Grounding a prompt with some relevant information, like several pertinent documents, will always help Copilot to generate better information.

  • Translate the result generated by the AI into the form you need it to be. For chat, the introduction of Copilot pages has proven useful because it allows users to easily capture the output generated by Copilot for reuse. But will the slides generated by Copilot for PowerPoint be the type you need? Or can Copilot for Excel really perform the computations you want? Of course, they can, but only with practice and perseverance on the part of the human.

As Abram says, this approach “isn’t natural and it is time-consuming.” It comes about because Copilot is essentially an eager assistant that wants to work but will do stupid things unless you tell it precisely what to do and how to do it. Expanding on the example shown in Figure 1, adding context and direction to the prompt gives Copilot the chance to deliver a much better answer (Figure 2). Prompts can now be up to 128,000 characters, so there’s lots of room for comprehensive instructions.

Make Copilot useful by giving the AI better and more detailed instructions. It's more likely to come up with a good answer.
Figure 2: Make Copilot useful by giving the AI better and more detailed instructions

The Bing Conundrum

One last point about data being available for Copilot to work with. I’m not sure about Abram’s statement that “hallucination is largely a solved problem for Microsoft Copilot.” I see odd stuff generated all the time. Abram justifies his claim by saying that “Copilot is trained to only respond with information it has been able to find through search.”

Copilot depends on Bing, and Bing isn’t very good at searching. Take this website. Despite the ease with which Google has indexed and searched all my articles for years, Bing stubbornly refused to touch the site. I only discovered this fact when creating some declarative agents that used office365itpros.com as a source. Since then, the best efforts of WordPress support and my own attempts to navigate the online Bing webmaster advice have only just persuaded Bing to start indexing some pages. Some of the blocks are quite silly. One problem that caused Bing to refuse to index pages was the lack of an alt tag for a graphic in a sidebar.

If Copilot had better search facilities, it could generate better answers because it has better data to work with.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft Launches Copilot for All Initiative https://office365itpros.com/2025/01/16/microsoft-365-copilot-chat-jan25/?utm_source=rss&utm_medium=rss&utm_campaign=microsoft-365-copilot-chat-jan25 https://office365itpros.com/2025/01/16/microsoft-365-copilot-chat-jan25/#comments Thu, 16 Jan 2025 07:00:00 +0000 https://office365itpros.com/?p=67692

New Agent Capabilities for the Free Microsoft 365 Copilot Chat App

Infused with the zealotry of true believers, Microsoft announced Copilot for All on January 15, 2025 to reveal the details of the complicated Copilot renaming they previewed in December. And the new logo, of course.

In a nutshell, Microsoft is creating an “on-ramp” to give Microsoft 365 tenants that haven’t invested in expensive Microsoft 365 Copilot licenses the chance to use agent technology “grounded in Microsoft Graph data.” The idea here is to encourage commercial customers to run a mix of Copilot, with some users having the full-blown licensed version while others experiment with the free-to-use version. Figure 1 shows the relative capabilities of the two Copilot options.

Functionality available in the two Microsoft 365 Copilot products.
Figure 1: Functionality available in the two Microsoft 365 Copilot products (source: Microsoft)

Lots of Functionality in Microsoft 365 Copilot Chat

The free-to-use Microsoft 365 Copilot Chat app includes a lot of functionality in terms of its ability to process user prompts against information available on web sites (providing those sites are indexed by Bing). Recently, Microsoft added features like Copilot pages and the image generator (Figure 2). Microsoft says that limitations exist on the number of images that can be generated daily. I guess I don’t create many images as I haven’t experienced any problems.

Generating an image in Microsoft 365 Copilot Chat.
Figure 2: Generating an image in Microsoft 365 Copilot Chat

The Chat client has enterprise data protection, so data is secure and protected, and actions are audited and captured in compliance records.

Pay-as-you-go Agents

The big news is that customers will be able to create and run custom agents grounded against “work data” on a pay-as-you-go (PAYG) metered basis. PAYG means the tenant must sign up for an Azure subscription with a valid credit card before the agent will run. Agent activity is charged against the subscription using “messages” as the metering unit (an action performed by an agent can consume up to 25 messages). Grounding against work data means that the agents can interrogate information available in the Microsoft Graph. Technically speaking, Graph data includes Exchange, Teams, SharePoint, and OneDrive plus anything imported into the Graph through a third-party connector. However, the capabilities of today’s agents are limited to SharePoint and OneDrive sites plus Graph connectors. In any case, there is some magic here to exploit because if an organization can import its data into the Graph, agents can reason over that data to create responses to user prompts, providing PAYG is set up for the tenant.

The custom agents are developed with Copilot Studio. I have spent some time working with Copilot Studio to build simple agents over the last few weeks. It’s not a terribly difficult task, but organizations do need to take the time to chart out how they plan to develop, deploy, and manage agents rather than rushing headlong into the brand-new world. Like any software, agents work best when some structure is in place.

The Big Differences between Microsoft 365 Copilot Chat and Microsoft 365 Copilot

Paying for agents to use Graph data does not deliver the full suite of capabilities enjoyed by those who invest in Microsoft 365 Copilot licenses. Figure 1 shows that Microsoft 365 Copilot includes a bunch of personal assistants where Copilot is built into Microsoft 365 apps like Teams, Word, Outlook, PowerPoint, and Excel. Sometimes, as in the case of the automatic document summary generated by Copilot in Word, the help is unwanted, but the personal assistants are very good at helping with other tasks, like summarizing long email threads or recapping Teams meetings.

Microsoft 365 Copilot also includes SharePoint Advanced Management (SAM). However, although Microsoft announced at Ignite 2024 that tenants with Microsoft 365 Copilot licenses would get SAM in early 2025, there’s no trace of these licenses turning up in any tenant that I have access to. License management can be complex and I’m sure that SAM will turn up soon.

Finally, PAYG access to Graph data does not include the semantic index. The index is generated automatically from Graph data in tenants with Microsoft 365 Copilot licenses to create a vector-based index of the relationships of items in the Graph. It’s an untrue urban legend that Microsoft 365 Copilot needs the semantic index to function. The semantic index enhances search results, but it’s not required for the chat app or agents to work.

In Simple Terms, Two Copilot Products

It’s easy to become confused by the naming of different elements within the Microsoft 365 Copilot ecosystem. It boils down to Microsoft offering free (with PAYG capabilities) and expensive Copilot products to Microsoft 365 customers. Microsoft obviously hopes that the free version will act as the on-ramp to full-fledged Copilot. It’s a reasonable tactic. Time will tell if it’s successful.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering the Microsoft 365 ecosystem.

The Confusing Renaming of Microsoft 365 Copilot https://office365itpros.com/2024/12/20/microsoft-365-copilot-rename/?utm_source=rss&utm_medium=rss&utm_campaign=microsoft-365-copilot-rename https://office365itpros.com/2024/12/20/microsoft-365-copilot-rename/#comments Fri, 20 Dec 2024 07:00:00 +0000 https://office365itpros.com/?p=67476

Microsoft 365 Copilot Rename Means Exactly What?

By now, I’m sure that people understand that Microsoft has two chat apps available for Microsoft 365 users:

  • Microsoft Copilot, which is limited to making queries against the Microsoft LLMs. The app is available without a license to anyone who signs in with their Entra ID account, which is why it’s sometimes referred to as Microsoft Copilot (for users with Entra accounts). This app started as Bing Chat Enterprise before the Copilot branding team applied their magic. To be fair, the addition of enterprise data protection to Microsoft Copilot in September 2024 improved the app greatly.
  • Microsoft 365 Copilot, which can include Graph content (data stored in SharePoint Online, OneDrive for Business, Teams, and Exchange Online) in its queries against the Microsoft LLMs (the Graph content “grounds” the queries). This app is also called BizChat, and I use that name for the remainder of this article. User accounts must hold a $360/year Microsoft 365 Copilot license before they can use BizChat.

The naming used for these apps and the Microsoft 365 Copilot suite (a Copilot for every app, like Copilot in Word, Copilot in Teams, Copilot in Outlook, etc.) has evolved since the original launch in March 2023. In that time, probably far too many brain cells have been sacrificed to keep up with Microsoft’s marketing machinations as they seek to drive Copilot deep into the consciousness of Microsoft employees and customers alike.

The January 2025 Change

Message center notification MC958903 (16 December 2024) marks yet another turn in the naming game. In mid-January 2025, Microsoft will introduce changes “to simplify the user experience.”

  • Microsoft Copilot becomes Microsoft 365 Copilot Chat. The app will be able to use Copilot agents for the first time. Agents that access web content are free, but agents that access work data (Graph data) must be paid for on a pay-as-you-go (metered consumption) basis.
  • The current Microsoft 365 app, which includes a Copilot icon to access Copilot in its navigation bar, becomes Microsoft 365 Copilot, complete with a new M365Copilot.com URL to “make it easier to discover.” Depending on their licensing status, the Copilot icon brings people to either Microsoft 365 Copilot Chat or BizChat. The app will receive a UI makeover to “support future AI-first experiences” like exposing Copilot Pages. The changes are detailed in MC958905 and include a new icon that I thoroughly dislike (see Figure 1).
Microsoft 365 Copilot rename
Figure 1: The January 2025 changes for the Microsoft 365 Copilot rename (source: Microsoft)

All of this was discussed at the Ignite 2024 conference in Chicago last month. I paid little attention at the time because I ignored most of the marketing fluff from the conference, preferring to wait to see the details emerge. If you’re interested, the keynote is still online, complete with a very brief mention of a rename (Figure 2).

Microsoft EVP Rajesh Jha describes the wonders of Microsoft 365 Copilot
Figure 2: Microsoft EVP Rajesh Jha describes the wonders of Microsoft 365 Copilot

The Confusion Between Product and App

I dislike renaming Microsoft Copilot to be Microsoft 365 Copilot Chat because it complicates what should be a simple differentiation between users who have Microsoft 365 Copilot licenses and those who do not. Once you apply the Microsoft 365 brand to an app, a certain implication exists that the app has something to do with Microsoft 365 and enjoys some access to Microsoft 365 content (which it doesn’t have).

I guess the chat app that can’t access Microsoft 365 content has some relationship with Microsoft 365 because it’s available through the Microsoft 365 Copilot app, but the connection is tenuous at best and difficult to grasp for people who don’t track the minutiae of changes within the service. It took me several readings of MC958903 before the details sunk in. I suspect that I am not alone.

I’m sure that Microsoft will point to its fabled telemetry to justify the decision. They always do. However, I think this is more of the “let’s brand everything with the <insert latest product du jour> name” tactic seen in the past with Windows, Office, and .Net. The problem is that telemetry seldom highlights the potential for future confusion of the sort that’s likely when this change emerges.

Tiring Pace of Branding Changes

Everyone understands that Microsoft is making a big bet to be the leader in AI. Microsoft is spending a ton of money to build that leadership, including a reported $19 billion spend in their Q4 FY24 results. But the constant mantra of Copilot everywhere is starting to wear. It will be a relief when the tsunami subsides and we can all get back to productive work, with or without Copilot’s assistance.


Make sure that you’re not surprised about changes that appear inside Microsoft 365 applications by subscribing to the Office 365 for IT Pros eBook. Our monthly updates make sure that our subscribers stay informed.

How Microsoft Copilot Generates Compliance Records https://office365itpros.com/2024/11/07/microsoft-copilot-interactions/?utm_source=rss&utm_medium=rss&utm_campaign=microsoft-copilot-interactions https://office365itpros.com/2024/11/07/microsoft-copilot-interactions/#respond Thu, 07 Nov 2024 07:00:00 +0000 https://office365itpros.com/?p=66948

Microsoft 365 Substrate Captures Interaction Details for Microsoft Copilot

After writing about how to use the Microsoft Graph PowerShell SDK to analyze the interactions between users and Microsoft 365 Copilot in various apps, I was asked if the code reports interaction records for Microsoft Copilot. This is the free version of Copilot that appears in the Microsoft 365 app when a signed-in Entra ID user account doesn’t have a Microsoft 365 Copilot license.

The big difference between the free and paid-for version is that Microsoft 365 Copilot can use Graph queries to find email, Teams messages, and documents to ground its queries while Microsoft Copilot is limited to Microsoft’s LLMs and Bing web searches. In addition, Microsoft 365 Copilot comes with extra features, such as custom Copilot agents for SharePoint Online.

Both versions support enterprise data protection (EDP). Microsoft added support for EDP to Microsoft Copilot in August 2024 and the announcement specifically says that information about prompts and responses is retained for eDiscovery purposes.

Asking Microsoft Copilot

My first step to gather information was to ask Microsoft Copilot if it generates interaction compliance records. Figure 1 shows the negative response.

Microsoft Copilot responds to a query about interaction compliance records
Figure 1: Microsoft Copilot responds to a query about interaction compliance records

Looking Behind the Scenes

As Microsoft Copilot couldn’t answer the question, it was time to look behind the scenes. I figured that the Microsoft 365 substrate would store anything it captured for Microsoft Copilot interactions in the same hidden TeamsMessagesData folder in the user’s mailbox.

Some are curious why Microsoft selected TeamsMessagesData as the storage location for these records. It doesn’t really matter what folder is used if it’s hidden and indexed for eDiscovery, but I think Microsoft chose TeamsMessagesData because the Copilot chats are very much like regular Teams one-on-one chats. The substrate captures Teams compliance records for one-on-one chats in the same folder.

MFCMAPI is the best tool to investigate mailbox contents. After using Microsoft Copilot several times, I opened the TeamsMessagesData folder with MFCMAPI and discovered that the substrate had captured compliance records for the Copilot interactions. Figure 2 shows the record captured for the prompt shown in Figure 1.

A prompt captured for a Microsoft Copilot interaction
Figure 2: A prompt captured for a Microsoft Copilot interaction

Once I located the compliance records, it was easy to update the PowerShell script to extract and report the Microsoft Copilot interactions. The updated code is available from GitHub.
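The core of the update is straightforward: find the hidden folder, then fetch the items it holds. A minimal sketch with the Microsoft Graph PowerShell SDK might look like the code below; the mailbox address is an assumption (with the delegated Mail.Read scope, it must be the signed-in account), and the full script on GitHub adds paging and parsing of the records:

# Locate the hidden TeamsMessagesData folder and list the records it holds
Connect-MgGraph -Scopes Mail.Read
$UserId = 'user@office365itpros.com'   # assumed mailbox (the signed-in account)
$Folder = Get-MgUserMailFolder -UserId $UserId -IncludeHiddenFolders "true" `
   -Filter "displayName eq 'TeamsMessagesData'"
$Records = Get-MgUserMailFolderMessage -UserId $UserId -MailFolderId $Folder.Id -All
$Records | Select-Object Subject, CreatedDateTime | Format-Table -AutoSize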

No Data Shown for Responses

I noticed that compliance records captured for Microsoft Copilot responses do not include the response in the Body and BodyPreview properties. The same is true for responses generated for Microsoft 365 Chat (BizChat). Looking back through records for Microsoft 365 Chat interactions, it appears that the only output captured is any documents located by Copilot to form its response. In Figure 3, we see a reference to a document in a Microsoft 365 Chat response followed by some base64-encoded text.

A Microsoft 365 Copilot response including some referenced documents
Figure 3: A Microsoft 365 Chat response including some referenced documents

Inputting the encoded text into an online decoder reveals the text (Figure 4). It looks like whatever routine Microsoft uses to generate the compliance record doesn’t decode the text before it’s written into the mail item used to store the record in TeamsMessagesData.

A base64 decoder reveals the full text for a Microsoft 365 Chat response
Figure 4: A base64 decoder reveals the full text for a Microsoft 365 Chat response
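An online decoder isn’t strictly necessary because PowerShell can decode base64 locally. Here’s a quick sketch (the sample string decodes to “This is a test”; UTF-8 is an assumption, so try [System.Text.Encoding]::Unicode if the output looks mangled):

# Decode base64-encoded text extracted from a compliance record
$Encoded = 'VGhpcyBpcyBhIHRlc3Q='   # paste the captured text here
[System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String($Encoded))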

The encoded state of the information also explains why the Activity Explorer in the AI Hub in the Purview portal can’t display Copilot’s response to a prompt (Figure 5).

The AI Hub's Activity Explorer fails to display a Microsoft 365 Chat response
Figure 5: The AI Hub’s Activity Explorer fails to display a Microsoft 365 Chat response

Summarizing Microsoft Copilot and Compliance Records

The answer to the question is that compliance records are generated for Microsoft Copilot interactions. However, the information logged in the compliance records isn’t as easy to access as it should be. The flaw shared by Microsoft Copilot and Microsoft 365 Chat suggests that some buggy code is shared by the two apps. It should be easy for Microsoft to decode responses from base64 before including clear text in compliance records.

The issue is reported, but quite when a fix will appear is anyone’s guess. Hopefully, because the problem means that compliance records aren’t as useful as they should be, the fix should appear soon.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Create a Custom Copilot Agent for SharePoint Online https://office365itpros.com/2024/10/31/copilot-agents-sharepoint/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-agents-sharepoint https://office365itpros.com/2024/10/31/copilot-agents-sharepoint/#comments Thu, 31 Oct 2024 01:00:00 +0000 https://office365itpros.com/?p=66871

Copilot Agents Rolling Out to Targeted Release Tenants

On October 23, 2024, Microsoft published message center notification MC916296 (Microsoft 365 roadmap item 416297) to announce the rollout of Copilot agents in SharePoint Online to targeted release tenants. Worldwide deployment to targeted release tenants is due to finish in early November 2024 with general availability following to all tenants (with Microsoft 365 Copilot licenses) completing in late December 2024.

Microsoft included Copilot agents in SharePoint Online as part of their Wave 2 announcement on September 16, 2024. At the time, I thought that Copilot agents were the most interesting part of Wave 2. Copilot pages, another major part of the announcement, are a nice way to capture the output from Copilot queries, but having an agent automatically created for SharePoint sites to query just the content from that site seemed like more useful functionality. I was therefore very happy to see Copilot agents show up in my tenant.

Default Copilot Agent for Sites

When users have a Microsoft 365 Copilot license, they see a Copilot option in the navigation bar when a document library is open. Selecting Copilot opens the default agent for the site, which responds to user prompts by reasoning over Office documents and PDF files stored in the site. Limiting Copilot to a predefined set of files from a site stops Copilot using a wider search to find information in any file it can access across the tenant or through a web search if permitted by the tenant. It’s a way of getting a precise response from information held in a site.

Creating a Custom Copilot Agent

Site members with create and edit permissions (for a site owned by a Microsoft 365 group, any group member) can create a Copilot agent for an even more precise search. For instance, I store the source Word documents for every article that I write (including this one) in a document library in a SharePoint Online site. Using the Create a Copilot agent option, I created a custom Copilot agent to reason over the articles. The entire operation took less than a minute, which is kind of startling.

The Sources tab of the wizard selects the folders or files for Copilot to process (Figure 1). You can select the entire site or any of the folders or individual files from the site, including from any document library if the site includes more than the default document library. The name of the agent can be between 4 and 42 characters.

Defining the source content for a custom Copilot agent
Figure 1: Defining the source content for a custom Copilot agent

The Behavior tab allows you to tailor the sample prompts shown to users and how Copilot will respond. In Figure 2, I’ve changed the tone for the responses from professional to formal and modified one of the starter prompts.

Modifying the behavior of a custom Copilot agent
Figure 2: Modifying the behavior of a custom Copilot agent

After saving the agent, Copilot creates a file in the document library for the agent and adds the agent to the recently used list of agents (Figure 3). If you make a mistake with an agent, simply delete the file. The file is also used to share agents. For instance, you can create a sharing link for the agent and include it in email or a Teams chat. If the people who see the link have access to the documents processed by the agent, they can use the sharing link to access the agent.

Creating a custom Copilot agent creates a file in the document library
Figure 3: Creating a custom Copilot agent creates a file in the document library

The list of recently used agents includes agents from other sites. You don’t need to navigate to a specific site to use its agents because they can be invoked from elsewhere in SharePoint.

Using the agent is like any other Copilot interaction. You compose a prompt (question) and submit it to Copilot for processing. Copilot restricts its search to the set of files defined for the agent. Figure 4 shows a sample interaction where I asked Copilot to search for anything that I have written about Copilot Pages, and it duly found the Word document source for the published article.

Interacting with a custom Copilot agent
Figure 4: Interacting with a custom Copilot agent

Custom agents work very well for sites storing small to medium documents. Copilot doesn’t do so well with large documents. For example, I created a custom agent based on the folder holding the Word source documents for the chapters in the Office 365 for IT Pros eBook. Many of these files are over 50 pages long, and the agent couldn’t use the chapter files in its responses.

Missing Features

Microsoft says that the current release does not include the ability to interact with a Copilot agent in a Teams chat, nor does it include the extension to Copilot Studio to customize agents. Another missing feature is the ability for site owners to approve agents or to define a default agent for a site. Microsoft says that these features will be available later in 2024. However, they haven’t said if administrators will be able to control Copilot agents across the tenant, such as having PowerShell cmdlets to enable or disable the feature for selected sites.

The Advantage of Precise Searches

Since its debut, Microsoft 365 Copilot has been plagued by oversharing issues caused when Copilot responses include unexpected information. The source information is available to the signed-in user (which is why Copilot can access the content), usually as a result of flawed site permissions or overly generous sharing. Flawed information generated in documents can creep into other documents and end up polluting the Graph.

Copilot agents offer more precise responses. I anticipate these agents being very useful in sites that hold specific information like product documentation when you really don’t want results to be polluted by some random document found in a site that no one remembers.


Learn about using SharePoint Online and the rest of Office 365 by subscribing to the Office 365 for IT Pros eBook. Use our experience to understand what’s important and how best to protect your tenant.

Microsoft Says SMEs Can Benefit from Microsoft 365 Copilot https://office365itpros.com/2024/10/25/microsoft-365-copilot-sme/?utm_source=rss&utm_medium=rss&utm_campaign=microsoft-365-copilot-sme https://office365itpros.com/2024/10/25/microsoft-365-copilot-sme/#respond Fri, 25 Oct 2024 07:00:00 +0000 https://office365itpros.com/?p=66799

Take the Results Presented with a Pinch of Salt

What are we to make of Microsoft’s release of a new study into the effect of Microsoft 365 Copilot for small to medium businesses? The blog post on the topic appeared on October 17 and highlights some results reported by Forrester Consulting, whom Microsoft commissioned “to study the potential return on investment (ROI) of Microsoft 365 Copilot for SMBs.”

Microsoft 365 Copilot for SMEs

As the post says, the “results of the study are eye-opening,” with big claims for projected ROI of up to 353% and $1.2 million of projected benefits. Projected is the important word here because it means that the ROI and benefits are potential and not achieved, even if they make good headlines. The reported 6% increase in net revenue and 20% reduction in operating costs seem more attainable.

Doubts About Any Technology Report

Doubts surface every time that I read a report about the gains that companies can make if they would only deploy some new technology. I ask myself if the authors of the report understand the technology they’re writing about as deeply as they should. I ask if the companies covered in the report are hand-picked to make the technology look as good as it can be. I ask what direction Microsoft gave Forrester Consulting when they commissioned the report and how independent Forrester can be in what they write about. And I ask if the results gathered from the over 200 companies surveyed for the report are massaged in any way. All nagging doubts honed from years of experience as a consultant.

I’ve no doubt that Microsoft 365 Copilot can do a good job for some SMEs, especially companies backed by a partner who knows the Copilot technology and understands where the potholes are. For instance, the assertion that legal firms can save 50% of the time spent on contract reviews is believable because many contracts cover the same ground, and a Copilot agent built for the purpose can reason over a corpus of contracts when reviewing text for problems.

It’s also true that Copilot’s ability to summarize text in email, Teams chats, and documents is of great help to people returning to work after a vacation. Catching up by wading through a full inbox or hundreds of Teams chats is never fun, and Copilot absolutely can help by summarizing information and presenting what happened while people were away in a very digestible format.

No Mention of Microsoft 365 Copilot Flaws

But I worry that the report ignores the flaws we know to exist in Microsoft 365 Copilot. Some SMEs are great at organizing their information; others are not, and they succumb to the same kind of group/teams sprawl and accumulation of digital debris that happens in enterprise tenants. SMEs might not have the same training capabilities as exist in larger organizations, which can lead to bad habits like oversharing through sloppy site permissions.

As you might imagine, none of this is covered by the Forrester report. There’s no mention of why an SME might need to deploy Restricted SharePoint Search (or the newer but not yet available Restricted Content Discoverability capability), or deploy sensitivity labels to protect their most confidential documents from being reused by Copilot. There’s no comment about the way that errors can creep into user documents from Copilot responses and end up corroding the reliability of stored documents. These are real issues surrounding the introduction of generative AI to Microsoft 365.

Just a Marketing Tool to Sell Microsoft 365 Copilot Licenses to SMEs

Then I remember that the Forrester report is no more than a marketing tool designed to encourage SMEs with Microsoft 365 Business Basic, Microsoft 365 Business Standard, or Microsoft 365 Business Premium to pay for $360/user/year Microsoft 365 Copilot subscriptions. The companies covered in the report had up to 300 employees. At list price, Copilot licenses cost $108,000 annually for 300 employees. That’s a big investment for any SME.

But someone’s got to pay for the billions of dollars Microsoft is currently investing in AI, and a large percentage of the 400-million plus Office 365 installed base comes from the SME sector. If you work for an SME and are interested in Microsoft 365 Copilot, take the time to read the report, but do so with a large pinch of salt close at hand. Investing a large chunk of change in expensive software licenses without knowing exactly how you’ll achieve an ROI has never been a good business tactic.


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365.

Copilot’s Automatic Summary for Word Documents – https://office365itpros.com/2024/09/05/automatic-document-summary-word/ – Thu, 05 Sep 2024

Automatic Document Summary in a Bulleted List

Updated 4-Dec-2024

Last week, I referenced the update for Word where Copilot for Microsoft 365 generates an automatic summary for documents. This is covered in message center notification MC871010 (Microsoft 365 roadmap item 399921). Automatic summaries are included in Copilot for Microsoft 365 and Microsoft Copilot Pro (the version that doesn’t ground prompts using Graph data).

As soon as I published the article where I referred to the feature, it turned up in the latest channel update for Word. Figure 1 shows the automatic summary generated for a document (in this case, the source of an article).

 Copilot generates an automatic document summary
Figure 1: Copilot generates an automatic document summary

The summary is the same output as the bulleted list Copilot generates if you open the Copilot pane and ask Copilot to “summarize this doc.” Clicking the Ask a question button opens the Copilot pane with the summary prepopulated, ready for the user to delve deeper into the summary.

The summary is only available after a document is saved and closed. The next time someone opens the document, the summary pane appears at the top of the document and Copilot generates the summary. The pane remains at the top of the document and doesn’t appear on every page. If Copilot thinks it necessary (for instance, if more text is added to a document), it displays a Check for new summary button to prompt the user to ask Copilot to regenerate the summary.

Apart from removing the Copilot license from an account (in which case the summaries don’t appear), there doesn’t seem to be a way to disable the feature. You can collapse the summary, but it’s still there and can be expanded at any time.
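If removing the license is the route you choose, a minimal sketch using the Microsoft Graph PowerShell SDK looks something like this (the user principal name is a placeholder, and the SKU lookup assumes the tenant holds Microsoft 365 Copilot licenses):

# Sketch: remove the Copilot license from an account (UPN is a placeholder)
Connect-MgGraph -Scopes User.ReadWrite.All, Directory.Read.All -NoWelcome
$CopilotSku = Get-MgSubscribedSku | Where-Object SkuPartNumber -match "Microsoft_365_Copilot"
Set-MgUserLicense -UserId "Jane.Smith@contoso.com" -AddLicenses @() -RemoveLicenses @($CopilotSku.SkuId)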

Summarizing Large Word Documents

When Microsoft launched Copilot support for Word, several restrictions existed. For instance, Word couldn’t ground user prompts against internet content. More importantly, summarization could only handle relatively small documents. The guidance was that Word could handle documents with up to 15,000 words but would struggle thereafter.

This sounds like a lot, and it’s probably enough to handle a large percentage of the documents generated within office environments. However, summaries really come into their own when they extract information from large documents like contracts and plans. The restriction, resulting from the size of the prompt that could be sent to the LLM, proved to be a big issue.

Microsoft responded in August 2024 with an announcement that Word could now summarize documents of up to 80,000 words. In the announcement, Microsoft says that the new limit is four times greater than the previous limit. The new limit is rolling out for desktop, mobile, and browser versions of Word. For Windows, the increased limit is available in Version 2310 (Build 16919.20000) or later.

Processing Even Larger Word Documents

Eighty thousand words sounds like a lot. At an average of 650 words per page, that’s 123 pages filled with text. I wanted to see how Copilot summaries coped with larger documents.

According to this source, the maximum size of a text-only Word document is 32 MB. With other elements included, the theoretical size extends to 512 MB. I don’t have documents quite that big, but I do have the source document for the Office 365 for IT Pros eBook. At 1,242 pages and 679,800 words, including many figures, tables, cross-references, and so on, the file size is 29.4 MB.

Copilot attempted to generate a summary for Office 365 for IT Pros but failed. This wasn’t surprising because the file is so much larger than the maximum supported.

The current size of the Automating Microsoft 365 with PowerShell eBook file is 1.72 MB and spans 113,600 words in 255 pages. That’s much closer to the documented limit, and Copilot was able to generate a summary (Figure 2).

Automatic document summary generated for the Automating Microsoft 365 with PowerShell eBook.
Figure 2: Automatic document summary generated for the Automating Microsoft 365 with PowerShell eBook

Although the bulleted list contains information extracted from the file, it doesn’t reflect the true content of the document because Copilot was unable to send the entire file to the LLM for processing. The bulleted list comes from the first two of four chapters and completely ignores the chapters dealing with the Graph API and Microsoft Graph PowerShell SDK.

Summaries For Standard Documents

In early December 2024, Microsoft published documentation for Copilot in Word’s automatic document summary feature. Regrettably, the documentation didn’t include instructions about how to disable the feature on a per-user or tenant-wide basis. It looks like we’ll just have to cope with automatic summaries. At least the summaries work for regular Word documents of less than 80,000 words.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Using Company-wide Sharing Links with Copilot for Microsoft 365 – https://office365itpros.com/2024/07/02/company-wide-link-copilot/ – Tue, 02 Jul 2024

Why Some People Can’t Use Shared Files with Copilot for Microsoft 365

After reading the article about the new sensitivity label advanced setting to block access for Microsoft content services to confidential Office documents, a reader asked why some users can use some documents shared using company-wide links with Copilot for Microsoft 365 while others cannot. The situation seemed a little strange because it happened for documents shared with everyone in the organization. The problem couldn’t be due to a sensitivity label because the capability only just rolled out and is limited to the Office applications.

The answer is in Microsoft’s documentation for secure file sharing, which says: “Creating a People in your organization link will not make the associated file or folder appear in search results, be accessible via Copilot, or grant access to everyone within the organization. Simply creating this link does not provide organizational-wide access to the content. For individuals to access the file or folder, they must possess the link and it needs to be activated through redemption.”

In other words, sharing a file with everyone in your organization is only the first step in the process of making information available to Copilot for Microsoft 365. A company-wide sharing link that arrives in your inbox or is shared through a Teams chat is dormant until you redeem it by using the link. At that time, SharePoint Online checks that your account belongs to the organization to confirm your access to the file. If confirmed, the file joins the set of “shared with you” information, which makes it available to Copilot for Microsoft 365.

Testing Company-wide Sharing Links with Copilot

A simple test proves the point. Create a file that contains some information that’s unlikely to exist elsewhere within the company. In my case, I created a Word document about a fictional digital SLR camera called the Bunsen BX7. Now share the file with a company-wide link (Figure 1).

A company-wide sharing link.
Figure 1: A company-wide sharing link

After signing into another account, open Copilot for Microsoft 365 chat and attempt to find some information about the topic in the file. Copilot should return nothing because a Bing search of the internet and a Microsoft search of company resources available to the account turn up no mention of the topic. But if you now go and use the link to open the file, Copilot can find the information and use it in its responses.

Figure 2 shows a Copilot for Microsoft 365 chat session. The first prompt about the Bunsen BX7 turns up nothing and Copilot responds with some generic text about digital cameras. The second prompt is after redemption of the company-wide sharing link. Copilot is able to find the document and use the information in its response. You can see that the shared document is listed as a source for the response.

Copilot for Microsoft 365 chat uses a company-wide link.
Figure 2: Copilot for Microsoft 365 chat uses a company-wide link

The Desirability of Company-wide Links

The mystery of why some people can use shared documents with Copilot for Microsoft 365 is solved, but thoughts now turn to whether organizations should restrict the use of company-wide links for sensitive documents. The value of these links is that they allow anyone in the organization to access content. The downside is that it’s too easy to create and use company-wide links, which creates the temptation for people to use these links to share confidential files more widely than the organization wants.

To guide users away from company-wide links and toward sharing links for specific people, you can modify the SharePoint tenant configuration to make direct links the default option. Even better, you can update individual site settings to disable company-wide links (anyone links are also disabled). For example, the first command sets direct links as the tenant default; the second disables company-wide links for a specific site.

# Run Connect-SPOService -Url https://contoso-admin.sharepoint.com first
# Make direct (specific people) links the tenant-wide default
Set-SPOTenant -DefaultSharingLinkType Direct

# Disable company-wide sharing links (and anyone links) for a single site
$Site = "https://office365itpros.sharepoint.com/sites/BlogsAndProjects"
Set-SPOSite -Identity $Site -DisableCompanyWideSharingLinks Disabled
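To check which sites already block company-wide links, a quick report along these lines should do the job. This is a sketch: because Get-SPOSite -Limit All returns a trimmed set of properties, each site is fetched individually, which can be slow in large tenants.

# Sketch: report sites where company-wide sharing links are disabled
Get-SPOSite -Limit All | ForEach-Object {
    Get-SPOSite -Identity $_.Url
} | Where-Object {$_.DisableCompanyWideSharingLinks -eq "Disabled"} |
    Format-Table Url, DisableCompanyWideSharingLinks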

If your organization uses sensitivity labels, you could also consider applying a label that restricts access to a small group of users. That way, even if someone sends a document outside the organization as an email attachment, external recipients won’t be able to open it.

The Challenge of Managing Information in an AI World

The advent of AI assistants creates new information governance challenges for Microsoft 365 tenants. Slowly but surely mechanisms are being developed to help organizations cope and manage the potential for information leakage and misuse. Some Microsoft solutions are no more than sticking plasters to allow customers to progress their Copilot deployments, but overall, the situation seems to be improving. Let’s hope that the trend continues and the current AI hype lives up to its promise.


Better Copilot Audit Records and Copilot Chat Appears in Classic Outlook – https://office365itpros.com/2024/05/31/copilot-audit-records-resources/ – Fri, 31 May 2024

Copilot Audit Records Now Include Resources Used in Responses

In April 2024, I wrote about the appearance of audit events to capture details when Microsoft 365 applications call Copilot to process a user request (prompt). These events have an operation type of CopilotInteraction.

Since then, Microsoft announced progress in capturing records when people use Copilot in the Stream player to query video transcripts (MC720180, last updated 22 May 2024). It’s similar to the notification (also updated on 22 May 2024) that describes using Copilot to interact with meetings. In both cases, the important point is that the audit events generated for Copilot interactions capture details of resources accessed by Copilot when responding to user prompts (previously the AccessedResources property in the AuditData payload was empty).

Linked to the Change in Transcript Storage Location

Because Copilot depends on meeting transcripts to answer queries, meeting interactions are only possible when meetings are recorded with a transcript. As discussed last week, Teams is standardizing on OneDrive for Business storage for the MP4 files generated for meeting recordings and transcripts. Like many situations in Microsoft 365, developments reported in one message center notification are linked to what’s described in another, seemingly unconnected, update.

The change should be effective in most places now as Microsoft aims to complete worldwide deployment in early June 2024.

Updated Script to Handle Copilot Audit Records

To test the effectiveness of the change, I updated the script I wrote for the previous article (downloadable from GitHub) to support audit records generated by the Stream player and to pay more attention to the data recorded in the associated resources property. Figure 1 shows the output of the script as viewed through the Out-GridView cmdlet.

Copilot audit records capture the resources Copilot accesses
Figure 1: Copilot audit records capture the resources Copilot accesses

Please check out the updated script and let me know if it’s helpful or could be improved.
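For anyone who’d rather build a quick report than download the script, the basic pattern is straightforward. Here’s a minimal sketch (not the GitHub script itself); the shape of the CopilotEventData payload is an assumption on my part, so check the property names against real audit records before depending on them:

# Sketch: report Copilot interactions and the resources they accessed
# (assumes a connection via Connect-ExchangeOnline and an audit role)
$Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) -Operations CopilotInteraction -ResultSize 5000 -SessionCommand ReturnLargeSet
$Report = foreach ($Rec in $Records) {
    $Data = ($Rec.AuditData | ConvertFrom-Json).CopilotEventData   # payload shape assumed
    [PSCustomObject]@{
        Timestamp = $Rec.CreationDate
        User      = $Rec.UserIds
        App       = $Data.AppHost
        Resources = ($Data.AccessedResources.Name -join ", ")
    }
}
$Report | Out-GridView -Title "Copilot interactions"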

Copilot in Outlook Classic

Speaking of Copilot, for a long time Microsoft communicated the message that Copilot experiences would only be available in the new Outlook client (aka Monarch). This was no more than a thinly disguised ploy to drive adoption for Monarch, which still isn’t close to ready for consumption by corporate users.

In any case, message center notification MC794816 (21 May 2024, Microsoft 365 roadmap item 388753) reports the availability of the Copilot for Microsoft 365 chat experience for Outlook classic (Win32). This feature joins “Summarize,” the Copilot option that extracts the major points from an email thread (my second favorite Copilot feature after meeting summarization), and the option to have Copilot draft or revise message drafts. Microsoft will roll out Copilot for Microsoft 365 chat to Outlook classic in the current channel in June 2024.

Before anyone gets too excited, let me say that Copilot for Microsoft 365 chat in Outlook is the same application that’s accessible as a web application and in Teams. The only difference is that Copilot has an icon in the Outlook application bar and runs in the Outlook window (Figure 2). In other words, if you’re used to Copilot chat elsewhere, you’ll have no difficulty using it in Outlook, providing you have the necessary Copilot for Microsoft 365 license.

Outlook classic gets Copilot for Microsoft 365 chat
Figure 2: Outlook classic gets Copilot for Microsoft 365 chat

As you can see from Figure 2, chats generated in other instances of the client are available in Outlook.

Change, Change, and More Change

Change is ongoing within Microsoft 365. Some changes are dependent on other changes, such as Copilot audit records capturing associated resources for the Stream player. Others are the delivery of incremental functionality within an application. The trick is to keep an eye on what’s happening and to recognize what kind of change each message center notification represents. That’s sometimes hard to do based on the way Microsoft describes a change. Oh well, into every life a little rain must fall…


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Disabling Bits of Copilot for Microsoft 365 – https://office365itpros.com/2024/04/30/copilot-for-microsoft-365-service-plans/ – Tue, 30 Apr 2024

Exerting Control Over Individual Copilot for Microsoft 365 Components

No doubt inspired by the article explaining how to remove individual features (service plans) from Microsoft 365 licenses, a reader asked if it is possible to control where Copilot for Microsoft 365 functionality surfaces in different apps. There’s no GUI in the Microsoft 365 admin center to disable bits of Copilot for a tenant. You can disable apps belonging to the Copilot license for a user account (Figure 1), but the question is what apps are bundled with Copilot for Microsoft 365 and what happens if administrators disable the apps for users.

Copilot for Microsoft 365 apps for a user account.
Figure 1: Copilot for Microsoft 365 apps for a user account

The Copilot for Microsoft 365 Service Plans

Looking into the details of the Copilot for Microsoft 365 license with the Microsoft Graph PowerShell SDK, we discover that the product (SKU) identifier is 639dec6b-bb19-468b-871c-c5c441c4b0cb and that the license covers eight service plans. As you’ll recall, a service plan governs functionality within a license that can be enabled or disabled. The Microsoft 365 admin center refers to service plans as apps when displaying the license information for a user.

Here’s how to find the license detail with PowerShell:

# Find the Copilot SKU in the tenant and list its service plans
Connect-MgGraph -Scopes Directory.Read.All -NoWelcome
$CopilotSKU = Get-MgSubscribedSku | Where-Object SkuPartNumber -match "Microsoft_365_Copilot"
$CopilotSKU.ServicePlans | Format-Table ServicePlanName, ServicePlanId

ServicePlanName                    ServicePlanId
---------------                    -------------
COPILOT_STUDIO_IN_COPILOT_FOR_M365 fe6c28b3-d468-44ea-bbd0-a10a5167435c
M365_COPILOT_SHAREPOINT            0aedf20c-091d-420b-aadf-30c042609612
GRAPH_CONNECTORS_COPILOT           82d30987-df9b-4486-b146-198b21d164c7
M365_COPILOT_CONNECTORS            89f1c4c8-0878-40f7-804d-869c9128ab5d
M365_COPILOT_APPS                  a62f8878-de10-42f3-b68f-6149a25ceb97
M365_COPILOT_TEAMS                 b95945de-b3bd-46db-8437-f2beb6ea2347
M365_COPILOT_BUSINESS_CHAT         3f30311c-6b1e-48a4-ab79-725b469da960
M365_COPILOT_INTELLIGENT_SEARCH    931e4a88-a67f-48b5-814f-16a5f1e6028d

Table 1 summarizes the service plans included in the Copilot for Microsoft 365 license.

Service Plan Name                    User-Friendly Feature Name                             Service Plan Id
-----------------                    --------------------------                             ---------------
GRAPH_CONNECTORS_COPILOT             Graph Connectors in Microsoft 365 Copilot              82d30987-df9b-4486-b146-198b21d164c7
M365_COPILOT_INTELLIGENT_SEARCH      Intelligent Search (Semantic Index)                    931e4a88-a67f-48b5-814f-16a5f1e6028d
M365_COPILOT_BUSINESS_CHAT           Microsoft Copilot with Graph-grounded chat             3f30311c-6b1e-48a4-ab79-725b469da960
M365_COPILOT_TEAMS                   Microsoft 365 Copilot in Microsoft Teams               b95945de-b3bd-46db-8437-f2beb6ea2347
M365_COPILOT_APPS                    Microsoft 365 Copilot in Productivity Apps (Office)    a62f8878-de10-42f3-b68f-6149a25ceb97
M365_COPILOT_CONNECTORS              Power Platform Connectors in Microsoft 365 Copilot     89f1c4c8-0878-40f7-804d-869c9128ab5d
M365_COPILOT_SHAREPOINT              Microsoft 365 Copilot in SharePoint                    0aedf20c-091d-420b-aadf-30c042609612
COPILOT_STUDIO_IN_COPILOT_FOR_M365   Copilot Studio                                         fe6c28b3-d468-44ea-bbd0-a10a5167435c

Table 1: Copilot for Microsoft 365 Service Plans

What the Copilot for Microsoft 365 Service Plans Do

The Copilot service plans split into those governing user-facing features and background or administrative functionality.

User functionality:

  • Microsoft Copilot with Graph-grounded chat
  • Microsoft 365 Copilot in Microsoft Teams (app, summarization of chats and meeting discussions, ability to rewrite/adjust messages before posting to chats or channel conversations)
  • Microsoft 365 Copilot in Productivity Apps (Word, Excel, PowerPoint, Outlook (Win32 and Monarch), Loop, OneNote)

Teams and the productivity apps support Copilot in the desktop, browser, and mobile platforms.

Background and administrative functionality covers the remaining service plans from Table 1:

  • Graph Connectors in Microsoft 365 Copilot
  • Power Platform Connectors in Microsoft 365 Copilot
  • Intelligent Search (Semantic Index)
  • Microsoft 365 Copilot in SharePoint
  • Copilot Studio (Figure 2)

Copilot Studio.
Figure 2: Copilot Studio

Turning Off Bits of Copilot

Getting back to the original question, control is available over the chat app, Copilot in Teams, and the generalized bucket of productivity apps. For example, you cannot turn off Copilot for Word and Excel and have it available in PowerPoint and Outlook. The productivity apps are either enabled or disabled for Copilot. Granular control isn’t available.

Copilot for Office depends on the Microsoft 365 enterprise apps (the subscription version of Office). Using another version, like Office 2024 (preview available now), isn’t possible because these apps don’t include the necessary UI and code to communicate with Copilot.

The answer to the question is that you can turn bits of Copilot for Microsoft 365 off. For instance, not everyone needs access to Copilot Studio. I’m not sure that I would disable any of the other service plans for background and administrative activity because you don’t know if the action might affect how the user-facing apps work. Disabling a user app certainly works and the license change will be effective within fifteen minutes for browser-based apps (Figure 3) and a few hours for desktop apps, depending on when the app refreshes its license information.

Microsoft Copilot chat discovers that it doesn't have a license.
Figure 3: Microsoft Copilot chat discovers that it doesn’t have a license

But if an organization is paying $360/year for Copilot for Microsoft 365 licenses, surely the imperative is to extract maximum value from the investment instead of restricting what people can use? Still, if you do decide to disable service plans from the Copilot for Microsoft 365 license, a script will happily do the job for you, as sketched below.
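As an illustration, here’s a minimal sketch of what such a script might do, disabling the Copilot Studio service plan for a single account with the Microsoft Graph PowerShell SDK. The UPN is a placeholder; the SKU and service plan identifiers come from Table 1.

# Sketch: disable the Copilot Studio service plan for one user
Connect-MgGraph -Scopes User.ReadWrite.All -NoWelcome
$User = "Jane.Smith@contoso.com"                          # placeholder UPN
$CopilotSkuId = "639dec6b-bb19-468b-871c-c5c441c4b0cb"    # Copilot for Microsoft 365 SKU
$StudioPlanId = "fe6c28b3-d468-44ea-bbd0-a10a5167435c"    # COPILOT_STUDIO_IN_COPILOT_FOR_M365
# Preserve any service plans already disabled in the user's Copilot license
$License = Get-MgUserLicenseDetail -UserId $User | Where-Object SkuId -eq $CopilotSkuId
[array]$DisabledPlans = $License.ServicePlans | Where-Object ProvisioningStatus -eq "Disabled" | Select-Object -ExpandProperty ServicePlanId
$DisabledPlans += $StudioPlanId
Set-MgUserLicense -UserId $User -AddLicenses @(@{SkuId = $CopilotSkuId; DisabledPlans = $DisabledPlans}) -RemoveLicenses @()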


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft Grounds Copilot Apps with Graph and Web Content – https://office365itpros.com/2024/03/25/copilot-for-microsoft-365-grounding/ – Mon, 25 Mar 2024

Office Apps Get Better Grounding in Copilot for Microsoft 365

Message center notification MC734281 (12 March 2024) might have passed by without too much attention unless you’re particularly interested in Copilot for Microsoft 365. The notification informs tenants that Word, Excel, PowerPoint, and OneNote will ground user prompts by reference to enterprise data and the web. As Microsoft notes, this is like what happens when users interact with Copilot for Microsoft 365 chat.

Grounding against enterprise data means that when Copilot responds to user prompts, it will seek additional context by attempting to find relevant information in Microsoft 365 repositories using Graph requests. Web grounding means that Copilot will use Bing search to find relevant information from sites within and outside the enterprise. The fact that major apps will start to use grounded requests from April 2024 might come as a surprise. After all, Microsoft has long cited Copilot’s ability to use the “abundance of data” stored in Microsoft 365 as a major advantage of Copilot for Microsoft 365 over other AI tools that don’t have access to Microsoft 365 repositories.

The rollout starts with Word (Windows and Online) and progresses to PowerPoint, Excel, and OneNote. Microsoft expects to complete the deployment by September 2024.

The Importance of Grounding

Microsoft explains that grounding is “the process of using large language models (LLMs) with information that is use-case specific, relevant, and not available as part of the LLM’s trained knowledge.” In other words, if you ask Copilot for Microsoft 365 to do something and grounding doesn’t happen, it relies on the user prompt to query the LLM.

Until now, users have been able to ground prompts in apps like Word by including up to three reference documents in the prompt. Let me illustrate the importance of grounding by showing an example of two briefing notes generated by Copilot in Word about the Midnight Blizzard attack against Microsoft in January 2024. Copilot generated the first briefing note without any reference documents. Because it couldn’t search the Graph or web for relevant information, the grounding of the prompt was poor, and Copilot could only use whatever information is in the LLM.

As shown in Figure 1, the generated text included several inaccurate statements (hallucinations), including the remarkable assertion that the attack led to a drop of $400 billion in Microsoft’s market value, together with a declaration that the attack had deprived millions of Microsoft cloud users of access to services.

Briefing note about Midnight Blizzard generated by Copilot for Microsoft 365 (without reference documents).
Figure 1: Briefing note about Midnight Blizzard generated by Copilot for Microsoft 365 (without reference documents)

If some relevant reference documents are included in the prompt, Copilot’s generated text becomes more accurate and balanced (Figure 2).

Briefing note about Midnight Blizzard generated by Copilot for Word with reference material.
Figure 2: Briefing note about Midnight Blizzard generated by Copilot for Word with reference material

The important point here is that after Microsoft updates Copilot to allow the Office apps to ground prompts using Graph and web material, the chances of Copilot generating absolute rubbish lessen considerably. That is, if Copilot can find relevant information through its searches. Adding reference documents to prompts in Copilot for Word will generate even better results because the reference documents should give Copilot a more precise context to work with.

Microsoft says that Graph grounding is enabled for all user prompts and that Copilot requests will use “the file context” (whatever file is open at the time) plus web searches as well. Copilot for Microsoft 365 chat uses Graph and web lookups today.

The Quality of AI-Generated Text

In some respects, I was shocked that it has taken so long for Microsoft to ground Copilot requests in these important apps. Copilot for Microsoft 365 is evolving rapidly, but the ability to generate high-quality text at general availability seems like an essential rather than a nice-to-have feature. I’ve always been suspicious about the quality of the text generated by Word, and this revelation certainly explains a lot.

Take Your Time

The advice of Directions on Microsoft analyst Wes Miller that organizations should pace themselves and understand exactly what they are buying before they invest in expensive Copilot licenses is accurate. Things are changing, and the hyperbole around Copilot is like a dust storm that obscures detail. Why rush in where angels fear to tread?

Before making your mind up about Copilot, take the time to read the article posted by MVP Joe Stocker where he reports a drop-off of Copilot activity after the novelty effect of asking the AI to perform tasks wears off. Although the sample size was small, this emphasizes the need to support users on their Copilot journey, especially as important new functionality like Graph and web grounding appears.

And if you attend the Microsoft 365 Conference in Orlando at the end of April, make sure that you come to my session about not letting Copilot for Microsoft 365 become a vanity project. You might even enjoy what I have to say!


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem, including in Copilot. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Can Copilot for Microsoft 365 Save Users 14 Hours a Month? – https://office365itpros.com/2024/03/12/copilot-for-microsoft-365-14hrs/ – Tue, 12 Mar 2024

It All Depends on the Person and How They Use Office

Personal perspectives of using technology are often valuable guides to how useful products will be in production. Given the current hype around Copilot for Microsoft 365, I was interested to read a LinkedIn post by Microsoft employee Luka Perne. Based on his use of Copilot over several months logged on a per-task basis, Perne believes he saves 14 hours per month. That’s quite an impressive number that more than justifies the $30/month Copilot license.

It’s always important to put personal observations in context and ask yourself if a product would work as well for you, especially when reading a report written by someone who works for the vendor. I’m sure that some gain enormously from Copilot for Microsoft 365, just as I’m equally convinced that success with Copilot depends on many individual factors.

Not a Marketing Document

What I liked about this report is that it is not trying to sell Copilot. If you look at Microsoft’s marketing material, Copilot works wonderfully because what you see are carefully selected scenes that show Copilot working with data selected to demonstrate its strengths. This coverage is more practical and informative.

For instance, Perne makes the point that people go through a learning curve as they interact with Copilot. Some progress faster and discover how to extract value quickly. Others struggle with prompts or are unsure how Copilot can help. That’s why it’s important to educate and support users during a Copilot deployment project.

Where Success is Found for Copilot for Microsoft 365

Microsoft employees working in engineering and services roles tend to be more comfortable with new technology than the average Microsoft 365 user. Copilot support for users (informal and formal) is likely better and more comprehensive than elsewhere, and users are motivated to explore the capabilities of the technology, including mastering the technique of constructing effective prompts. Overall, I suspect that a technology like Copilot is adopted more easily inside Microsoft than in customer environments.

Perne says that he’s been working with Copilot for four months. Some will gain the same increase in productivity he reports, but I suspect it will take others many months before they do the same.

As Perne notes, he values specific Copilot features. This matches my own experience, where the summaries generated by Copilot for Teams meetings, Outlook email threads, and documents (Figure 1) are easily the most valuable in terms of time savings. Anyone who has ever worked with Microsoft (especially the corporate teams) can attest to the number of meetings that people attend, and the ability to generate a quality summary based on the meeting transcript is much appreciated, especially when multiple meetings occur at the same time.

Working with Copilot for Microsoft 365 in a Word document.
Figure 1: Working with Copilot for Microsoft 365 in a Word document

Copilot’s ability to create and rewrite text can help people unsure of their writing skills. In my case, I think I do as well in terms of rewriting text by reviewing the suggestions made by Editor or Grammarly. Copilot is good at generating the outline of a document. However, the accuracy of the material Copilot uses to flesh out the outline depends on being able to find relevant information in SharePoint Online or OneDrive for Business. Without something to use, Copilot often strays into made-up text that reads well without being accurate.

Perne generated the graphics in his article with Excel, but notes the limitations Copilot currently has in Excel, like only working for tables with fewer than 10,000 rows. I’m sure this is an area that Microsoft will improve in the future. For now, I agree with the observation that I’ve picked up enough Excel over the years to survive without Copilot for the kind of worksheets I deal with.

The assertion that Copilot always delivered improved results for a non-native English speaker when it came to generating or rewriting text was insightful, and I think fair. Many large organizations have a corporate language in which most communication happens. For Microsoft, that language is English, and I can see how useful Copilot is when asked to rewrite or correct text. The output will be bland, but it will be precise and readable, and that’s important in email and documents.

Can You Track Your Copilot Results?

The net is that many factors influence the ability of Copilot for Microsoft 365 to save time for people. If you’re technically literate, skilled in using Word, PowerPoint, Outlook, and Excel, attend a lot of meetings, and store the material you work with in SharePoint Online and OneDrive for Business, the probability is that you will achieve good results. Whether you save 14 hours per month is another matter. Tracking savings using the same methodology as Perne is certainly one way to assess the outcome, if you’re as good as he was at noting results.


Keep up to date with developments like Copilot for Microsoft 365 by subscribing to the Office 365 for IT Pros eBook. Our monthly updates make sure that our subscribers understand the most important changes happening across Office 365.

Microsoft Kills Viva Topics to Focus on Copilot – https://office365itpros.com/2024/02/23/viva-topics-retirement/ – Fri, 23 Feb 2024

Viva Topics Retirement Propelled by More Lucrative Copilot Opportunity

In a surprise announcement posted in Microsoft 365 message center notification MC718486, Microsoft said that they will retire Viva Topics on February 22, 2025 and will stop new feature development as of February 22, 2024. Originating as part of Project Cortex, Microsoft launched Viva Topics as one of the four modules in its new Viva employee experience platform in February 2021. Support documentation covering the retirement is available online as is a FAQ.

The idea behind Viva Topics is that organizations could leverage their investment in SharePoint Online by creating a curated knowledge network about topics important to the business. Knowledge editors would maintain the topics and link them to sources. Users could consume the information in the knowledge network by inserting topics into the natural flow of communications created in Outlook messages, Teams chats and channel conversations (Figure 1), or SharePoint documents. The latest development was to expose topics in the Microsoft 365 user profile card.

Viva Topics in a Teams channel conversation.
Figure 1: Viva Topics in a Teams channel conversation

There’s some great technology in Viva Topics. Alas, great technology doesn’t always survive in the acid test of the market. Some Microsoft 365 tenants use Topics, but I don’t see any evidence of a major groundswell of projects. The level of discussion about Topics is low in online forums and it’s not a subject for sessions submitted to major Microsoft 365 conferences. Although hardly a test that could be stood over, it is undeniable that potential speakers submit sessions for technology that interests them or that they work on. I cannot recall seeing a submission for a Viva Topics session in the last year.

Knowledge Management is Hard

Knowledge management is hard. Anyone who set up and managed a knowledge network for Viva Topics will appreciate that the AI-powered harvesting of topics from content stored in SharePoint Online can generate hundreds or thousands of topics to curate, refine, and publish, all of which takes time. The work of the knowledge managers might not be appreciated by end users, or even recognized if end users don’t receive education about how to use Topics.

Even though they announced lightweight management for Topics through Viva Engage in July 2023 and Copilot in Viva Topics in April 2023, the benefit of hindsight shows that Microsoft’s heart had been snatched by Copilot and the clarion call to development groups to create Copilot-branded experiences.

Copilot Wins the Game and Forces the Viva Topics Retirement

Apart from being swept along by the Copilot wave, I think hard business logic is a major driving factor behind Microsoft’s decision to retire Viva Topics. Copilot for Microsoft 365 brings in $30/user/month plus the opportunity to upsell customers to more expensive Office 365 or Microsoft 365 licenses. Microsoft’s pricing for Viva Topics varied over the years. According to Copilot, a Viva Topics license brings in $4/user/month (Figure 2).

Copilot figures out the cost of Viva Topics licenses.
Figure 2: Copilot figures out the cost of Viva Topics licenses

Even when included in the Viva Communications and Community license, Topics cannot contribute anywhere close to the revenue that Copilot will likely deliver over the next five years. In addition, Viva Topics is usually a much harder project to sell, and its implementation lacks the excitement and glamor currently associated with Copilot. I mean, topic refinement compared to AI-generated email and documents?

Looking at the situation through the business lens, it makes absolute sense for Microsoft to retire Viva Topics and realign the engineering resources from that program to work on other AI-related projects, such as the “new AI-powered knowledge management experiences” promised in the announcement.

Third Time Lucky

Microsoft’s record in knowledge management is not stellar. The next-generation knowledge portals promised at Ignite 2015 vanished as soon as the attendees left Chicago and its infamous baloney conference lunches behind. Now Viva Topics is being retired. Microsoft has put all its knowledge management eggs in the Copilot basket. Let’s hope that the next round of knowledge applications powered by Copilot demonstrate once again that Microsoft has the habit of getting things right third time around.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes to understand why the Viva Topics retirement happened. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering the Microsoft 365 ecosystem.

Stopping Copilot Access to SharePoint Online Sites and Document Libraries – https://office365itpros.com/2024/02/21/exclude-sharepoint-site-from-copilot/ – Wed, 21 Feb 2024

Exclude SharePoint Site from Copilot by Blocking Search Indexing

One of the fundamental concepts underpinning Copilot for Microsoft 365 is the use of Graph queries to find information stored in Microsoft 365 to help ground user prompts. Grounding is the process of providing additional context to make it easier for Copilot to return high-quality responses to user prompts. For instance, if someone asks Copilot to write a briefing note about Office 365, Copilot first queries Microsoft 365 repositories like SharePoint Online to discover what information the user already has about the topic. Optionally, if allowed by the tenant, Copilot can query the web to find additional information.

After gathering information, Copilot refines the prompt and sends it to the Large Language Model (LLM) for processing. Eventually, possibly after further refinement, Copilot returns the response to the user.

Copilot Access to Content Stored in Microsoft 365 Repositories

One of the things you quickly learn about Copilot for Microsoft 365 is that the quality and reliability of generated text is highly dependent on the availability of information. For instance, Copilot is very good at summarizing Teams meetings because it has the meeting transcript to process. However, if you ask Copilot to draft text about a topic where it cannot find anything in Microsoft 365 to ground the prompt, Copilot will certainly generate a response, but the text might not be as useful as you expect. The output will certainly follow the requested format (a report, for instance), but the content may surprise because it is likely to come from a web search that might or might not retrieve useful information.

Users can guide Copilot for Word by providing up to three reference documents. In effect, the user instructs Copilot that it should use the reference documents to ground the prompt. This works well, unless the documents you want to use are large (I am told that Microsoft is increasing the maximum supported size for reference documents).

All of this means that anyone contemplating a deployment of Copilot for Microsoft 365 should store information within Microsoft 365 to create what Microsoft calls an “abundance of data” for Copilot to consume. SharePoint Online and OneDrive for Business are prime repositories, but it’s possible that some SharePoint Online sites contain confidential or other information that the organization doesn’t want Copilot to consume.

Remember, Copilot can only use information that the signed-in account using Copilot can access. An account that has access to a site holding confidential information could find that Copilot retrieves and uses that information in its responses. The user is responsible for checking the text generated by Copilot, but accidents do happen, especially when time is short to get a document out.

Preventing Copilot Access to Sensitive Information

Two methods help to avoid accidental disclosure of confidential information. First, you can protect files with sensitivity labels. If Copilot consumes protected documents, it applies the same sensitivity label to the output.

However, not every organization uses sensitivity labels. In this situation, an organization can decide to exclude selected SharePoint Sites from indexing (Figure 1) by both Microsoft Search and the semantic index. If content is not indexed, it can’t be found by queries and therefore cannot be consumed by Copilot.

Configuring a SharePoint site to exclude it from search results.
Figure 1: Exclude SharePoint Site from Copilot Access by Stopping it Appearing in Search Results

But what happens if you have a SharePoint site with several document libraries and want to make the content available from some libraries and not others? The answer is the same, except that the exclusion from search results is applied through the document library’s advanced settings (Figure 2).

Settings for a document library.
Figure 2: Settings for a document library

The downside of excluding sites or libraries from search results is that people can’t use SharePoint search to find documents.
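For admins who prefer scripting, the site and library toggles correspond to the NoCrawl property in the SharePoint client object model, so something along these lines with PnP PowerShell should work. This is a rough sketch; the site URL and library name are placeholders.

# Sketch: exclude a site and a document library from search indexing
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/Confidential" -Interactive
$Web = Get-PnPWeb
$Web.NoCrawl = $true          # stop the site appearing in search results
$Web.Update()
Invoke-PnPQuery
$List = Get-PnPList -Identity "Documents" -Includes NoCrawl
$List.NoCrawl = $true         # exclude a single document library
$List.Update()
Invoke-PnPQuery

Setting NoCrawl back to $false should allow the content to be reindexed at the next crawl.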

Testing Excluded Sites and Document Libraries

How do you know site and document library exclusions work? The easiest way is to create a document with an unusual phrase in the excluded site or library and then attempt to use it with Copilot for Word. I created a document about ‘Project Derrigimlagh’ and included the phrase ‘wicked worms’ several times in the content. I then created a new Word document and added the document from the excluded library as a reference (Figure 3).

Selecting a reference file for Copilot for Word
Figure 3: Selecting a reference file for Copilot for Word

You might ask why the document can be added as a reference. The dialog shows recent documents, and the document is in this category, so it shows up. However, when Copilot attempts to consume the document, it cannot access the content. The result is that the prompt cannot be grounded and Copilot flags this as a failure to generate high-quality content (Figure 4). This is a general-purpose error that Copilot issues anytime it believes that it cannot respond to a prompt.

Copilot for Word can't generate high-quality content
Figure 4: Copilot for Word can’t generate high-quality content

Interestingly, when I removed the reference document and reran the prompt, Copilot generated text explaining the potential use of wicked worms as a biofuel source. This is emphatically not the content stored in the excluded document library. The information about Derrigimlagh came from the internet, and making wicked worms into a biofuel source is probably due to published material about using worms in a biorefinery. In any case, it’s a good example of how AI-based text generation needs to be treated with caution.

Use Sensitivity Labels If Possible

If an organization has implemented sensitivity labels, I think this is a better method to protect confidential material, if only because of the persistence of labels to generated documents. You can also define a default sensitivity label for a document library to make sure that everything stored in the library is protected and use auto-label policies to find and protect confidential material stored across all sites.

In a nutshell, sensitivity labels are more flexible and powerful, but it’s nice to have the backup of being able to exclude complete sites and individual document libraries. Just another thing to consider in a Copilot deployment!


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.
