Microsoft 365 Copilot – Office 365 for IT Pros (https://office365itpros.com)

Copilot Agent Governance Product Launched by ISV (27 June 2025)
https://office365itpros.com/2025/06/27/agent-governance-rencore/

Microsoft Leaves Gaps in Technologies for ISVs to Fill – Like Agent Governance

Every time Microsoft makes a big move, ISVs seek to take advantage with a new product. It’s the way of the world. Microsoft creates technology and ISVs fill the holes left in that technology. In some respects, the cloud is a difficult place for ISVs. There’s less to tweak than in an on-premises environment, and although the Graph APIs have extended their coverage to more areas of Microsoft 365 over the last few years, significant gaps still exist for major workloads like Exchange Online and SharePoint Online.

But a new technology creates a new opportunity because everything starts from scratch. Microsoft’s big move into artificial intelligence with Copilot hasn’t created too many opportunities because Copilot depends on a massive infrastructure operated by Microsoft that’s inaccessible except through applications like BizChat. Agents are different. They’re objects that need to be managed. They consume resources that need to be paid for. They represent potential security and compliance problems that require mitigation. In short, agents represent a chance for ISVs to build products to solve customer problems as Microsoft heads full tilt to its agentic future.

Building an Infrastructure for Agent Governance

To be fair to Microsoft, they’ve started to build an infrastructure for agent management. Apart from a whitepaper about managing and governing agents, the first concrete sign is the introduction of agent objects in Entra ID. Microsoft is thinking about how agents can work together, and how that communication can be controlled and monitored. That’s all great stuff and it will deliver benefits in the future, but the immediate risk is the fear that agents might run amok inside Microsoft 365 tenants.

Microsoft reports that there are 56 million monthly active users of Power Platform, or 13% of the 430 million paid Microsoft 365 seats. That’s a lot of citizen developers who could create agents using tools like Copilot Studio. Unless tenant administrators disable ad-hoc email subscriptions for the tenant, developers could be building agents without anyone’s knowledge.

Don’t get me wrong. I see great advantages in agent technology and have even built agents myself, notably a very useful agent to interact with the Office 365 for IT Pros eBook. One thing that we’ve learned over the last 30 years is that when users are allowed to create, they will. They’ll create objects without much thought, and those objects eventually need to be cleaned up. As Microsoft discovered, the mass of SharePoint Online sites created for Teams became a real problem for Microsoft 365 Copilot deployments. Incorporating solid management and governance from the start is of great benefit for new technologies.

Rencore Steps Up with Copilot Agent Governance

All of which brings me to Rencore’s announcement of two new modules for their governance product to deal with Copilot and agent governance and Power Platform governance (Figure 1). Matthias Einig, Rencore’s CEO, has been forceful about the need to take control of these areas and it’s good to see that he’s investing in product development to help Microsoft 365 tenants take control before agents get any chance to become a problem.

Figure 1: Rencore Agent Governance (source: Rencore)

I have not used the Rencore product and do not endorse it. I just think that it’s great to see an ISV move into this area with purpose and intent. It seems like Rencore aims to address some major pain points, like shadow IT, the cost of running Copilot agents, over-sharing, and “agent sprawl.” All good stuff.

I’m sure other ISVs will enter this space (and there might be some active in the area already that I don’t know of). This will be an interesting area to track as ISVs seek new ways to mitigate the potential risks posed by agents.

No Time to Relax

A product from one ISV does not mean that we can all relax and conclude that agent management is done. It’s not. The continuing huge investment by Microsoft in this space means that agent capabilities will improve and grow over time. Each improvement and new feature has the potential to affect governance and compliance strategies. Don’t let your guard down and make sure that your tenant has agents under control. And keep them that way.


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365.

Outlook’s New Summarize Option for Email Attachments (23 June 2025)
https://office365itpros.com/2025/06/23/summarize-attachment-outlook/

Summarize Attachment Feature is an Example of New Features Needed to Maintain Customer Interest

Introducing a new technology is hard. The great expectations created at the initial launch soon meet the hard reality of deployment, and things don’t get better until the technology has had time to bake. This is as true for Microsoft 365 Copilot as for any other major technology. I see people questioning whether the $30/user/month really delivers any benefits, with real concern over whether people spend any of the time purportedly saved by Copilot interventions doing anything more valuable than drinking more coffee.

News that the U.S. Better Business Bureau forced Microsoft to change some of the claims it makes about how Microsoft 365 Copilot affects user productivity doesn’t help the case for AI-based assistance. And lukewarm or mildly enthusiastic (but independent) reports about Copilot usage in organizations, like the recent UK Government report based on a 3-month trial for 20,000 employees, don’t bolster the case much either.

All Microsoft can do is continue to push out updates and new AI-based features to keep customer interest while Copilot matures to become more useful in day-to-day activities. The result is a flood of new Copilot-related features, not all of which seem valuable except in specific cases. I don’t know whether AI-informed People Skills will become popular (some HR professionals that I know like People Skills a lot). Those in the Power Platform world (now with 56 million monthly active users according to data made available at Microsoft’s FY25 Q3 results) see lots of changes to make Copilot agents more productive. I do like the ability to upload documents to agents for the agents to reason over.

Summarizing Attachments

All of which brings me to the update described in message center notification MC1073094 (13 May 2025, Microsoft 365 Roadmap item 475249). It’s an example of a recent Copilot enhancement to help users process “classic” email attachments faster. Even though cloudy attachments are preferable in many respects, many people still send files instead of links.

Copilot has been able to summarize cloudy attachments for email for quite a while. Now, when a message with one or more classic file attachments arrives, users with a Microsoft 365 Copilot license see a new summarize option for Office and PDF attachments. The feature is available in the New Outlook for Windows, OWA, Outlook mobile, and Outlook for Mac, but not for Outlook classic. Microsoft is rolling out the update now with estimated completion by late August 2025.

Figure 1 shows the general idea. A Word file is attached to a message. Clicking the summarize option from the drop-down menu beside the attachment causes Copilot to create and display the summary for the file inside the Summary by Copilot panel (or card). If a message has multiple file attachments, the summarize option must be invoked separately for each attachment.

Figure 1: The summarize option for a file attachment for a message opened in OWA

Copilot cannot process encrypted attachments (whether protected by sensitivity labels or another encryption mechanism).

No Archived Messages

My archive mailbox is full of attachments from long-forgotten projects, including files related to some legal cases that I was involved with. I was curious to see what sense Copilot might extract from some of the PDFs and Word documents from those cases. Oddly, Outlook won’t summarize any of the attachments for messages stored in an archive mailbox. To generate a summary for these files, you must download and open Office files in a desktop or web app and use the Copilot options available in the app.

Thinking about why this might be so, I guess the logic is that attachments for archived messages probably aren’t of very high interest, and if someone goes to the trouble of finding an archived message, they have a purpose for doing so and won’t mind opening attachments to view content. On the other hand, I could be overthinking things and Microsoft simply designed the feature to work only with messages from the primary mailbox.

The Value of Small Changes

Over my many years of work, I cannot say how many emails I have received with file attachments. Being able to see a quick summary of an attachment is a good example of how AI can be effective. The feature works well because the AI has just one file to process, so it’s unlikely that hallucinations or other issues will occur. You might disagree with points made in the summary, but having the summary is a timesaver and a great starting point for understanding whether a file contains anything important.

Another example of a small but interesting change is the ability to create a meeting from an Outlook email thread (MC1090693, 9 June 2025, Microsoft 365 roadmap item 494154). The idea is that Copilot scans an email thread to determine the topic for a meeting and its participants and creates a meeting invitation ready to go. This kind of thing doesn’t need AI because existing Graph APIs can do the work, but Copilot integrates the work into a new Schedule with Copilot option (only for email threads with sufficient data to base a meeting upon). According to the roadmap item, this feature is for the mobile clients, but I bet it will be available in the new Outlook and OWA too.

In the overall scheme of Copilot, delivering Outlook features to make small tasks easier is not important. However, changes that reduce friction for users are important and collectively a bunch of changes like this might just be enough to convince an organization that they really can’t live without Copilot.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Microsoft Launches the Copilot Interaction Export API (30 May 2025)
https://office365itpros.com/2025/05/30/aiinteractionhistory-api/

aiInteractionHistory Graph API Available in June 2025

Microsoft 365 message center notification MC1080682 (22 May 2025, Microsoft 365 Roadmap item 491631) announces that the new Microsoft 365 Copilot Interaction Export API (aka, the aiInteractionHistory API) will roll out in June 2025. This is the same API that I covered in a Practical365.com article last December and the documentation still says that the API is available through the Graph beta endpoint. Perhaps the intention is to move the API to the V1.0 (production) endpoint when it’s officially released.

I don’t see much change in how the API works or the retrieved data since I last looked at it. A welcome change is that it is now possible to fetch a maximum of 100 records per request rather than ten. Fetching ten interaction records at a time made the API very slow. Although faster than before, the API is still slow, especially for an API designed to allow third-party apps and ISVs “to export Copilot user interaction data for processing in their security and compliance (S+C) applications.”

Other audit APIs support fetching up to a thousand records at a time. Maybe a V1.0 version of the API will support a higher value. Details of how the API works and an example script can be found in the original article.

Licenses and Permissions

The AiEnterpriseInteraction.Read.All Graph permission needed to access interaction data is not available as a delegated permission, meaning that the only way to access the data is through an app (including app-only Microsoft Graph PowerShell SDK sessions). Weirdly, accounts used to run apps using the API to fetch interaction records must have a Microsoft 365 Copilot license.
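
As a quick illustration of what an app-only session looks like, here’s a minimal sketch that pages through interaction records 100 at a time. The endpoint path matches the beta documentation at the time of writing, and the tenant, app ID, certificate thumbprint, and user ID are placeholder values that you’d replace with your own, so treat this as a starting point rather than production code.

# Minimal sketch: app-only Graph PowerShell SDK session fetching aiInteractionHistory records
# Placeholder values for tenant, app ID, certificate thumbprint, and target user ID
Connect-MgGraph -TenantId "contoso.onmicrosoft.com" -ClientId "00000000-0000-0000-0000-000000000000" `
  -CertificateThumbprint "ABCDEF1234567890ABCDEF1234567890ABCDEF12"

$UserId = "8ff6c03e-f60d-4b52-8a54-1e2d3d6a7b89"   # account to query (must hold a Microsoft 365 Copilot license)
$Uri = "https://graph.microsoft.com/beta/copilot/users/$UserId/interactionHistory/getAllEnterpriseInteractions?`$top=100"
[array]$Records = @()
do {
    $Response = Invoke-MgGraphRequest -Method GET -Uri $Uri
    $Records += $Response.value
    $Uri = $Response.'@odata.nextLink'   # keep paging until no link is returned
} while ($Uri)
Write-Host ("Fetched {0} interaction records" -f $Records.Count)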

What the aiInteractionHistory API Captures

According to Microsoft, the API “captures the user prompts and Copilot responses in Copilot private interactions chat and provides insights into the resources Copilot has accessed to generate the response.” This statement does not mean that the data lays bare the details of Copilot interactions. Some of the information needs to be mined and interpreted to make sense. For instance, here are the details of an interaction record:

Name                           Value
----                           -----
locale                         en-us
body                           {[content, [AutoGenerated]undefined<attachment id="fd3a9044-309c-4ec9-a568-676f1d521f24"></attachment><attachment id="01TAGX3U2ESA5P3HBQZFHKK2DHN…
from                           {[@odata.type, #microsoft.graph.chatMessageFromIdentitySet], [user, System.Collections.Hashtable], [application, ], [device, ]}
appClass                       IPM.SkypeTeams.Message.Copilot.Word
attachments                    {02 Managing Identities.docx, unknown-file-name}
contexts                       {02 Managing Identities.docx, unknown-file-name}
createdDateTime                25/04/2025 09:27:05
conversationType               appchat
interactionType                userPrompt
mentions                       {}
links                          {}
sessionId                      19:t67NyrXsxDyC8qGGCtSQZYjC3TV1lYvq3IkjzpXquUc1@thread.v2
id                             1745573225046
requestId                      GTbr3lBouCMpcP7L1qVv8Q.20.1.1.1.4
etag                           1745573225046

The appClass property tells us what Copilot app the interaction is for. In this case, it’s Copilot for Word. The attachments property tells us if any reference files are used. One is mentioned here, and given that the body property mentions AutoGenerated, we can conclude that this interaction occurred when Copilot for Word generated an automatic summary for a document.

The interactionType tells us that this record is for a user prompt. Responses from Copilot have aiResponse in the interactionType property. User prompts that aren’t for automatic summaries have the text of the prompt in the body property. For example:

Name                           Value
----                           -----
content                        What functionality isn't available with a Microsoft 365 retention policy
contentType                    text

aiInteractionHistory API requests require the identifier for a user account, and the response contains the records for that user. Details of the user are in the from property, but you’ll have to navigate to from.user.id to see the identifier for the user. A DisplayName property is available in the from structure but doesn’t hold the display name of the user.

Assuming that a third-party application wanted to retrieve the aiInteractionHistory records and process them for its own purposes, it’s obvious from this brief discussion that the application has some work to do to interpret the raw data and make it useful for compliance investigations or other purposes. The script published with the December article referenced above shows how to approach the task, which is similar to parsing audit records to extract useful content. Figure 1 shows the kind of data that can be extracted from the aiInteractionHistory API records.

Figure 1: Data extracted using the aiInteractionHistory API
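
To give a flavor of that interpretation work, here’s a minimal sketch that flattens raw records into reportable objects. It assumes the records are already in an array called $Records (for example, fetched by the kind of request shown earlier) and that each record follows the property layout shown above; the attachments collection may need further unpacking depending on its structure.

# Minimal sketch: flatten raw aiInteractionHistory records for reporting
# Assumes $Records holds records with the layout shown in the example above
$Report = foreach ($Record in $Records) {
    [PSCustomObject]@{
        Timestamp   = $Record.createdDateTime
        UserId      = $Record.from.user.id              # display name must be resolved separately
        App         = $Record.appClass                  # e.g., IPM.SkypeTeams.Message.Copilot.Word
        Type        = $Record.interactionType           # userPrompt or aiResponse
        Prompt      = $Record.body.content
        Attachments = ($Record.attachments -join ", ")  # may need further unpacking
    }
}
$Report | Sort-Object Timestamp | Format-Table Timestamp, App, Type, Prompt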

The Many Sources of Information About Copilot Interactions

It’s hard to know how useful the aiInteractionHistory API will turn out to be. Other sources of information can be mined to discover how people use Copilot, including usage data, audit records, and the compliance records held in user mailboxes. I guess it all depends on what you’re looking for.


Why Copilot Access to “Restricted” Passwords Isn’t as Big an Issue as Uploading Files to ChatGPT (20 May 2025)
https://office365itpros.com/2025/05/20/microsoft-365-copilot-pen-test2/

Unless You Consider Excel Passwords to be Real Passwords

I see that some web sites have picked up the penetration test story about using Microsoft 365 Copilot to extract sensitive information from SharePoint. The May 14 Forbes.com story is an example. The headline of “New Warning — Microsoft Copilot AI Can Access Restricted Passwords” is highly misleading.


Unfortunately, tech journalists and others can rush to comment without thinking an issue through, and that’s what I fear has happened in many of the remarks I see in places like LinkedIn discussions. People assume that a much greater problem exists when, if they only thought things through, they’d see the holes in the case being presented.

Understanding the Assumptions made by the Penetration Test

As I pointed out in a May 12 article, the penetration test was interesting (and did demonstrate just how weak Excel passwords are). However, the story depends on three major assumptions:

  • Compromise: The attacker has control of an Entra ID account with a Microsoft 365 Copilot license. In other words, the target tenant is compromised. In terms of closing off holes for attackers to exploit, preventing access is the biggest problem in the scenario. All user accounts should be protected with strong multifactor authentication like the Microsoft Authenticator app, passkeys, or FIDO2 keys. SMS is not sufficient, and basic authentication (just passwords) is madness.
  • Poor tenant management: Once inside a tenant and using a compromised account, Microsoft 365 Copilot will do what the attacker asks it to do, including finding sensitive information like a file containing passwords. However, Copilot cannot find information that is unavailable to the signed-in user. If the tenant’s SharePoint Online deployment is badly managed without well-planned and well-managed access controls, then Copilot will happily find anything that the user’s access allows it to uncover. This is not a problem for Copilot: it is a failure of tenant management that builds on the first failure to protect user accounts appropriately.
  • Failure to deploy available tools: Even in the best-managed SharePoint Online deployment, users can make mistakes when configuring access. Users can also follow poor practice, such as storing important files in OneDrive for Business rather than SharePoint Online. But tenants with Microsoft 365 Copilot licenses can mitigate user error with tools available to them such as Restricted Content Discovery (RCD) and the DLP policy for Microsoft 365 Copilot. The latter requires the tenant to deploy sensitivity labels too, but that’s part of the effort required to protect confidential and sensitive information.

I’m sure any attacker would love to find an easily-compromised tenant where they can gain control over accounts that have access to both badly managed SharePoint Online sites that hold sensitive information and Microsoft 365 Copilot to help the attackers find that information. Badly-managed and easily-compromised Microsoft 365 tenants do exist, but it is my earnest hope that companies who invest in Microsoft 365 Copilot have the common sense to manage their tenants properly.

Uploading SharePoint and OneDrive Files to ChatGPT

Personally speaking, I’m much more concerned about users uploading sensitive or confidential information to OpenAI for ChatGPT to process. The latest advice from OpenAI describes how the process works for their Deep Research product. Users might like this feature because they can have their documents processed by AI. However, tenant administrators and anyone concerned with security or compliance might have a different perspective.

I covered the topic of uploading SharePoint and OneDrive files to ChatGPT on March 26 and explained that the process depends on an enterprise Entra ID app (with app id e0476654-c1d5-430b-ab80-70cbd947616a) to gain access to user files. Deep Research is different and its connector for SharePoint and OneDrive is in preview, but the basic principle is the same: a Graph-based app uploads files for ChatGPT to process. If that app is blocked (see my article to find out how) or denied access to the Graph permission needed to access files, the upload process doesn’t work.
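
For anyone who wants a starting point, here’s a minimal sketch of one way to block the app with the Microsoft Graph PowerShell SDK: find the service principal by the app ID quoted above and disable sign-in for it. The March 26 article covers the full set of options, and a conditional access policy or consent controls are alternative approaches.

# Minimal sketch: block sign-in for the enterprise app used to upload SharePoint/OneDrive files to ChatGPT
# App ID taken from the text above; requires an account or app with the Application.ReadWrite.All permission
Connect-MgGraph -Scopes "Application.ReadWrite.All"
$Sp = Get-MgServicePrincipal -Filter "appId eq 'e0476654-c1d5-430b-ab80-70cbd947616a'"
if ($Sp) {
    Update-MgServicePrincipal -ServicePrincipalId $Sp.Id -AccountEnabled:$false
    Write-Host ("Blocked sign-in for {0}" -f $Sp.DisplayName)
} else {
    Write-Host "No service principal found for the app in this tenant (nothing to block yet)"
}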

Set Your Priorities

I suggest that it’s more important to block uploading of files from a tenant to a third-party AI service where you don’t know how the files are managed or retained. It certainly seems like a more pressing need than worrying about the potential of an attacker using Microsoft 365 Copilot to run riot over SharePoint, even if a penetration test company says that this can happen (purely as a public service, and not at all to publicize their company).

At least, that’s assuming user accounts are protected with strong multifactor authentication…


How to Enhance Copilot Usage Data (9 May 2025)
https://office365itpros.com/2025/05/09/copilot-usage-data-accounts/

Combine Copilot Usage Data with User Account Details to Gain Better Insight for Deployments

Discussing the usage data that’s available for Microsoft 365 Copilot (in the Microsoft 365 admin center and via a Graph API), a colleague remarked that it would be much easier to leverage the usage data if it contained the department and job title for each user. The usage data available for any workload is sparse and needs to be enhanced to be more useful.

Knowing what data sources exist within Microsoft 365 and how to combine sources with PowerShell or whatever other method you choose is becoming a valuable skill for tenant administrators. I’ve been down this path before to discuss combining usage data with audit data to figure out which accounts aren’t using expensive Copilot licenses. Another example is combining Entra ID account information with MFA registration methods to generate a comprehensive view of user authentication settings.

Scripting a Solution

In this instance, the solution is very straightforward. Use a Graph API call (complete with pagination) to download the latest Copilot usage data, find the set of user accounts with a Microsoft 365 Copilot license, and loop through the set to match each account with its usage data. Report what’s found (Figure 1).

Figure 1: Copilot usage data combined with user account details
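
For those who want to see the shape of the approach before downloading the full script, here’s a compressed sketch. The report endpoint, the Copilot SKU identifier, and the property names in the usage records are my assumptions based on the beta documentation and report output, so verify them (Get-MgSubscribedSku lists the SKUs in your tenant) rather than copying them blindly.

# Compressed sketch of the approach; the full script is in the GitHub repository
Connect-MgGraph -Scopes "Reports.Read.All","User.Read.All"

# 1. Download the Copilot usage detail for the last 90 days (beta endpoint, paged via @odata.nextLink)
$Uri = "https://graph.microsoft.com/beta/reports/getMicrosoft365CopilotUsageUserDetail(period='D90')"
[array]$UsageData = @()
do {
    $Data = Invoke-MgGraphRequest -Method GET -Uri $Uri
    $UsageData += $Data.value
    $Uri = $Data.'@odata.nextLink'
} while ($Uri)

# 2. Find accounts holding a Microsoft 365 Copilot license (SKU ID assumed - check Get-MgSubscribedSku)
$CopilotSkuId = "639dec6b-bb19-468b-871c-c5c441c4b0cb"
[array]$CopilotUsers = Get-MgUser -All -Property Id, UserPrincipalName, DisplayName, Department, JobTitle `
  -Filter "assignedLicenses/any(x:x/skuId eq $CopilotSkuId)" -ConsistencyLevel eventual -CountVariable Count

# 3. Match each licensed account with its usage record
$Report = foreach ($User in $CopilotUsers) {
    $Usage = $UsageData | Where-Object { $_.userPrincipalName -eq $User.UserPrincipalName }
    [PSCustomObject]@{
        User             = $User.DisplayName
        Department       = $User.Department
        JobTitle         = $User.JobTitle
        LastActivityDate = $Usage.lastActivityDate
    }
}
$Report | Sort-Object Department | Format-Table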

Obfuscated Data and Graph Reports

The thing that most people trip over is matching usage data with user accounts. This is impossible if your tenant obfuscates (anonymizes) usage data. This facility has been available since late 2020, and if the obfuscation setting is on in the Microsoft 365 admin center, all usage data, including the data used by the admin center and Graph API requests, is “de-identified” by replacing information like user principal names and display names with a system-generated string.

It’s therefore important to check the setting and reverse it if necessary for the duration of the script to make sure that you can download “real” user information. If you don’t, there’s no way of matching a value like FE7CC8C15246EDCCA289C9A4022762F7 with a user principal name like Lotte.Vetler@office365itpros.com.
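
Checking and resetting the setting can be scripted too. Here’s a hedged sketch using the admin report settings resource in the Graph (the displayConcealedNames property, which needs the ReportSettings.ReadWrite.All permission): note the current value, turn concealment off for the run, and restore it afterwards.

# Minimal sketch: temporarily turn off usage report obfuscation, then restore the original setting
Connect-MgGraph -Scopes "ReportSettings.ReadWrite.All"
$Settings = Invoke-MgGraphRequest -Method GET -Uri "https://graph.microsoft.com/v1.0/admin/reportSettings"
$OriginalValue = $Settings.displayConcealedNames
if ($OriginalValue -eq $true) {
    # Show real user names while the script runs
    Invoke-MgGraphRequest -Method PATCH -Uri "https://graph.microsoft.com/v1.0/admin/reportSettings" `
      -Body (@{ displayConcealedNames = $false } | ConvertTo-Json)
}

# ... download and process the usage data here ...

if ($OriginalValue -eq $true) {
    # Put the original setting back
    Invoke-MgGraphRequest -Method PATCH -Uri "https://graph.microsoft.com/v1.0/admin/reportSettings" `
      -Body (@{ displayConcealedNames = $true } | ConvertTo-Json)
}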

Fortunately, I had a lot of code to repurpose, so the script wasn’t difficult to write. You can download the complete script from the Office 365 for IT Pros GitHub repository.

Finding Areas for Focus

Getting back to the original question, I assume the idea of including job titles and departments with Copilot usage data is to figure out where to deploy assistance to help people understand how to use Copilot in different apps. You could do something like this to find the departments with Copilot users who have no activity in the report period (90 days).

# $Report is assumed to hold the combined usage/account data built earlier in the script,
# with a Department property for each user and a blank LastActivityDate for inactive users
$GroupedReport = $Report | Where-Object { [string]::IsNullOrEmpty($_.LastActivityDate) } |
    Group-Object -Property Department | ForEach-Object {
        [PSCustomObject]@{
            Department = $_.Name
            UserCount  = $_.Group.Count
        }
    }

$GroupedReport | Sort-Object -Property Department | Format-Table -AutoSize

Department               UserCount
----------               ---------
Analysis and Strategy            3
Business Development             1
Core Operations                 57
Editorial                        1
Group HQ                         1
Information Technology           3
Marketing                       22
Planning & Action                1
Project Management               1
Research and Development         1

With this kind of output, the team driving Copilot adoption and use for the organization would be wise to spend some time with the Core Operations and Marketing departments to ask why so many of their users don’t appear to be using Copilot.

As noted above, understanding how to use PowerShell to mix and match data sources to answer questions is a valuable skill. There’s lots of data available in a Microsoft 365 tenant. That data is there to be used!


Need some assistance to write and manage PowerShell scripts for Microsoft 365? Get a copy of the Automating Microsoft 365 with PowerShell eBook, available standalone or as part of the Office 365 for IT Pros eBook bundle.

How Microsoft 365 Copilot Tenants Benefit from SharePoint Advanced Management (6 May 2025)
https://office365itpros.com/2025/05/06/sharepoint-advanced-management-2/

Ignite Announcement About SAM for Copilot Customers Misinterpreted by Many

At the Ignite 2024 conference, Microsoft announced that “Microsoft 365 Copilot will now include built-in content governance controls and insights provided by SharePoint Advanced Management.” At the time, and still broadly believed, the assumption was that Microsoft would give customers holding Microsoft 365 Copilot licenses SharePoint Advanced Management (SAM) licenses. Maybe even a single SAM license would be sufficient to license SAM technology alongside Copilot. That’s not the case.

If you’ve been waiting for a SAM license to appear in your tenant, you’ll be disappointed and won’t see SAM listed in the set of tenant subscriptions. Don’t be swayed by the banner in the SharePoint Online admin center announcing that your SharePoint Advanced Management subscription is enabled (Figure 1). It’s not. Access to SAM features is granted through a check enabled in code for the presence of Copilot. The necessary update is now broadly available to customers.

Figure 1: SharePoint Advanced Management options in the SharePoint admin center

SAM Features for Microsoft 365 Copilot Customers

The facts are laid out in the SAM documentation. Customers with eligible Copilot licenses can use some, but not all, SAM functionality without a SAM license. Here’s the list:

  • Site Lifecycle Policy
    • Inactive SharePoint sites policy
    • Site Ownership Policy
  • Data Access Governance (DAG) Insights
    • “Everyone Except External Users” (EEEU) insights
    • Sharing Links and Sensitivity Labels
    • PowerShell: Permission state report for SharePoint and OneDrive Sites, and Files
    • Sharing links report
  • Site Access Review
  • Restricted Content Discovery (RCD – enabled via PowerShell)
  • Restricted Access Control (RAC) for SharePoint and OneDrive for Business.
  • Recent Admin Actions and Change History
  • Block Download Policy
    • SharePoint and OneDrive sites
    • Teams recordings

There’s some good stuff here, particularly Restricted Content Discovery (RCD), the Site Lifecycle Policy to manage inactive sites, and the Block download policy. Every tenant with Microsoft 365 Copilot should consider enabling RCD to block Copilot access to sites containing sensitive Office and PDF files and sites containing old and obsolete material (the digital rot or debris that clutters up so many tenants).

The problem with Copilot reusing sensitive material in its responses is obvious. The issue with Copilot reusing old, obsolete, and potentially misleading content in its responses is equally problematic, especially if human checks don’t catch errors in responses. Copilot doesn’t know when a Word document written ten years ago is outdated and inaccurate. All Copilot sees is words that can be processed and reused.

When SAM is Needed

All of which brings me to a point where a SAM license is required. In my case, I wanted to test the “extend SharePoint protections with a default sensitivity label” feature. The idea here is to make sure that unlabeled files receive protection when downloaded by applying a sensitivity label with equivalent rights to those enjoyed by site users. Defining a default sensitivity label for a document library already requires an Office 365 E5 license or equivalent. Why this slight extension wanders into the need to have SAM is another example of bizarre Microsoft licensing.

The documentation notes that Copilot can’t currently open files with sensitivity labels applied in this manner. This means that Copilot cannot extract the protected content to use in its responses because it doesn’t have the right to do so. However, Copilot can search the metadata of labeled files and show that metadata to those who perform searches. Restricted Content Discovery is the right way to block Copilot access to files.

Anyway, without a SAM license, I can’t test. Do I want to pay Microsoft for a license for the privilege of testing their software? I don’t think so.

Copilot in Word for iOS

In closing, I attempted to use a new feature in Word for iOS (and Android) to dictate some notes for this article for Copilot to reason over and produce a draft. The feature is covered in MC1060866 (23 April 2025) and deployment has begun, which I guess is why I could use it. The dictation part worked, even if some of my words were misunderstood (Figure 2). But any attempt to have Copilot do some magic failed utterly. I guess that AI can’t help me…

Figure 2: Dictating text in Word for iOS for Copilot to process


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365.

Copilot’s Solution to Fix Grammar and Spellings (25 April 2025)
https://office365itpros.com/2025/04/25/fix-grammar-and-spelling-errors/

Fix Grammar and Spelling Problems with a Single Click

I am indebted to Paul Robichaux for pointing out a notable example of where Copilot for Word could help authors rewrite text to be clearer and more concise. The offending article is in the Microsoft Graph documentation for a topic called “enhanced personalization control overview.” It’s part of a set of controls for Microsoft 365 features such as name pronunciation and user pronouns in an area that’s been under development for several years. The text is turgid, unclear, and difficult to understand (Figure 1).

Figure 1: This text is perfectly clear, concise, and highly readable

The Copilot Rewrite

I asked Copilot for Word to make the original text more concise and understandable. Copilot produced this text:

Microsoft 365 Copilot customizes itself for each user, assisting with daily tasks based on their work knowledge. Both tenant administrators and users can manage this personalization using advanced control features.

Discover how Microsoft 365 Copilot enhances personalization, ensures privacy, and provides control through improved settings.

The corrected text is still not good. Human intervention is necessary to explain how Copilot “can become personalized to each user” (original text) or how Copilot “customizes itself for each user” (revision), and so on. It is a stark warning of the danger of accepting AI-generated text without question. Not only can customer communications end up containing absolute rubbish, but the danger also exists that AI-generated incorrect and misleading text ends up being stored in a file and reused ad nauseam by Copilot when it generates responses to future user prompts.

You might wonder why the Microsoft writers did not ask Copilot to refine their text. Well, I do not know if they did or not, but it could be that because the text is about a new feature that does not yet exist, Copilot could not find anything better to say in the Graph or in its LLMs. Remember, generative text depends on what has gone before. Copilot can rehash material it knows about, but it cannot write material about a new topic from scratch.

The Copilot Promise to Fix Grammar and Spelling Errors

Which brings me neatly to message center notification MC1060868 (23 April 2025, Microsoft 365 roadmap item 483954), which promises a new Copilot “fix grammar and spellings” feature that will address all grammar and spelling problems found in text with a single click. General availability of the feature is due in late April 2025 with deployment scheduled to complete worldwide by mid-June 2025.

Microsoft doesn’t say what languages are supported, but I assume that the feature will appear in all the languages supported by Copilot. MC1060868 contains no detail about which Copilot apps will benefit. Copilot for Word is an obvious target, and I assume that Copilot for Outlook will also receive help to tidy up email communications. As to the other apps, I guess we will see after the feature arrives.

It is a logical progression to have a single-pass process to find and remedy common errors in documents. Word has options to check for spelling and grammar errors as users type text into documents. The difference here is that Word suggests and nudges people when it detects potential errors, whereas Copilot will go ahead and rewrite text to remove errors. It is then up to the user to decide whether to keep or discard the Copilot rewrite. Overall, Copilot’s one-click solution is a more proactive approach to helping people generate better text.

But is it Possible to Fix Grammar and Spelling with One Click?

That is, if everything works. The history of software designed to help people write better text is littered with dead ends. Does anyone pay much attention to the recommendations of Microsoft Editor? Why do people continue to subscribe to services like Grammarly when Microsoft offers spelling and grammar checking in its products? Perhaps we are heading to a new golden age of beautiful text created by humans and enhanced by AI. Maybe, and I am sure the prospect will be welcomed by those who write the Graph documentation. But I am not holding my breath.


Make sure that you’re not surprised about changes that appear inside Microsoft 365 applications by subscribing to the Office 365 for IT Pros eBook. Our monthly updates make sure that our subscribers stay informed.

How SharePoint Online Restricted Content Discovery Works (2 April 2025)
https://office365itpros.com/2025/04/02/restricted-content-discovery-works/

Restricted Content Discovery Hides SharePoint Content from Copilot and Agents

The problem of poor permission management has surfaced from time to time in the history of SharePoint. The Office Delve app caused the last big upheaval within Microsoft 365 when it demonstrated an uncanny ability to surface sensitive documents to user view. Of course, Delve was never the problem. The issue is due to careless permission assignment, usually at site level.

When Microsoft launched Copilot in March 2023, it soon became apparent that Copilot is even better than Delve at finding and reusing documents, including files that an organization would prefer to remain restricted. Microsoft’s short-term answer was Restricted SharePoint Search, a horrible but expedient solution that works on the basis of an allow list for enterprise search that restricts users to searching approved sites. Copilot always works as the signed-in user, so the limits applied to users also apply to Copilot, stopping the AI from using material stored in unapproved sites in its responses.

Restricted Content Discovery (RCD) is the latest solution to control unfettered access to confidential information stored in SharePoint Online sites. RCD is part of the SharePoint Advanced Management (SAM) suite. Microsoft is making SAM available to tenants with Microsoft 365 Copilot licenses via a code update that’s slowly deploying.

How Restricted Content Discovery Works

Restricted Content Discovery works by adding a flag to files stored in designated SharePoint Online sites when an administrator marks a site for RCD through the SharePoint admin center or PowerShell. Figure 1 shows the “restrict content from Microsoft 365 Copilot” option in the admin center. When a site is selected for RCD, SharePoint sets a site-level property that causes index updates for every file in the site. Although RCD is applied on a site basis, SharePoint indexing happens at the file level, so a fan-out process must find and reindex every file in a site before RCD becomes effective for that site.

The time required to update the index for a site is highly dependent on the number of items in the site. Microsoft says that “for sites with more than 500,000 items, the Restricted Content Discovery update could take more than a week to fully process and reflect in search and Copilot.”

Figure 1: Setting the Restricted Content Discovery flag for a SharePoint Online site

The indexing update does not remove items from the tenant index. If it did, items would be unavailable for eDiscovery searches, auto-label policies for retention and sensitivity labels, and other solutions. Instead, the flag set on files instructs Copilot to ignore those files when it consults the Graph to find matching content to help ground user prompts. The same approach is used by the Data Loss Prevention (DLP) policy to block Copilot access to files assigned specific sensitivity labels.

The block applies to anywhere Copilot for Microsoft 365 can use SharePoint Online files, including Copilot agents. It doesn’t affect how site-level search works, nor does it interfere with other Purview solutions like eDiscovery, content searches, or DLP. However, content from sites enabled for RCD doesn’t appear in enterprise-level searches.

RCD Management with PowerShell

PowerShell can be used to manage RCD for sites. Make sure that you use a recent version of the SharePoint Online management module (I used Microsoft.Online.SharePoint.PowerShell version 16.0.25715.12000). For example, to enable RCD for a site, run the Set-SPOSite cmdlet to set the RestrictContentOrgWideSearch property to $true.

Set-SPOSite -Identity https://office365itpros.sharepoint.com/sites/rabilling -RestrictContentOrgWideSearch $true

To remove RCD from a site, set the value for RestrictContentOrgWideSearch to $false:

Set-SPOSite -Identity https://office365itpros.sharepoint.com/sites/rabilling -RestrictContentOrgWideSearch $false

Much the same reindexing process must occur after RCD is disabled for a site before its files become available to Copilot again.

To generate a list of sites with RCD enabled, run the Start-SPORestrictedContentDiscoverabilityReport command to create a job on a queue for processing. The Get-SPORestrictedContentDiscoverabilityReport cmdlet reports the status for the job, which eventually reports “Completed.”

Start-SPORestrictedContentDiscoverabilityReport

Generating the report will take some time. Are you sure you want to proceed?
Continue with this operation?
[Y] Yes  [N] No  [?] Help (default is "Y"): y

RunspaceId           : 1d839c7e-c0bf-4c11-be94-20179f2335e2
Id                   : 02aa91ea-5e12-43de-91a1-a58275d3b201
CreatedDateTimeInUtc : 03/31/2025 16:09:52
Status               : NotStarted

Get-SPORestrictedContentDiscoverabilityReport

RunspaceId           : 1d839c7e-c0bf-4c11-be94-20179f2335e2
Id                   : 02aa91ea-5e12-43de-91a1-a58275d3b201
CreatedDateTimeInUtc : 03/31/2025 17:03:52
Status               : Completed

To download the RCD insights report, run the Get-SPORestrictedContentDiscoverabilityReport cmdlet and pass the GUID (id) for the report. This value is shown in the Get-SPORestrictedContentDiscoverabilityReport output:

Get-SPORestrictedContentDiscoverabilityReport -Action Download -ReportId 02aa91ea-5e12-43de-91a1-a58275d3b201
Report RestrictedContentDiscoverabilityReport_1743437651407.csv downloaded successfully

Microsoft documentation says that “the downloaded report is located on the path where the command was run.” This is incorrect. The file ends up in whatever folder the PowerShell session starts up in. In my case, I ran the job when positioned in c:\temp and the file ended up in c:\windows\system32. The easy fix here is to use a PowerShell profile to define the folder where PowerShell starts up.

The contents of the “insights” report aren’t too exciting (Figure 2) and could be easily generated by looping through sites with PowerShell to find those with the flag set.

Figure 2: Restricted Content Discovery is enabled for these sites
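
For comparison, a quick PowerShell loop produces much the same list. This is a sketch rather than a tuned script: it fetches each site individually because I’m not certain the RestrictContentOrgWideSearch property is populated when sites are returned in bulk, so expect it to be slow in tenants with many sites.

# Minimal sketch: list the sites with the Restricted Content Discovery flag set
Connect-SPOService -Url https://office365itpros-admin.sharepoint.com
[array]$RCDSites = Get-SPOSite -Limit All | ForEach-Object {
    # Fetch each site individually to make sure the property is returned
    Get-SPOSite -Identity $_.Url | Where-Object { $_.RestrictContentOrgWideSearch -eq $true }
}
$RCDSites | Select-Object Url, Title, Template | Format-Table -AutoSize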

Restricted Content Discovery for All

It’s a reasonable guess that any Microsoft 365 tenant that’s interested in Copilot has some sensitive information stored in SharePoint Online sites. If you’re in this situation, you should consider RCD as the front-line method to prevent that information leaking out through Copilot. I’d also deploy the DLP policy to restrict Copilot access as a backup. Between the two lines of defence, it’s unlikely that inadvertent disclosure of confidential data will happen, and that’s a good thing.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Copilot in Outlook Gets a Revamp (21 March 2025)
https://office365itpros.com/2025/03/21/copilot-for-outlook-ui/

Tweaks to Copilot for Outlook Make the Functionality More Accessible

On Tuesday, I reported that I thought the new Facilitator agent in Teams chat is a good example of AI performing a task well. It’s evidence of how the initial rush of deploying AI everywhere to anything that could have a Copilot label applied is moderating into better implementations.

Message center notification MC892651 (last updated 18 March 2025, Microsoft 365 roadmap item 397092) could be regarded as being in the same category. In this case, the UI for Copilot interactions in Outlook has received what Microsoft terms “major design improvements” for the new Outlook on Windows and Mac desktops, OWA, and Outlook mobile clients. Outlook classic remains unaltered.

Perhaps because it involves major improvements or a wide range of clients, the deployment of the update has been delayed. Microsoft originally intended to have full deployment done by late February 2025. That date is now late April 2025. When this happens, it normally means that Microsoft had to halt the deployment to fix some problems.

No New Functionality in Revamped UI

According to Microsoft, the revamped UI doesn’t include any new functionality. I never saw the ‘rewrite like a poem’ option before (which might have improved some of my email enormously), so the fact that the new layout and navigation makes this option accessible (Figure 1) is proof that the overhaul works.

Figure 1: The revamped Copilot for Outlook UI in the new Outlook for Windows

Of course, things work differently on mobile devices, but the changes seem to make things better there too (Figure 2).

Figure 2: Copilot for Outlook mobile

By comparison, the Copilot options in Outlook classic are a tad austere (Figure 3), just like the options in the other clients before the change. The changes made in the other clients prove once again that good design is important when it comes to making technology accessible to users.

Figure 3: Copilot options in Outlook classic

UI Great, Text Awful

I like the UI changes and think they improve how Copilot for Outlook works. However, the changes do nothing to improve the quality of the written text generated by Copilot, which remains bland and overly effusive to my taste. I guess that’s my personal approach to email shining through because I favor brief to-the-point messages over lengthy missives.

The late Mark Hurd (CEO of HP at the time) once advised me to always put the most important information in a message into the first paragraph so that recipients could quickly review items in their inbox without needing to read long messages on mobile devices (Blackberries and iPAQs then). Technology has moved on, but the advice is still true, especially as so many different forms of mobile devices are now in use. Maybe Copilot for Outlook needs a rewrite in one brief paragraph option.

More Change to Come

Although it sometimes seems much longer, we’re still only two years into the Copilot era. We’ll see more changes like this as Microsoft refines and enhances how Copilot is integrated into apps. Now that they’ve given Outlook a nice new UI, perhaps they’ll do the same for Excel and PowerPoint to make it easier to use Copilot in those apps. Or maybe that’s just me moaning because I’m not as proficient as I should be with those apps.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Use Data Loss Prevention to Stop Microsoft 365 Copilot Chat from Processing Documents in Its Responses (20 March 2025)
https://office365itpros.com/2025/03/20/dlp-policy-for-microsoft-365-copilot/

DLP Policy for Microsoft 365 Copilot to Restrict Access to Sensitive Documents

Ever since the introduction of Microsoft 365 Copilot in March 2023, organizations have struggled to stop the AI consuming confidential or sensitive documents in its responses. Some of the early tools, like Restricted SharePoint Search, were blunt instruments hurried out as responses to customer requests. Microsoft’s current best answer is SharePoint Restricted Content Discovery (RCD), a feature normally licensed through SharePoint Advanced Management (SAM). All tenants with Microsoft 365 Copilot licenses are due to receive access to RCD and the deployment process is ongoing.

Microsoft says that the key use case for RCD is to “prevent accidental discovery of [files stored in] high-risk sites.” RCD works by limiting the ability of end users to search selected sites. By excluding sites from search, RCD prevents Copilot Chat (and agents based on Copilot Chat) from using the files stored in those sites in its responses. It’s still possible for Copilot to use information from a sensitive document if the user has the file opened in an app like Word. At this point, the sensitive content is open in memory and available for Copilot to process.

Blocking files from user access doesn’t stop system functions like eDiscovery working.

Update April 21: MC1059677 announces the extension of DLP protection to Copilot in Office apps (Word, PowerPoint, Outlook, and Excel).

Blocking Access to Individual Files

RCD is a good way to cast a protective net across multiple sites. But what about protecting individual files that might be in sites that aren’t covered by RCD? Until now, the answer has been to use sensitivity labels to stop Copilot Chat using sensitive files to generate its responses. Although sensitivity labels can stop Copilot using the content of protected files, they cannot prevent Copilot from finding references to protected files through a metadata search.

Creating a DLP Policy for Microsoft 365 Copilot

A solution to that problem might be coming in the form of a new type of Data Loss Prevention (DLP) policy. The feature is described in message center notification MC937930 (last updated 6 February 2025, Microsoft 365 Roadmap ID 423483). DLP policies are usually used to block external sharing of confidential information, like Teams meeting recordings. Blocking files for internal consumption is a new step.

Essentially, tenants can create a DLP policy to check for specific sensitivity labels and block Copilot Chat (and agent) access to files with those labels. The functionality is now in preview and is scheduled for general availability in June 2025 (complete worldwide by the end of July 2025). Some gaps are always expected in preview code, and the gaps right now include alerts, incident reports, policy simulation, and audit records. In other words, it’s very hard to know when a DLP policy match happens to block access. But testing indicates that the DLP policy works.

The DLP policy for Microsoft 365 Copilot is a special form of policy in that the policy only covers Copilot and no other type of data (Figure 1).

Figure 1: Creating a DLP policy for Microsoft 365 Copilot

The rules used in a DLP policy for Microsoft 365 Copilot are simple. The policy checks if a file has a specific sensitivity label, and if the sensitivity label is found, DLP executes the action to “prevent Copilot from processing content” (Figure 2). A rule can check for the presence of one or more sensitivity labels. In some respects, it might be easier to create a separate rule for each label.

Figure 2: Creating a DLP rule for Microsoft 365 Copilot

Testing the DLP Policy for Microsoft 365 Copilot

To test the new DLP policy, I created several documents referring to regulations governing cryptocurrency in Iceland (a topic selected at random because I knew that my tenant was unlikely to store any files relating to the topic). I used Copilot for Word to generate the text for each file and added a reference to a mythical regulation to the text of each document to give Copilot an easy target to find. The first check asked Copilot Chat to find documents relating to cryptocurrency in Iceland with special relevance to the regulation. The sensitivity labels assigned to the documents were not covered by a DLP policy for Microsoft 365 Copilot, and Copilot found all the documents (Figure 3).

Figure 3: Copilot finds confidential documents without sensitivity labels monitored by a DLP policy

After applying sensitivity labels covered by the DLP policy for Microsoft 365 Copilot to two of the three documents, the search was rerun and Copilot found only one document (Figure 4).

Figure 4: The DLP policy for Microsoft 365 Copilot blocks files protected by specific sensitivity labels

I don’t pretend this to be a full test. However, it’s the only way to check preview software that doesn’t generate audit records or other traces to show when DLP policy matches occur to force DLP to execute the defined actions.

New DLP Policy Shows Promise

I’ll look forward to retesting the DLP Policy for Microsoft 365 Copilot after the software reaches GA and the full array of auditing and reporting options are available. Auto-label policies can only apply sensitivity labels to Office files and PDFs, and I suspect that this limitation won’t be lifted. That’s a pity because it stops the DLP policy being able to control access to items like the .MP4 files used for Teams Meeting Recordings (transcripts).

The nice thing is that users see no trace of a sensitive document show up in Microsoft 365 Copilot Chat. Unlike basic sensitivity label protection, which allows Copilot Chat to show metadata found in its searches, the DLP policy is silent. And that’s just the way you’d want it to be when dealing with sensitive data.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Facilitator Agent Brings AI-Powered Notetaking to Teams Chat (18 March 2025)
https://office365itpros.com/2025/03/18/facilitator-agent-chat/

Facilitator Agent Extracts Value from Teams Chat

In an article last month, I discussed why Microsoft 365 Copilot works better for some people than it does for others. The article is based on a blog by Abram Jackson, a program manager working on Microsoft 365 Copilot, and one of the points he makes is that AI fails when it doesn’t have the right data to process. This is why Copilot is so good at summarizing a bounded set of data such as a Teams meeting transcript or email thread and less good at other tasks.

Which brings me to a new bounded AI use in the Teams Facilitator “collaborative communication” agent (see message center notification MC1017117, last updated 10 March 2025, Microsoft 365 roadmap item 476811). The agent has been available in targeted release and is heading for general availability in April 2025. Facilitator is available for meetings and chats, but here I focus on chats because this is an area where AI hasn’t ventured before. According to Microsoft, “the Facilitator creates and maintains an up-to-date summary of important information as the conversation happens, including key decisions, action items, and open questions to resolve.”

The administrator documentation and user documentation cover the details, so there’s no need to repeat them here. Essentially, you’ll need a Microsoft 365 Copilot license to use Facilitator (otherwise known as AI Notes). Note generation is supported for English now with support for more languages in the pipeline.

Control over who can use Facilitator is exerted by allowing people access to the Facilitator app in the Teams admin center. Microsoft says that after general availability, the app is enabled by default and can be used in chats by enabling the AI Notes option (click the icon to the right of the Copilot icon). Let’s see what happens.

Using AI Notes in a Chat

When a chat starts, it’s an empty thread and there’s nothing for AI to process. In fact, AI cannot process information until it has sufficient data to understand what’s happening. This is the situation in Figure 1: Facilitator is enabled for the chat, but only three messages are in the thread and that’s not enough.

Figure 1: The Facilitator agent needs some messages to process before it can do anything

This isn’t a problem because the intention behind Facilitator is that it will help chat participants understand what’s been discussed in a thread. It’s easy to understand the conversation after three messages. It’s much more difficult to do so after a hundred messages in a fast-moving debate. The same situation occurs for Microsoft 365 Copilot in a Teams meeting where a certain amount of data must accumulate in the meeting transcript before Copilot becomes active.

As the chat develops, Facilitator begins to generate notes (Figure 2) to capture the major points raised in the chat, any decisions made, and any questions that remain unanswered. Facilitator updates the notes displayed in the pane periodically and highlights new information that a chat participant hasn’t seen. Like other Copilot implementations, reference numbers allow users to access the source for a note.

Figure 2: AI Notes generated by the Facilitator agent as a chat develops

At the end of the chat, any of the chat participants can ask Facilitator a question by using an @Facilitator mention and entering the question (Figure 3).

Figure 3: The Facilitator agent summarizes a chat

Alternatively, a participant with access to the AI Notes can copy the notes and paste them into the chat. This is a good way to share AI Notes with chat participants who don’t have a Microsoft 365 Copilot license as those people cannot enable and view AI Notes for the chat.

External Participants Turn Off Facilitator

The Facilitator agent can’t be used in chats that involve external participants (guest users or external federated chats). This is likely because no mechanism is available in a chat to allow people to grant consent for their messages to be processed by an agent. When people join a meeting, they have the chance to grant consent for transcription, and it’s the transcript that’s used by Microsoft 365 Copilot to summarize the meeting or answer questions about the proceedings.

Facilitator is a Nice Tool to Have

I like Facilitator very much. It’s an example of a focused application of AI LLMs reasoning over a bounded set of data to generate results, and it works well in practice. Facilitator is not enough to justify the full price of a Microsoft 365 Copilot license, but it is a step in the right direction and a sign that we’re moving away from what some call the “party tricks” of Copilot to the implementation of some really useful tools.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

]]>
https://office365itpros.com/2025/03/18/facilitator-agent-chat/feed/ 1 68453
Why Microsoft 365 Copilot Works for Some and Not for Others https://office365itpros.com/2025/02/20/make-copilot-useful/?utm_source=rss&utm_medium=rss&utm_campaign=make-copilot-useful https://office365itpros.com/2025/02/20/make-copilot-useful/#comments Thu, 20 Feb 2025 07:00:00 +0000 https://office365itpros.com/?p=68101

I Can’t Wait for Agentic Experiences to Make Copilot Useful

We’re all on a journey to understand how to use artificial intelligence effectively to improve systems, lives, and human existence. If you pay for the necessary licenses, Copilot is everywhere within the Microsoft 365 ecosystem, both as helpers deployed in desktop apps like Word, Teams, and PowerPoint, and as custom agents that tenants can develop and deploy, albeit without the necessary tools to manage potentially thousands of agents created by citizen developers.

According to Microsoft CEO Satya Nadella, Microsoft wants to make it as simple for people to create agents as it is to create an Excel worksheet, which might mean the creation of the “highly customized agentic experiences” referred to in Microsoft 365 message center notification MC985480 (January 22). I don’t quite know what that phrase means, and the clarifying text that said it “means you can design unique prompts, connect to any LLM, and integrate these custom agents with Microsoft 365 Copilot” wasn’t much help either. When I asked Copilot, it struggled with the concept too (Figure 1). In any case, I’m sure that we’ll all be happy in our highly customized agentic world when it arrives.

Figure 1: Copilot attempts to define highly customized agentic experiences

Why Today’s AI Falls Short of its Hype

All of which brings me to a thoughtful article in the Tomorrow’s Blueprint blog entitled “Why Others Think AI Is a Miracle But You Think It’s Useless.” The author is Microsoft product manager Abram Jackson, now deeply involved in the development of Microsoft 365 Copilot. The core of the article is an assertion that:

“Today’s AI falls short of its hype for many due to three big reasons:

  • It often doesn’t have the data it needs to work with
  • Defining tasks precisely is very difficult
  • There’s little AI can do other than give you text or images.”

Abram knows much more about AI than I do. I reckon that he has captured the problems faced by many organizations as they consider how to extract value from a potentially massive investment in Copilot licenses.

Without access to data, Copilot can do nothing. The magic of Microsoft 365 Copilot, if some exists, is the Microsoft Graph, or access to the documents, emails, and Teams messages stored within Microsoft 365. Yet the legacy of some older Microsoft decisions around collaboration strategy forced organizations to restrict SharePoint Search to stop Copilot revealing information to anyone who asked. As it turns out, it is hard to stop Copilot using data because even document metadata can reveal secrets.

I like the way Abram discusses the issue of defining tasks. Math works because the answer is either right or wrong. Copilot works very well when given well-defined tasks to do, like summarizing a meeting transcript or extracting tasks for people to consider. The same goes for scanning an email thread or summarizing a Word document. Generating text is less satisfactory unless the user is very precise in their prompt and grounds Copilot with some suitable input, like documents to work from. The promise of early demos where Copilot generated project reports and other material in the blink of an eye is never attained where loose prompting gives the AI free rein to indulge itself.

How People Need to Use AI

The summary is that to extract value from AI (and Microsoft 365 Copilot in particular), users must:

Understand if a task is valuable and not prone to hallucinations. Asking Copilot for Word to scan a document and decide if it is well-structured and how to make improvements is valuable for many people who aren’t natural writers. Asking Copilot for Word to generate the initial document introduces the possibility of hallucinations.

Work to define the task precisely: Asking Copilot to do something very precisely with clear boundaries and guidelines will generate much better results than dashing off a quick prompt. Grounding a prompt with some relevant information, like several pertinent documents, will always help Copilot to generate better information.

Translate the result generated by the AI into the form you need it to be. For chat, the introduction of Copilot pages has proven useful because it allows users to easily capture the output generated by Copilot for reuse. But will the slides generated by Copilot for PowerPoint be the type you need? Or can Copilot for Excel really perform the computations you want? Of course, they can, but only with practice and perseverance on the part of the human.

As Abram says, this approach “isn’t natural and it is time-consuming.” It comes about because Copilot is essentially an eager assistant that wants to work but will do stupid things unless you tell it precisely what to do and how to do it. Expanding on the example shown in Figure 1, adding context and direction to the prompt gives Copilot the chance to deliver a much better answer. Prompts can now be up to 128,000 characters, so there’s lots of room for comprehensive instructions.

Figure 2: Make Copilot useful by giving the AI better and more detailed instructions

The Bing Conundrum

One last point about data being available for Copilot to work with. I’m not sure about Abram’s statement that “hallucination is largely a solved problem for Microsoft Copilot.” I see odd stuff generated all the time. Abram justifies his claim by saying that “Copilot is trained to only respond with information it has been able to find through search.”

Copilot depends on Bing and Bing isn’t very good at searching. Take this website. Despite the ease with which Google has indexed and searched all my articles for years, Bing stubbornly refused to touch the site. I only discovered this fact when creating some declarative agents that used office365itpros.com as a source. Since then, the best efforts of WordPress support and my own attempts to navigate the online Bing webmaster advice have only just persuaded Bing to start indexing some pages. Some of the blocks are quite silly. One problem that caused Bing to refuse to index pages was the lack of an alt tag for a graphic in a sidebar.

If Copilot had better search facilities, it could generate better answers because it has better data to work with.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

]]>
https://office365itpros.com/2025/02/20/make-copilot-useful/feed/ 1 68101
Microsoft Launches Copilot for All Initiative https://office365itpros.com/2025/01/16/microsoft-365-copilot-chat-jan25/?utm_source=rss&utm_medium=rss&utm_campaign=microsoft-365-copilot-chat-jan25 https://office365itpros.com/2025/01/16/microsoft-365-copilot-chat-jan25/#comments Thu, 16 Jan 2025 07:00:00 +0000 https://office365itpros.com/?p=67692

New Agent Capabilities for the Free Microsoft 365 Copilot Chat App

Infused with the zealotry of true believers, Microsoft announced Copilot for All on January 15, 2025 to reveal the details of the complicated Copilot renaming they previewed in December. And the new logo, of course.

In a nutshell, Microsoft is creating an “on-ramp” to give Microsoft 365 tenants that haven’t invested in expensive Microsoft 365 Copilot licenses the chance to use agent technology “grounded in Microsoft Graph data.” The idea here is to encourage commercial customers to run a mix of Copilot, with some users having the full-blown licensed version while others experiment with the free-to-use version. Figure 1 shows the relative capabilities of the two Copilot options.

Figure 1: Functionality available in the two Microsoft 365 Copilot products (source: Microsoft)

Lots of Functionality in Microsoft 365 Copilot Chat

The free-to-use Microsoft 365 Copilot Chat app includes a lot of functionality in terms of its ability to process user prompts against information available on web sites (providing those sites are indexed by Bing). Recently, Microsoft added features like Copilot pages and the image generator (Figure 2). Microsoft says that limitations exist on the number of images that can be generated daily. I guess I don’t create many images as I haven’t experienced any problems.

Figure 2: Generating an image in Microsoft 365 Copilot Chat

The Chat client has enterprise data protection, so data is secure, protected, and actions are audited and captured in compliance records.

Pay-as-you-go Agents

The big news is that customers will be able to create and run custom agents grounded against “work data” on a pay-as-you-go (PAYG) metered basis. PAYG means the tenant must sign up for an Azure subscription with a valid credit card before the agent will run. Agent activity is charged against the subscription using “messages” as the metering unit (an action performed by an agent can consume up to 25 messages). Grounding against work data means that the agents can interrogate information available in the Microsoft Graph. Technically speaking, Graph data includes Exchange, Teams, SharePoint, and OneDrive plus anything imported into the Graph through a third-party connector. However, the capabilities of today’s agents are limited to SharePoint and OneDrive sites plus Graph connectors. In any case, there is some magic here to exploit because if an organization can import its data into the Graph, agents can reason over that data to create responses to user prompts, providing PAYG is set up for the tenant.
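
Because agent activity is metered, it’s worth doing a back-of-the-envelope cost estimate before letting agents loose. Here’s a rough sketch in PowerShell; the per-message rate and the usage figures are my assumptions for illustration only, so check the current meter rates before budgeting:

# Rough cost model for a PAYG agent (illustrative figures only)
$MessageRate = 0.01            # Assumed pay-as-you-go price per message (USD) - verify against current pricing
$MessagesPerResponse = 25      # Worst case: an agent action can consume up to 25 messages
$ResponsesPerUserPerDay = 10   # Assumed usage pattern
$Users = 100
$WorkDays = 20
$MonthlyCost = $MessageRate * $MessagesPerResponse * $ResponsesPerUserPerDay * $Users * $WorkDays
"Estimated monthly agent cost: {0:N2} USD" -f $MonthlyCost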

The custom agents are developed with Copilot Studio. I have spent some time working with Copilot Studio to build simple agents over the last few weeks. It’s not a terribly difficult task, but organizations do need to take the time to chart out how they plan to develop, deploy, and manage agents rather than rushing headlong into the brand-new world. Like any software, agents work best when some structure is in place.

The Big Differences between Microsoft 365 Copilot Chat and Microsoft 365 Copilot

Paying for agents to use Graph data does not deliver the full suite of capabilities enjoyed by those who invest in Microsoft 365 Copilot licenses. Figure 1 shows that Microsoft 365 Copilot includes a bunch of personal assistants where Copilot is built into Microsoft 365 apps like Teams, Word, Outlook, PowerPoint, and Excel. Sometimes, as in the case of the automatic document summary generated by Copilot in Word, the help is unwanted, but the personal assistants are very good at helping with other tasks, like summarizing long email threads or recapping Teams meetings.

Microsoft 365 Copilot also includes SharePoint Advanced Management (SAM). However, although Microsoft announced at Ignite 2024 that tenants with Microsoft 365 Copilot licenses would get SAM in early 2025, there’s no trace of these licenses turning up in any tenant that I have access to. License management can be complex and I’m sure that SAM will turn up soon.

Finally, PAYG access to Graph data does not include the semantic index. The index is generated automatically from Graph data in tenants with Microsoft 365 Copilot licenses to create a vector-based index of the relationships of items in the Graph. It’s an untrue urban legend that Microsoft 365 Copilot needs the semantic index to function. The semantic index enhances search results, but it’s not required for the chat app or agents to work.

In Simple Terms, Two Copilot Products

It’s easy to become confused by the naming of different elements within the Microsoft 365 Copilot ecosystem. It boils down to Microsoft offering free (with PAYG capabilities) and expensive Copilot products to Microsoft 365 customers. Microsoft obviously hopes that the free version will act as the on-ramp to full-fledged Copilot. It’s a reasonable tactic. Time will tell if it’s successful.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering the Microsoft 365 ecosystem.

]]>
https://office365itpros.com/2025/01/16/microsoft-365-copilot-chat-jan25/feed/ 4 67692
Microsoft’s Simple Message at Ignite: It’s All About AI https://office365itpros.com/2024/11/22/ignite-2024-ai/?utm_source=rss&utm_medium=rss&utm_campaign=ignite-2024-ai https://office365itpros.com/2024/11/22/ignite-2024-ai/#comments Fri, 22 Nov 2024 07:00:00 +0000 https://office365itpros.com/?p=67183

Copilot Branding Applied Liberally Across All Product Announcements at Ignite 2024

I decided to stay away from the Ignite 2024 conference in Chicago this week. The monetary investment to fly to Chicago, stay in a hotel, meals, lost time, and the conference fee outweighed the potential return. I would have liked to meet up with people, but the cost to attend what’s essentially a marketing event was way too high.

What’s clear from the announcements made at Ignite is that Microsoft is heavily focused at recouping the massive investments they’ve made to build out the datacenter infrastructure to deliver artificial intelligence functionality. That’s understandable in light of quarterly investments of around $20 billion in hardware, software, and datacenter fabric. Another factor is the need to extract more revenue from the Microsoft 365 installed base to offset a slowing in the growth of overall user numbers.

A Slew of AI Announcements at Ignite 2024

The net result is a slew of announcements for AI-infused functionality helpfully captured in the Ignite 2024 “Book of News.” The online document mentions Copilot 259 times and AI 278 times, which is a clear statement of where Microsoft’s PR priorities lie.

The announcements range from general availability for features that are already shipping (like Agents in SharePoint Online) to some very interesting developments for Teams, like the ability for Copilot in Teams to analyze information shared on-screen during meetings. Another thing that seized my attention was how Copilot can schedule focus time or 1:1 meetings similar to the way that the now-defunct Cortana Scheduler attempted to help users select optimum meeting slots. The ability to have live translation for multilingual meetings (rather than just from a single language into other languages) should also be popular in multinational organizations.

A welcome development is the introduction of detection of prompt injection in Purview Communication Compliance. After researchers at Black Hat 2024 described some vulnerabilities in Microsoft 365 Copilot Chat, including prompt manipulation, Microsoft said that they had addressed the issue without giving details. Now, Communication Compliance will detect and report attempts to inject prompts to “elicit unauthorized behavior from the large language model (LLM).”

Restricting Access to Information

On the tenant administrative side, the work to help organizations restrict the ability of Microsoft 365 Copilot to process documents continues. For example, a new DLP rule condition based on the sensitivity label assigned to documents can prevent Copilot summarizing information from documents or using content from documents in its responses. On the downside, it’s unbelievable that Microsoft can justify calling one new rule condition “Microsoft Purview Data Loss Prevention for Microsoft 365 Copilot.”

At a broader scale, Restricted Content Discoverability (RCD) will stop Copilot accessing documents in sites on a deny list. RCD is a more sensible and scalable approach than the 100-curated site allow list implemented in Restricted SharePoint Search.

I was pleased to hear that Microsoft plans to make SharePoint Advanced Management (SAM) licenses available to tenants with Microsoft 365 Copilot. I called for this to happen in an October 3 post. It didn’t make sense to ask customers to pay the $3/user/month fee for SAM to control aspects of Microsoft 365 Copilot that they pay $30/user/month for. Apparently, the roll-out of SAM licenses to eligible tenants will happen in early 2025.

Also in SharePoint Online, a new sensitivity label option will extend SharePoint site permissions to downloaded documents. The new configuration handles situations like when a user loses access to a site, or a file is deleted from a site. In these situations, the sensitivity label will recognize that the situation for a document has changed and block access. To implement the protection, you’ll need both an E5 license (to set a default sensitivity label for the site) and a SAM license.

Conditional Access for Generative AI

Not to be outdone by announcements by other development groups, the Entra ID team released details of Protect AI with a Conditional Access Policy, which is all about limiting access to AI services like Microsoft 365 Copilot and Security Copilot through conditional access policies.

To make the block work, Microsoft asks tenants to create two service principals to represent the Enterprise Copilot Platform and Security Copilot apps. The service principals represent the instantiation of the apps used by Copilot within a tenant and allow conditional access policies to monitor connections to the apps (read this article to discover more about sign-in activity for service principals). Conditional access policies can apply restrictions to app connections like enforcing multifactor authentication (MFA) or a certain type of strength for multifactor authentication, like requiring the use of a FIDO2 key.
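
Here’s a minimal sketch of creating those service principals with the Microsoft Graph PowerShell SDK. The app identifiers shown are placeholders; substitute the values for the Enterprise Copilot Platform and Security Copilot apps published in Microsoft’s documentation:

# Create service principals for the first-party Copilot apps so conditional access policies can target them
Connect-MgGraph -Scopes Application.ReadWrite.All
$CopilotAppIds = @(
   '00000000-0000-0000-0000-000000000001',   # Enterprise Copilot Platform (placeholder app ID)
   '00000000-0000-0000-0000-000000000002'    # Security Copilot (placeholder app ID)
)
ForEach ($AppId in $CopilotAppIds) {
   If (-not (Get-MgServicePrincipal -Filter "appId eq '$AppId'")) {
      New-MgServicePrincipal -AppId $AppId | Out-Null
   }
}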

I created a conditional access policy to require MFA for Copilot. It works, but the user experience isn’t great. For instance, Figure 1 shows what the user sees when an account that doesn’t use MFA attempts to connect to Microsoft Copilot.

Figure 1: Microsoft Copilot fails to connect due to the requirement for MFA

It seems like the user-facing experience doesn’t cope well with the error that results when the browser attempts to connect to the Enterprise Copilot Platform app. No doubt the chat client will get an update to resolve the problem.

Great Technology Revealed at Ignite 2024, But Someone’s Got to Pay

It’s great that Microsoft continues to push the boundaries of how AI can help Microsoft 365 tenants. However, we shouldn’t lose sight of the fact that Microsoft 365 Copilot is not yet widely used within the 400-million-plus installed base of Office 365 paid seats. It’s definitely in Microsoft’s interest to convince more of that installed base to buy Copilot, but it would be nice if every new feature that arrives didn’t come with the requirement for a new license, license upgrade, or add-on.


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365.

]]>
https://office365itpros.com/2024/11/22/ignite-2024-ai/feed/ 4 67183
Will Microsoft 365 Copilot Errors and Hallucinations Eventually Corrupt the Microsoft Graph? https://office365itpros.com/2024/10/18/copilot-errors-graph/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-errors-graph https://office365itpros.com/2024/10/18/copilot-errors-graph/#respond Fri, 18 Oct 2024 07:00:00 +0000 https://office365itpros.com/?p=66738

Copilot Errors in AI-Generated Text Can Persist and Spread

When I discussed working with Copilot Pages last Wednesday, I noted the usefulness of being able to capture output generated by Microsoft 365 Copilot as a response to a prompt in a Loop component. That’s the happy side of the equation. The dark side is that being able to capture AI-generated text so easily makes it easier for hallucinations and mistakes to sneak into the Microsoft Graph and become the source for further Copilot errors.

Take the example I showed in Figure 1 of the article where Copilot’s response captured in a page includes an incorrect fact about compliance search purge actions. Copilot reports that a soft-delete action moves items into the Deleted Items folder (in reality, the items go into the Deletions folder in Recoverable Items). This isn’t a big problem because I recognized the issue immediately. The Copilot results cited two documents and two web sites, but I couldn’t find the erroneous text in any of these locations, which implies that the knowledge came from the LLM.

Copilot Errors Can Persist

The text copied into the Copilot page included the error and was caught and corrected there. The content stored in the Loop component is accurate. But here’s the thing. When I went back to Microsoft 365 Business Chat (aka BizChat) to repeat the question with a different prompt asking Copilot to be explicit about what happens to soft-deleted items, the error is present once again, even though Copilot now cites the page created for the previous query (Figure 1).

Figure 1: Copilot generated text contains an error

At this point there’s not much more I can do. I have checked the Graph and other sources cited by Copilot and can’t find the error there. I’ve added a Copilot page with corrected information and seen that page cited in a response where the error is present. There’s no other route available to track down pesky Copilot errors. I guess this experience underlines once again that any text generated by an AI tool must be carefully checked and verified before it’s accepted.

AI-Generated Text Infects the Graph

But humans are humans. Some of us are very good at reading over AI-generated text to correct mistakes that might be present. Some of us are less good and might just accept what Copilot generates as accurate and useful information. The problem arises when AI-generated material that includes errors is stored in files in SharePoint Online or OneDrive for Business. (I’m more worried about material stored in SharePoint Online because it is shared more broadly than the personal files held in OneDrive).

When documents containing flawed AI-generated text infect the Graph, no one knows about the errors or where they originated. The polluted text becomes part of the corporate knowledge base. Errors are available to be recycled by Copilot again and again. In fact, because more documents are created containing the same errors over time, the feeling that the errors are fact becomes stronger because Copilot has more files to cite as sources. And if people don’t know that the text originated from Copilot, they’ll regard it as content written and checked by a human.

The Human Side

Humans make mistakes too. We try and eliminate errors as much as we can by asking co-workers to review text and check facts. Important documents might be reviewed several times to pick up and tease out issues prior to publication. At least, that’s what should happen.

The content of documents ages and can become less reliable over time. The digital debris accumulated in SharePoint Online and OneDrive for Business over years is equally likely to cajole Copilot into generating inaccurate or misleading content. Unless organizations manage old content over time, the quality of the results generated by Copilot is likely to degrade. To be fair to Microsoft, lots of work is happening in places like SharePoint Advanced Management to tackle aspects of the problem.

Protecting the Graph

I hear a lot about managing the access Copilot has to content by restricting search or blocking off individual documents. By comparison, little discussion happens about how to ensure the quality of information generated by users (with or without AI help) to prevent the pollution of the Microsoft Graph.

Perhaps we’re coming out of the initial excitement caused by thoughts about how AI could liberate users from mundane tasks to a period where we realize how AI must be controlled and mastered to extract maximum advantage. It’s hard to stop AI pollution creeping into the Microsoft Graph, but I think that this is a challenge that organizations should think about before the state of their Graph descends into chaos.


]]>
https://office365itpros.com/2024/10/18/copilot-errors-graph/feed/ 0 66738
Copilot Usage Report APIs Available https://office365itpros.com/2024/09/13/copilot-usage-report-api/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-usage-report-api https://office365itpros.com/2024/09/13/copilot-usage-report-api/#comments Fri, 13 Sep 2024 07:00:00 +0000 https://office365itpros.com/?p=66347

Copilot Usage Reports Weak on Detail

Announced in message center notification MC877369 (29 August 2024, Microsoft 365 roadmap item 396562), the Microsoft Graph beta usage reports API now includes support for Microsoft 365 Copilot tenant usage data. All tenants with Microsoft 365 Copilot licenses should now have access to the usage data.

Microsoft says that the availability of this information will “facilitate the creation of customized reporting and analytics,” but the fact is that the data exposed by the API is bare-bones. On the upside, the data matches what’s available in the report section of the Microsoft 365 admin center (Figure 1).

  • Tenant-level summary of Copilot-enabled (licensed) users and active users.
  • Adoption trend (tenant summary) over time.
  • Last activity date for Copilot interaction in different apps for each user.
Figure 1: Copilot usage reports in the Microsoft 365 admin center

Accounts accessing the Graph data must have a Copilot for Microsoft 365 license.

User Count Summary

The user count summary report returns a count of the user accounts licensed for Microsoft 365 Copilot (enabled users) and a count of the users with an active interaction with Copilot in each app during the reporting period (7, 30, 90, or 180 days). Unsurprisingly, when someone is enabled for Copilot in one app, they’re usually enabled for all:

  • Teams
  • Outlook (classic, new Outlook for Windows, OWA).
  • Word.
  • Excel.
  • PowerPoint.
  • Copilot Graph-grounded chat (aka Copilot Chat).
  • OneNote.
  • Loop.
$Uri = "https://graph.microsoft.com/beta/reports/getMicrosoft365CopilotUserCountSummary(period='D90')"
# Run the request with the Microsoft Graph PowerShell SDK and extract the per-app adoption data
$Data = Invoke-MgGraphRequest -Uri $Uri -Method Get
$Data.value.adoptionByProduct

Name                           Value
----                           -----
loopEnabledUsers               100
reportPeriod                   90
oneNoteActiveUsers             3
wordEnabledUsers               100
powerPointEnabledUsers         100
microsoftTeamsActiveUsers      97
oneNoteEnabledUsers            100
excelActiveUsers               43
loopActiveUsers                2
copilotChatEnabledUsers        100
outlookEnabledUsers            100
anyAppEnabledUsers             100
anyAppActiveUsers              97
microsoftTeamsEnabledUsers     100
excelEnabledUsers              100
wordActiveUsers                61
powerPointActiveUsers          12
copilotChatActiveUsers         73
outlookActiveUsers             18
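
A quick way to turn the summary into adoption percentages is to divide active users by enabled users per app. A sketch, assuming the property names shown in the output above:

# Calculate the percentage of enabled users who are active in each app
$Summary = $Data.value.adoptionByProduct
[PSCustomObject]@{
    Teams      = '{0:P0}' -f ($Summary.microsoftTeamsActiveUsers / $Summary.microsoftTeamsEnabledUsers)
    Word       = '{0:P0}' -f ($Summary.wordActiveUsers / $Summary.wordEnabledUsers)
    Excel      = '{0:P0}' -f ($Summary.excelActiveUsers / $Summary.excelEnabledUsers)
    PowerPoint = '{0:P0}' -f ($Summary.powerPointActiveUsers / $Summary.powerPointEnabledUsers)
    Outlook    = '{0:P0}' -f ($Summary.outlookActiveUsers / $Summary.outlookEnabledUsers)
    Chat       = '{0:P0}' -f ($Summary.copilotChatActiveUsers / $Summary.copilotChatEnabledUsers)
}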

User Activity Detail

The user activity detail report is the most interesting because it details the last activity date for Copilot interaction by users with each of the various Copilot-enabled apps. In addition, the last activity date for any Copilot interaction with any of the supported apps is published (lastActivityDate). An array (value) holds a separate usage report for each Copilot-enabled account.

The user principal name and display name are obfuscated if the tenant data privacy control is enabled. In the following extract, we see that the user has never used Copilot for OneNote and last used Copilot with PowerPoint on September 22, 2024:

$Uri = "https://graph.microsoft.com/beta/reports/getMicrosoft365CopilotUsageUserDetail(period='D90')"
# Fetch the per-user detail and display the first user's record as name/value pairs
$Data = Invoke-MgGraphRequest -Uri $Uri -Method Get
$Data.value[0] | Format-Table Name, Value -AutoSize

Name                                  Value
----                                  -----
copilotActivityUserDetailsByPeriod    {System.Collections.Hashtable}
reportRefreshDate                     2024-11-04
oneNoteCopilotLastActivityDate
loopCopilotLastActivityDate           2024-09-11
microsoftTeamsCopilotLastActivityDate 2024-09-27
powerPointCopilotLastActivityDate     2024-09-22
wordCopilotLastActivityDate           2024-10-29
outlookCopilotLastActivityDate        2024-10-09
excelCopilotLastActivityDate          2024-09-05
lastActivityDate                      2024-10-31
copilotChatLastActivityDate           2024-10-31
userPrincipalName                     Tony.Redmond@office365itpros.com
displayName                           Tony Redmond
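
One practical use of the detail report is to flag licensed accounts that show no recent Copilot activity and might be candidates for license reassignment. A sketch, assuming the property names shown above (the 60-day cut-off is arbitrary):

# Flag licensed accounts with no recorded Copilot activity in the last 60 days (or none at all)
$CutOff = (Get-Date).AddDays(-60)
$Data.value | Where-Object {
    [string]::IsNullOrEmpty($_.lastActivityDate) -or ([datetime]$_.lastActivityDate -lt $CutOff)
} | ForEach-Object {
    [PSCustomObject]@{ User = $_.userPrincipalName; LastActivity = $_.lastActivityDate }
} | Sort-Object User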

Adoption Trend over Time

This report returns an array called adoptionByDate with entries for each day during the reporting period (7, 30, 90, or 180 days). The purpose of the report is to track progress in Copilot adoption over time and to note if any specific action had an effect. For instance, you might run an education campaign to teach users how to generate effective results using Copilot in Excel. Over the weeks following the campaign, you’d expect to see the number of users who use Copilot in Excel grow.

$Uri = "https://graph.microsoft.com/beta/reports/getMicrosoft365CopilotUserCountTrend(period='D90')"
# Fetch the daily adoption trend for the reporting period
$Data = Invoke-MgGraphRequest -Uri $Uri -Method Get
$Data.Value.copilotActivityUserDetailsByPeriod

reportDate                     2024-06-17
excelEnabledUsers              100
wordActiveUsers                51
powerPointActiveUsers          11
copilotChatActiveUsers         66
outlookActiveUsers             15
loopEnabledUsers               100
oneNoteActiveUsers             1
wordEnabledUsers               100
powerPointEnabledUsers         100
microsoftTeamsActiveUsers      86
oneNoteEnabledUsers            1
excelActiveUsers               21
loopActiveUsers                1
copilotChatEnabledUsers        100
outlookEnabledUsers            100
anyAppEnabledUsers             100
anyAppActiveUsers              86
microsoftTeamsEnabledUsers     100
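
To chart the trend, it’s convenient to flatten the daily entries into a CSV file for Excel. A sketch, assuming the adoptionByDate array described above (the beta API property names may vary, so adjust them to match what the request actually returns in your tenant):

# Export selected daily adoption counts to CSV for charting
$TrendData = $Data.value.adoptionByDate     # Adjust the property name if the beta API returns something different
$TrendData | ForEach-Object {
    [PSCustomObject]@{
        Date        = $_.reportDate
        TeamsActive = $_.microsoftTeamsActiveUsers
        WordActive  = $_.wordActiveUsers
        ChatActive  = $_.copilotChatActiveUsers
    }
} | Export-Csv -Path CopilotAdoptionTrend.csv -NoTypeInformation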

Track Copilot Activity Using Audit Records instead of Copilot Usage Reports

Although it’s nice to have Copilot usage reports included in the Graph API, the information exposed isn’t very informative in terms of how people use Copilot. The data tells you that someone used Copilot in an app during a day. At least, they clicked a Copilot button. The information doesn’t reveal any more insight than that. Any enterprise that invests large sums of money in expensive Microsoft 365 Copilot licenses will find a dearth of detail here in terms of understanding whether the investment is justified. In many cases, you will be better off analyzing the audit records captured for Copilot interactions to figure out what’s really going on.
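
For example, here’s a sketch of fetching the Copilot interaction events from the unified audit log with the Exchange Online management module. I’m assuming the CopilotInteraction record type; the JSON in each record’s AuditData property holds the detail about where the interaction happened:

# Fetch audit events for Copilot interactions over the last 30 days
Connect-ExchangeOnline
[array]$Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-30) -EndDate (Get-Date) -RecordType CopilotInteraction -ResultSize 5000 -Formatted
# Summarize who interacted with Copilot and when; parse AuditData for more detail
$Records | ForEach-Object {
    [PSCustomObject]@{ When = $_.CreationDate; User = $_.UserIds; Operation = $_.Operations }
} | Sort-Object When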


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

]]>
https://office365itpros.com/2024/09/13/copilot-usage-report-api/feed/ 10 66347
Using Microsoft 365 Copilot for Word https://office365itpros.com/2023/12/14/copilot-for-word/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-for-word https://office365itpros.com/2023/12/14/copilot-for-word/#comments Thu, 14 Dec 2023 01:00:00 +0000 https://office365itpros.com/?p=62822

Copilot for Word Will Help Many Authors Create Better Text

As folks might know, I write quite a few articles about technical topics. Recently, I’ve had the assistance of Microsoft 365 Copilot in Word. Not because I felt the need for any help but rather in the spirit of discovering if Copilot lives up to its billing of ushering in “a new era of writing, leveraging the power of AI. It can help you go from a blank page to a finished document in a fraction of the time it would take to compose text on your own.”

Good technical articles tell a story. They start by introducing a topic and explaining why it’s of interest before progressing to a deeper discussion covering interesting facets of the topic. The final step is to reach a conclusion. Copilot for Word aims to help by assisting authors to structure their text, write concise sentences, and start drafting based on a prompt submitted by the author.

Starting Off with Copilot for Word

Writing the first few sentences can be the hardest part of an article. To help, Copilot for Word can generate text by responding to a user prompt. A prompt is how to tell Copilot what to do. It can be up to 2,000 characters.

Crafting good prompts is a skill, just like it is to build good keyword searches of the type used to find information with Google or another search engine. Figure 1 shows my first attempt at a prompt for this article.

Figure 1: Prompting Copilot for Word

I wasn’t happy with the content generated by Copilot because it read like the text of a marketing brochure. This isn’t altogether surprising given two facts. First, my prompt wasn’t precise enough. Second, generative AI tools like Copilot can only create text based on previous content. The response obviously originated from Microsoft marketing content that lauded the powers of Copilot.

A second attempt was more concise and precise (Figure 2) and produced more acceptable text (Figure 3).

Figure 2: Refining a prompt for Copilot for Word
Figure 3: The text generated by Copilot for Word

Although better, I would never use the text generated by Copilot. It has value (especially the last three points), but it’s just not my style. The point to remember is that Copilot supports refinement of its output through further prompts. The text shown in Figure 3 is the result of asking Copilot to “make the text more concise.”

Using Reference Documents

A prompt can include links (references) for up to three documents, which must be stored in a Microsoft 365 repository. Copilot uses references to “ground” the prompt with additional context to allow it to respond to prompts better. When starting to write about a new topic, you might not have a usable reference, but in many business situations there should be something that helps, such as a document relating to a project or customer. The prompt shown in Figure 4 asks Copilot to write an article about the January 2024 update for the Office 365 for IT Pros eBook and includes a reference document (an article about the December 2023 update).

Figure 4: Including a reference document in a Copilot for Word prompt

The generated text (Figure 5) follows the structure of the reference document and I have no complaints about the opening paragraph. Copilot even figured out that the January update is #103. The problems mount swiftly thereafter as Copilot’s generated text promises a new chapter on Microsoft Viva and an updated chapter on Copilot for Microsoft 365, neither of which exist. I also don’t know what the integration between Teams and Syntex refers to, and the new Teams Pro license is a predecessor of Teams Premium. Later, we’re told that Microsoft Lists will launch in February 2024. These are Copilot hallucinations.

Figure 5: Copilot generates an article about an Office 365 for IT Pros monthly update

This experience underlines the necessity to check everything generated by Copilot. You have no idea where Copilot might source information and whether that data is obsolete or simply just wrong. Tenants can limit Copilot’s range by preventing it from searching internet sources for information, but even the best corporate information stored in SharePoint Online or OneDrive for Business can contain errors (and often does).

Rewrites with Copilot for Word

Apart from generating text, Copilot for Word can rewrite text. Figure 6 shows a rewrite of the second paragraph from this article. The version generated by Copilot uses the “professional” style (the other styles are “neutral,” “casual,” “concise,” and “imaginative”).

Figure 6: Text rewritten by Copilot for Word

The two versions are reasonably close. I prefer mine because it’s written in my style, but the alternative is acceptable.

Rewrite is useful when reviewing someone else’s text. I often edit articles submitted to Practical365.com for publication. Because authors come from many countries, their level of English technical writing varies greatly. Being able to have Copilot rewrite text often helps me understand the true intent of an author.

The Usefulness of Copilot for Word

I’ve tried many different text proofing tools in Word, from the built-in ones like Microsoft Editor to external ones like Grammarly. They all have their pros and cons, and their own quirks. Copilot for Word is more user-friendly and intuitive than any existing tool. Provided users remember to check the generated text carefully, Copilot will help many people write better. The downside is the $30/user/month cost for Microsoft 365 Copilot licenses (currently, you can’t buy a Copilot license just for Word).

Microsoft 365 Copilot obviously covers much more than generating better text with Word. That being said, it’s nice that the integration of AI into one of the more venerable parts of Microsoft 365 works so well.

Summarizing Copilot for Word

It seems apt to close with the summary generated by Copilot for this article (Figure 7). Copilot summarizes documents by scanning the text to find the main ideas. What’s surprising in this text is the inclusion of ideas that are not in the document, such as “What Copilot for Word cannot do.” Copilot cites paragraphs five and six as the source, but neither paragraph mentions anything about weather or visuals, or that Copilot for Word is limited to outputting text in bullet points or paragraphs. This information must have come from the foundational LLMs used by Copilot.

Figure 7: Copilot summary of a document’s content

I’m sure Copilot included the information to be helpful, but it’s jarring to find the AI introducing new ideas in summaries. Oh well, this kind of thing gives people like me something to write about…


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

]]>
https://office365itpros.com/2023/12/14/copilot-for-word/feed/ 3 62822
Microsoft Details Compliance Support for Microsoft 365 Copilot https://office365itpros.com/2023/11/09/microsoft-365-copilot-compliance/?utm_source=rss&utm_medium=rss&utm_campaign=microsoft-365-copilot-compliance https://office365itpros.com/2023/11/09/microsoft-365-copilot-compliance/#comments Thu, 09 Nov 2023 01:00:00 +0000 https://office365itpros.com/?p=62342

Compliance through Sensitivity Labels, Audit Events, and Compliance Records

Now that the fuss around the general availability of Microsoft 365 Copilot (November 1) is fading, organizations face the harsh reality of deciding whether investing a minimum of $108,000 (300 Copilot licenses for a year) to test the effectiveness of an AI-based digital assistant is worthwhile. Before deploying any software, companies usually have a checklist to validate that the software is suitable for their users. The checklist might contain entries such as:

In MC686593 (updated 6 November, 2023), Microsoft addresses the last point by laying out how Purview compliance solutions support the deployment of Microsoft 365 Copilot. Rollout of the capabilities is due between now and mid-December 2023.

Sensitivity Labels Stop Microsoft 365 Copilot Using Content

Microsoft 365 Copilot depends on an abundance of user information stored in Microsoft 365 repositories like SharePoint Online and Exchange Online. Without information to set context and provide the source for answering user prompts, Copilot cannot work. The possibility that Copilot might include sensitive information in its output is real, and it’s good to know that Copilot respects the protection afforded by sensitivity labels. The rule is that if a sensitivity label applied to an item allows a user at least read access, its content is available to Copilot to use when responding to prompts from that user. If the label blocks access, Copilot can’t use the item’s content.

Figure 1: If the Confidential label allows Microsoft 365 Copilot to access the information, it can be used in responses

Audit Events Record Microsoft 365 Copilot Interactions

Recent changes in the Microsoft 365 unified audit log and the surrounding ecosystem have not been good. The Search-UnifiedAuditLog cmdlet doesn’t work as it once did, a factor that might impact the way organizations extract audit data for storage in their preferred SIEM. Some will not like the removal of the classic audit search from the Purview compliance portal in favor of the asynchronous background search feature. Both changes seem to be an attempt by Microsoft to reduce the resources consumed by audit searches. This tactic is perfectly acceptable if communicated to customers. The problem is the deafening silence from Microsoft.

On a positive note, the audit log will capture events for Copilot prompts from users and the responses generated by Copilot in a new Interacted with Copilot category. These events can be searched for and analyzed using the normal audit retrieval facilities.

Compliance Records for Microsoft 365 Copilot

The Microsoft 365 substrate captures Copilot prompts and responses and stores this information as compliance records in user mailboxes, just like the substrate captures compliance records for Teams chats. Microsoft 365 retention policies for Teams chats have been expanded to process the Copilot records. If you already have a policy set up for Teams chat, it processes Copilot records too (Figure 2).

Figure 2: Retention processing handles Microsoft 365 Copilot interactions along with Teams chats

Although it’s easier for Microsoft to combine processing for Teams chats and Copilot interactions, I can see some problems. For example, some organizations like to have very short retention periods for Teams chat messages (one day is the minimum). Will the same retention period work for Copilot interactions? It would obviously be better if separate policies processed the different data types. Perhaps this will happen in the future.
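
For tenants that don’t yet have a retention policy covering Teams chats, here’s a minimal sketch using Security & Compliance PowerShell. The policy and rule names and the retention period are examples only:

# Create a retention policy for Teams chats (the same policy processes Copilot interactions)
Connect-IPPSSession
New-RetentionCompliancePolicy -Name "Teams Chat and Copilot Retention" -TeamsChatLocation All
# Retain items for a year and then remove them
New-RetentionComplianceRule -Name "Teams Chat and Copilot Rule" -Policy "Teams Chat and Copilot Retention" -RetentionDuration 365 -RetentionComplianceAction KeepAndDelete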

Because the substrate captures Copilot interactions, the interactions are available for analysis by Communication Compliance policies. It should therefore be possible to discover if someone is using Copilot in an objectionable manner.

Block and Tackle Support for Microsoft 365 Copilot

None of this is earthshattering. SharePoint Online stores protected documents in the clear to support indexing, but it would be silly if Microsoft 365 Copilot could use protected documents in its responses. Gathering audit events treats Copilot like all the other workloads, and compliance records make sure that eDiscovery investigations can include Copilot interactions in their work. However, it’s nice that Microsoft has done the work to make sure that organizations can mark the compliance item on deployment checklists as complete.


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365.

]]>
https://office365itpros.com/2023/11/09/microsoft-365-copilot-compliance/feed/ 4 62342
Lessons About AI to Learn from Bing Chat Enterprise https://office365itpros.com/2023/10/05/bing-chat-enterprise-ai/?utm_source=rss&utm_medium=rss&utm_campaign=bing-chat-enterprise-ai https://office365itpros.com/2023/10/05/bing-chat-enterprise-ai/#comments Thu, 05 Oct 2023 01:00:00 +0000 https://office365itpros.com/?p=61792

Bing Chat Enterprise and its Place in the Copilot Spectrum

Microsoft published message center notification MC649341 in late August to inform eligible customers (with Microsoft 365 E3 and E5; A3 and A5 (faculty only); and Business Standard and Business Premium licenses) that they had enabled Bing Chat Enterprise in preview for their tenants. On September 21, Bing Chat Enterprise then featured in the announcement of General Availability for Microsoft 365 Copilot, when Microsoft listed Bing Chat Enterprise as one of the commercial SKU (product) line-up for Microsoft Copilot (Figure 1).

Figure 1: Bing Chat Enterprise within the Microsoft Copilot line-up

Bing Chat Enterprise is available to the same Microsoft 365 product SKUs as Microsoft 365 Copilot is (Microsoft 365 E3 and E5, Microsoft 365 Business Standard and Premium). When formally available, other customers can buy a Bing Chat Enterprise license for $5/user/month.

I didn’t pay too much attention to Bing Chat Enterprise when Microsoft made their big announcement because the details about Microsoft 365 Copilot are more interesting. Since then we’ve learned that Microsoft will require eligible customers to buy a minimum of 300 Copilot licenses and that all transactions must be approved by Microsoft sales. In other words, a Microsoft partner can’t go ahead and order 300 licenses for one of their customers without approval. Although unpopular with partners, this restriction and the minimum purchase requirement are likely to be short-term measures to allow Microsoft to ramp-up support and other capabilities for Copilot but they might frustrate smaller organizations.

For instance, Microsoft 365 Business Premium is an eligible SKU for Copilot but it tops out at 300 users. The current rule means that a customer running Microsoft 365 Business Premium must buy Copilot for everyone in their organization (costing $108,000 annually). I guess many organizations will wait for the initial rush to work through Microsoft systems before considering a Copilot deployment.

Managing Bing Chat Enterprise

Which brings me back to Bing Chat Enterprise (BCE). While you’re waiting for the mists to clear around Microsoft 365 Copilot, BCE is a good tool to educate users about how to interact with generative AI. BCE is like the Microsoft 365 Chat app that comes with Copilot. The big difference is that Microsoft 365 Chat has access to user data stored in Microsoft 365 repositories like SharePoint Online, Exchange Online, and Teams. BCE must make do with whatever Bing Search can find. However, the same kind of interactive prompting to find and refine information happens.

Microsoft has deployed a Bing Chat Enterprise service plan to user accounts with eligible licenses. This action is described in message center notification MC665935 (updated September 11) and replaces the original tenant on/off switch previously deployed (MC677230) through an online page. Microsoft plans to remove the tenant on/off switch soon and base user access to BCE exclusively on the service plan from November 2023.

The advantage of using a service plan is that administrators can selectively enable or disable BCE for accounts by either editing accounts through the Microsoft 365 admin center or with PowerShell by removing service plan identifier 0d0c0d31-fae7-41f2-b909-eaf4d7f26dba from accounts using the Set-MgUserLicense cmdlet. For example, this command removes the BCE service plan from an account with a Microsoft 365 E5 license:

# Service plan identifier for Bing Chat Enterprise
$DisabledServicePlan = @("0d0c0d31-fae7-41f2-b909-eaf4d7f26dba")
# Keep the Microsoft 365 E5 license assigned but disable the Bing Chat Enterprise service plan
Set-MgUserLicense -UserId Sean.Landy@Office365itpros.com -AddLicenses @{SkuId = "06ebc4ee-1bb5-47dd-8120-11324bc54e06"; DisabledPlans = $DisabledServicePlan} -RemoveLicenses @()
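
To find out which licensed accounts still have the service plan enabled before making changes, a sketch like this works with the Microsoft Graph PowerShell SDK (the filter simply restricts the scan to accounts holding at least one license):

# Report accounts where the Bing Chat Enterprise service plan is enabled
$BCEServicePlanId = "0d0c0d31-fae7-41f2-b909-eaf4d7f26dba"
[array]$Users = Get-MgUser -Filter "assignedLicenses/`$count ne 0" -ConsistencyLevel eventual -CountVariable Count -All
ForEach ($User in $Users) {
    $Plans = (Get-MgUserLicenseDetail -UserId $User.Id).ServicePlans
    If ($Plans | Where-Object {$_.ServicePlanId -eq $BCEServicePlanId -and $_.ProvisioningStatus -eq 'Success'}) {
        Write-Output ("{0} has Bing Chat Enterprise enabled" -f $User.UserPrincipalName)
    }
}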

Learning from Bing Chat Enterprise

Learning how to prompt AI tools for answers is a key skill for users to acquire. Microsoft has a nice write-up on the subject where executives give examples of how they use Copilot and include the Copilot Lab to help users acquire knowledge about prompting. However, as we know from queries given to search engines, many never move past the simplest query, and if that happens with Copilot, there’s little chance that people will be satisfied with the results.

Interacting with BCE to find and refine answers to questions is good practice for Copilot. Sure, Copilot prompts will be different because they can reference documents and other items stored in Microsoft 365 and direct that the output should be in a specific form, like an email, but the principle behind conversational interrogation remains the same.

For example, I asked BCE to generate a PowerShell script to check that an account already had a specific license before attempting to assign the license. The first response used cmdlets from the now-deprecated and non-functioning Microsoft Online Services module. I asked BCE to try again, this time using cmdlets from the Microsoft Graph PowerShell SDK. Figure 2 shows the response.

Figure 2: Using Bing Chat Enterprise to write PowerShell

The script code looks like it should work except that it won’t. The command to pipe a variable to the Update-MgUser cmdlet will fail horribly because the SDK does not currently support piping. It’s one of the SDK foibles that Microsoft is working on to fix.
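
To illustrate the point, here’s the kind of pattern that trips people up (the property update is a made-up example):

# Piping an object into an SDK cmdlet fails because the cmdlets don't accept piped input
Get-MgUser -UserId Sean.Landy@Office365itpros.com | Update-MgUser -JobTitle "Director"
# Instead, fetch the user and pass the identifier explicitly
$User = Get-MgUser -UserId Sean.Landy@Office365itpros.com
Update-MgUser -UserId $User.Id -JobTitle "Director"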

AI can make things up (“hallucinations”), but in this instance BCE based its answer on Microsoft documentation and contributions to the well-respected and chock-full-of-knowledge StackOverflow site.

The learning for users is to never accept what AI produces without checking the generated answer first to be sure that it is correct and answers the original question, even if the cited sources seem impeccable. Maintaining a healthy level of scepticism about AI generated text is essential because it’s possible that someone would prompt Copilot for some information, see what looks like good information coming back, and email that information without thinking that it could be wrong, contain sensitive content, or is otherwise inappropriate to share.

Learning with AI

We’re at the start of what could be a transformational phase in how we deal with Office information. Good as the technology might be at the start, it’s going to take time for people to master driving AI to do the right things. Rubbish in equals rubbish out. AI just makes rubbish generation faster, if you allow it to happen.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft Makes Microsoft 365 Copilot Generally Available https://office365itpros.com/2023/09/22/microsoft-365-copilot-ga/ Fri, 22 Sep 2023

Enterprise Customers Can Buy Microsoft 365 Copilot on November 1, 2023


Originally unveiled last March and then put through a testing program involving 600 customers (who paid a substantial amount for the privilege), Microsoft 365 Copilot will be generally available for enterprise customers on November 1, 2023, as Microsoft announced on September 21. Although Microsoft didn't expand on what they mean by "enterprise customers," I'm sure that Copilot will be available for tenants running the two "eligible" SKUs targeted at small businesses (Microsoft 365 Business Standard and Business Premium). This page covers Copilot for the SME segment.

Time to Prepare Budgets

After checking their IT budgets to see if they can find the funds necessary to upgrade to one of the eligible products and then pay the hefty $30/user per month charge for Copilot, interested customers can contact Microsoft sales to buy licenses.
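
To put that charge in context, licensing 1,000 users works out at $30 × 12 × 1,000 = $360,000 per year, before counting any upgrades needed to reach an eligible Microsoft 365 SKU.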

The agenda for this week’s The Experts Conference (TEC) event included several sessions about using artificial intelligence with Microsoft 365. Interestingly, when polled, none of the conference attendees indicated that their companies were interested in deploying Copilot. Cost is a big issue, but so is the work necessary to prepare tenants for Copilot, including user training and support. For more information, see the Microsoft 365 Copilot overview page.

The lack of interest at TEC might be misleading. Software becomes more interesting when it's generally available and companies can learn from other customers' real-life scenarios to understand how to justify the spend. It's also true that the Microsoft sales force hasn't yet gone into high gear to sell Copilot. Now that a general availability date is known, that pressure can be expected to increase.

Copilot Lab the Most Interesting Part of Announcement

When I talk about Copilot, I refer to it as an inexperienced artificial assistant that needs a lot of coaching to achieve good results. Users provide coaching through the prompts they input to tell Copilot what to do. Good prompts that are concise and provide context are much more likely to generate what the user wants than fuzzy requests for help.

The average user is not an expert in prompt formulation. Even after 25 years of using Google search, many struggle to construct focused search terms. The same is true for people searching for information within a tenant using Microsoft Search. Some know how to use document metadata to find exactly what they want. Others rely on being able to find items using document titles.

Without good prompts, Microsoft 365 Copilot will fail utterly. The AI cannot read user minds to understand what someone really wants. It’s got to be told, and it’s got to be told with a level of precision that might surprise.

All of which means that the announcement of Copilot Lab is a really good idea. Essentially, Copilot Lab is a learning ground for people to discover how to construct effective prompts (Figure 1), including being able to share prompts that they create.

Figure 1: Copilot Lab (from Microsoft video)

The implementation seems very much like the way that Power Apps allows users to create apps from a library of templates. Anyone facing into a new technology appreciates some help to get over the initial learning hurdle, and that's the help I expect Copilot Lab to provide.

Microsoft Copilot Chat

The other new part of the Microsoft 365 Copilot ecosystem is a chat application that looks very much like Bing Chat Enterprise (Figure 2). The big difference is that Microsoft Copilot Chat has access to information stored in Microsoft 365 repositories, like SharePoint Online, that is available to the signed-in user. Microsoft 365 Chat is available through https://www.microsoft365.com/copilot and in Teams chat.

Figure 2: Microsoft 365 Chat (from Microsoft video)

The Monarch Issue

Another issue raised at TEC was Microsoft's insistence that the Outlook Monarch client is the only version that will support Copilot. While it's true that Microsoft wants customers to move to the new Outlook, user resistance is palpable and could become a barrier to adoption. Although there's value to be gained by Copilot summarizing notes from a Teams meeting or creating a Word document or PowerPoint presentation based on existing content, many people still organize their working life around Outlook. And that's Outlook classic, not a web-based version that's still missing functionality like offline access (coming soon, or so I hear).

If Microsoft really wanted to, I think they could create an OWA Powered Experience (OPX)-based plug-in for Outlook classic (like the Room Finder) to integrate Copilot. Where there’s a will, there’s a way. In this instance, the will seems to be missing. And that’s just a little sad.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft Removes Reuse Files Feature from Word https://office365itpros.com/2023/08/31/reuse-files-word/ Thu, 31 Aug 2023

Perhaps an Indication that Copilot Does a Better Job?

When I read message center notification MC668802 (18 Aug 2023), the thought went through my mind that Microsoft's intention to retire the Reuse Files feature in Word might be a reflection of their focus on Copilot for Microsoft 365.

Starting in August 2023, users won't see the Reuse Files option in the Word ribbon. However, they can still search for and use the feature. When launched, Reuse Files uses Graph API calls to find documents that it thinks the user might want to copy content from, or include a link to, in the current file (Figure 1).

Figure 1: Reuse Files feature in Word

When the feature was introduced in late 2020, I thought that the idea of being able to build new documents by reusing previous work was a good one. However, Microsoft says that by January 2024, they will remove all traces of the Reuse Files feature from Word. Microsoft didn't say anything about the availability of Reuse Files in Outlook (for Windows). Nor did they say if the Reuse Slides feature in PowerPoint will disappear sometime in the future.

Improving Your Subscription by Removing Reuse Files

In MC668802, Microsoft says that they are “committed to improving your Microsoft 365 subscription” and “we occasionally remove features and benefits that duplicate equivalent offerings.”

The comment about duplicating equivalent offerings is what brings me to Copilot. It can be argued that the Reuse Files feature could be replicated by simply opening a Word document and copying text from it into your file. The difference is intelligence. The Reuse Files feature uses Graph API requests to find files that the app thinks might be of use. Unfortunately, the initial set of files that it lists is usually just the last set of files that you've worked on, and the files found when you enter a search term don't always seem to match the request.
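
Microsoft doesn't say exactly which Graph requests Reuse Files issues, but the document insights API gives a flavor of the kind of lookup involved. A minimal sketch using the Microsoft Graph PowerShell SDK (the permission scope is my assumption):

# Sketch: list documents the signed-in user recently used, similar in spirit
# to the suggestions surfaced by features like Reuse Files
Connect-MgGraph -Scopes "Sites.Read.All"
$Data = Invoke-MgGraphRequest -Method GET -Uri "https://graph.microsoft.com/v1.0/me/insights/used"
$Data.value | ForEach-Object {
   [PSCustomObject]@{
      Title        = $_.resourceVisualization.title
      Type         = $_.resourceVisualization.type
      LastAccessed = $_.lastUsed.lastAccessedDateTime
   }
}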

At $30/user/month (plus an eligible Microsoft 365 subscription), Microsoft 365 Copilot is expensive. The required investment makes it imperative that organizations select with care those allowed to use Copilot, even if you believe the hype that users only need to extract a couple of dollars of value from Copilot to offset its cost. But what we know of Copilot to date is that it applies a lot of artificial intelligence technology to find information to respond to user prompts (queries). In addition, tenants that use Copilot have a semantic index to help find appropriate information. That's something which doesn't exist in normal tenants.

Perhaps Microsoft is removing “AI Lite” features like Reuse Files from the playing field to give Copilot a clear run. Put another way, not having features like Reuse Files in the Microsoft 365 apps emphasizes the usefulness and capabilities of Copilot for Microsoft 365.

Maybe an Innocuous Decision

It’s entirely possible that I am reading too much into an innocuous decision by Microsoft to remove a feature that isn’t used very much. Microsoft might have decided that the engineering effort required to maintain and support the Reuse Files feature isn’t worth it because of low usage (or because the feature really isn’t very good). After all, if users don’t know about a feature, they won’t use it (OWA search refiners might be another example).

Only Microsoft knows, and they cloud the decision in words that make it seem that the removal of the Reuse Files feature is for our own good. Maybe it is. Who knows?

Clearing the Deck

Microsoft removes relatively few features from Microsoft 365. Clutter is one example, replaced by Outlook’s Focused Inbox. It’s nice to think that Microsoft removes items to improve our subscriptions. I suspect that the truth is that feature removals clear the deck and make it easier for Microsoft rather than users.


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365.

Microsoft Prepares Partners for Microsoft 365 Copilot https://office365itpros.com/2023/08/25/microsoft-365-copilot-partners/ Fri, 25 Aug 2023

Get Software, Prompts, and Content Right to Make Microsoft 365 Copilot Work

Ever since Microsoft announced Copilot for Microsoft 365 last March, I've spent time learning about concepts like generative AI to better understand the technology. I've also tracked Microsoft's announcements to interpret their messaging about Copilot and analyzed the costs organizations face to adopt Copilot. Given the hefty licensing costs, I've reflected on how organizations might go about deciding who should get Copilot. You could say that I've thought about the topic.

Which brings me to a Microsoft partner session delivered yesterday about preparing for Microsoft 365 Copilot. I wrote on this theme last June, so I wanted to hear the public messages Microsoft gives to its partners to use in customer engagements.

Get the Right Software

Mostly, I didn’t learn anything new, but I did hear three messages receive considerable emphasis. The first is that customers need the right software to run Microsoft 365 Copilot. Tenants need:

  • Microsoft 365 Apps for enterprise.
  • Outlook Monarch.
  • Microsoft Loop.
  • Microsoft 365 Business Standard, Business Premium, E3, or E5.

Apart from mentioning the semantic index, nothing was said to explain the focus on Microsoft 365 SKUs. The semantic index preprocesses information in a tenant to make it more consumable by Copilot. For instance, the semantic index creates a custom dictionary of terms used in the organization and document excerpts to help answer queries. The idea is that the semantic index helps to refine ("ground") user queries ("prompts") before they are processed by the LLM.

Nice as the semantic index is, there's nothing in the selected Microsoft 365 SKUs to make those SKUs amenable to the semantic index. Microsoft has simply selected those SKUs as the ones to support Copilot. It's a way to drive customers to upgrade from Office 365 to Microsoft 365, just like Microsoft insists that customers use Outlook Monarch instead of the traditional Outlook desktop client.

Mastering Prompts

Quite a lot of time was spent discussing the interaction between users and Copilot. Like searching with Google or Bing, the prompts given to Copilot should be as specific as possible (Figure 1).

Figure 1: Constructing a Copilot prompt in Word (source: Microsoft)

It’s rather like assigning a task to a human assistant. Prompts are written in natural language and should:

  • Be precise and detailed.
  • Include context (for instance, documents that Copilot should include in its processing).
  • Define what output is expected (and what format – like a presentation or document).

The aim is to avoid the need for Copilot to interpret (guess) what the user wants. A human assistant might know what their boss wants based on previous experience and insight gained over time, but Copilot needs those precise instructions to know what to do.

Constructing good prompts is a skill that users will need to build. Given that many people today struggle with Google searches twenty years after Google became synonymous with looking for something, it’s not hard to understand how people might find it difficult to coax Copilot to do their bidding, even if Copilot is patient and willing to accept and process iterative instructions until it gets things right.

Microsoft 365 Copilot is different to other variants like those for Security and GitHub that are targeted at specific professionals. A programmer, for instance, has a good idea of the kind of assistance they want to write code and the acid test of what GitHub Copilot generates is whether the code works (or even compiles). It’s harder to apply such a black and white test for documents.

The Quality of Content

Microsoft talks about Copilot consuming “rich data sets.” This is code for the information that users store in Microsoft 365 workloads like Exchange Online, Teams, SharePoint Online, OneDrive for Business, and Loop. Essentially, if you don’t have information that Microsoft Search can find, Copilot won’t be able to use it. Documents stored on local or shared network drives are inaccessible, for instance.

All of this makes sense. Between the semantic index and Graph queries to retrieve information from workloads, Copilot has a sporting chance of being able to answer user prompts. Of course, if the information stored in SharePoint Online and other workloads is inaccurate or misleading, the results will be the same. But if the information is accurate and precise, you can expect good results.

This leads me to think about the quality of information stored in Microsoft 365 workloads. I store everything in Microsoft 365 and wonder how many flaws Copilot will reveal. I look at how coworkers store information and wonder even more. Remember, Copilot can use any information it can find through Microsoft Search (including external data enabled through Graph connectors), which underlines the need to provide good guidance in the prompts given to Copilot. Letting Copilot do its own thing based on anything it can find might not be a great strategy to follow.
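
One practical way to gauge what Copilot has to work with is to run the same kind of query through the Microsoft Search (Graph) API that grounding depends on. A minimal sketch, where the query string and delegated permissions are my assumptions:

# Sketch: see what Microsoft Search returns for the signed-in user, which is
# roughly the pool of content available to ground Copilot prompts
Connect-MgGraph -Scopes "Files.Read.All", "Sites.Read.All"
$Body = @{
   requests = @(
      @{
         entityTypes = @("driveItem")
         query       = @{ queryString = "contract renewal terms" }
         from        = 0
         size        = 10
      }
   )
} | ConvertTo-Json -Depth 5
$Results = Invoke-MgGraphRequest -Method POST -Uri "https://graph.microsoft.com/v1.0/search/query" -Body $Body
$Results.value.hitsContainers.hits | ForEach-Object { $_.resource.name }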

Lots Still to Learn

Microsoft 365 Copilot is still in private preview (at a stunning $100K fee charged to participating customers). Until the software gets much closer to general availability, I suspect that we’ll have more questions than answers when it comes to figuring out how to deploy, use, manage, and control Copilot in the wild. We still have lots to learn.

If you’re in Atlanta for The Experts Conference (September 19-20), be sure to attend my session on Making Generative AI Work for Microsoft 365 when I’ll debate the issues mentioned here along with others. TEC includes lots of other great sessions, including a Mary-Jo Foley keynote about “Microsoft’s Priorities vs. Customer Priorities: Will the Two Ever Meet?” TEC is always a great conference. Come along and be amused (or is that educated?)


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft Launches Simplified Sharing for Microsoft 365 Apps https://office365itpros.com/2023/08/04/simplified-sharing-experience/ Fri, 04 Aug 2023

Making Sharing of Files and Folders Easier

Apart from Microsoft 365 roadmap item 124933, I can't find a formal announcement about the Simplified Sharing Experience, but I have been aware that Microsoft recently updated the share dialog used by Microsoft 365 apps to make it easier and more straightforward to use. According to a LinkedIn post (Figure 1), Microsoft ran an A/B experiment to test the new dialog. I guess I was one of the testers! In any case, the new sharing dialog is now available in all Microsoft 365 tenants. Users of OneDrive consumer will see the upgraded dialog in the second half of 2023.

Figure 1: Microsoft spreads the news about the simplified sharing experience

The Role of the Share Dialog

The share dialog is what people see when they share a document or folder with others inside or outside their organization. According to Microsoft, the dialog is used over 800 million times monthly across 52 different Microsoft 365 experiences (desktop, browser, and mobile). In other words, Microsoft 365 apps offer users the opportunity to share in 52 different places across the suite. The most common of the experiences are likely in SharePoint Online, OneDrive for Windows, and Teams.

Microsoft says that they focused on creating a dialog that makes it simpler for users to perform core sharing tasks. When someone invokes the new screen (Figure 2) to share a file or folder, they see a simpler layout pre-populated with the default sharing link as specified by the tenant or site policy (in this case, the sharing link allows access to people within the organization). The name of the sensitivity label assigned to the document is also shown to provide a visual indicator about its relative confidentiality.

Figure 2: The revamped sharing link dialog

To complete the process, add the people to notify, enter a note to tell them what to do, and then click Send to deliver the invitation by email, or Copy link to copy the sharing link to the clipboard.

If you need to change the type of sharing link, select the cogwheel to expose the link settings (Figure 3). Again, everything is very straightforward and simple. If you choose a link that allows external sharing, I’m told that the new design “makes users more comfortable with sharing.” I’m not quite sure what this means, but any of the sharing that I’ve done with people outside the organization has worked smoothly.

Figure 3: Editing the setting for a sharing link
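
The link type that the dialog pre-selects comes from tenant and site policy. As a rough sketch, those defaults can be inspected or changed with the SharePoint Online Management Shell (the URLs are examples for an imaginary tenant):

# Sketch: check the tenant default sharing link type and override it for one site
Connect-SPOService -Url https://contoso-admin.sharepoint.com
Get-SPOTenant | Select-Object DefaultSharingLinkType, SharingCapability
Set-SPOSite -Identity https://contoso.sharepoint.com/sites/Projects -DefaultSharingLinkType Internal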

Microsoft has also overhauled the Manage access dialog to help people manage the set of users and groups that have access to a file or folder (Figure 4).

Figure 4: The revamped manage access dialog

Microsoft says that customer feedback about the new dialog is very positive. It’s worth noting that this is not the first time that Microsoft has revamped the sharing dialog. The last major overhaul was in 2020-21 when Microsoft rationalized on a common sharing dialog for all apps, notably for Teams.

The Importance of Sharing

Getting sharing right is clearly important. When Microsoft launched the Delve app in 2015, it resulted in a crescendo of protest from tenants who suddenly found that Delve suggested documents to users when the organization thought that Delve should not. Of course, the software did nothing wrong. Delve respected the access rights given to users when it computed the set of interesting documents to suggest (using an early version of Graph document insights). The problem was entirely down to poor management and access control, often at the level of complete SharePoint Online sites. Users might not have realized that they had access to the documents in poorly-protected sites, but software can’t be blamed if it goes looking for documents to suggest to a user and finds some that are available.

We're heading for a similar situation with Microsoft 365 Copilot. The Copilot software depends on finding information with Graph queries to help satisfy user prompts. Like Delve, Copilot will find files that are available to the user who prompts for help, and the results generated for the user might include some confidential material. And if the user doesn't bother to check the content generated by Copilot, that information might then be shared with people who shouldn't have it. This is the danger of oversharing, and it's certainly an issue that organizations contemplating Microsoft 365 Copilot need to resolve before implementation.

Simplified Sharing Experience One Step Along the Path

The new sharing dialog won’t solve oversharing. It’s just one step along the path to help users share information with the right people in the right way.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
