Microsoft 365 Copilot – Office 365 for IT Pros (https://office365itpros.com) – Mastering Office 365 and Microsoft 365

Using Microsoft 365 Copilot for Word
Published 14 December 2023 – https://office365itpros.com/2023/12/14/copilot-for-word/

Copilot for Word Will Help Many Authors Create Better Text

As folks might know, I write quite a few articles about technical topics. Recently, I’ve had the assistance of Microsoft 365 Copilot in Word. Not because I felt the need for any help but rather in the spirit of discovering if Copilot lives up to its billing of ushering in “a new era of writing, leveraging the power of AI. It can help you go from a blank page to a finished document in a fraction of the time it would take to compose text on your own.”

Good technical articles tell a story. They start by introducing a topic and explaining why it’s of interest before progressing to a deeper discussion covering interesting facets of the topic. The final step is to reach a conclusion. Copilot for Word aims to help by assisting authors to structure their text, write concise sentences, and start drafting based on a prompt submitted by the author.

Starting Off with Copilot for Word

Writing the first few sentences can be the hardest part of an article. To help, Copilot for Word can generate text in response to a user prompt. A prompt is the instruction that tells Copilot what to do, and it can be up to 2,000 characters long.

Crafting good prompts is a skill, just like building good keyword searches of the type used to find information with Google or another search engine. Figure 1 shows my first attempt at a prompt for this article.

Prompting Copilot for Word.
Figure 1: Prompting Copilot for Word

I wasn’t happy with the content generated by Copilot because it read like the text of a marketing brochure. This isn’t altogether surprising given two facts. First, my prompt wasn’t precise enough. Second, generative AI tools like Copilot can only create text based on previous content. The response obviously originated from Microsoft marketing content that lauded the powers of Copilot.

A second attempt was more concise and precise (Figure 2) and produced more acceptable text (Figure 3).

Refining a prompt for Copilot for Word.
Figure 2: Refining a prompt for Copilot for Word
The text generated by Copilot for Word.
Figure 3: The text generated by Copilot for Word

Although better, I would never use the text generated by Copilot. It has value (especially the last three points), but it’s just not my style. The point to remember is that Copilot supports refinement of its output through further prompts. The text shown in Figure 3 is the result of asking Copilot to “make the text more concise.”

Using Reference Documents

A prompt can include links (references) for up to three documents, which must be stored in a Microsoft 365 repository. Copilot uses references to “ground” the prompt with additional context to allow it to respond to prompts better. When starting to write about a new topic, you might not have a usable reference, but in many business situations there should be something that helps, such as a document relating to a project or customer. The prompt shown in Figure 4 asks Copilot to write an article about the January 2024 update for the Office 365 for IT Pros eBook and includes a reference document (an article about the December 2023 update).

Including a reference document in a Copilot for Word prompt
Figure 4: Including a reference document in a Copilot for Word prompt

The generated text (Figure 5) follows the structure of the reference document, and I have no complaints about the opening paragraph. Copilot even figured out that the January update is #103. The problems mount swiftly thereafter as Copilot’s generated text promises a new chapter on Microsoft Viva and an updated chapter on Copilot for Microsoft 365, neither of which exists. I also don’t know what the integration between Teams and Syntex refers to, and the new Teams Pro license is a predecessor of Teams Premium. Later, we’re told that Microsoft Lists will launch in February 2024. These are Copilot hallucinations.

Copilot generates an article about an Office 365 for IT Pros monthly update.
Figure 5: Copilot generates an article about an Office 365 for IT Pros monthly update

This experience underlines the necessity to check everything generated by Copilot. You have no idea where Copilot might source information and whether that data is obsolete or simply just wrong. Tenants can limit Copilot’s range by preventing it from searching internet sources for information, but even the best corporate information stored in SharePoint Online or OneDrive for Business can contain errors (and often does).

Rewrites with Copilot for Word

Apart from generating text, Copilot for Word can rewrite text. Figure 6 shows a rewrite of the second paragraph from this article. The version generated by Copilot uses the “professional” style (the other styles are “neutral,” “casual,” “concise,” and “imaginative”).

Text rewritten by Copilot for Word.
Figure 6: Text rewritten by Copilot for Word

The two versions are reasonably close. I prefer mine because it’s written in my style, but the alternative is acceptable.

Rewrite is useful when reviewing someone else’s text. I often edit articles submitted to Practical365.com for publication. Because authors come from many countries, their level of English technical writing varies greatly. Being able to have Copilot rewrite text often helps me understand the true intent of an author.

The Usefulness of Copilot for Word

I’ve tried many different text proofing tools in Word, from built-in ones like Microsoft Editor to external ones like Grammarly. They all have their pros and cons, and their own quirks. Copilot for Word is more user-friendly and intuitive than any existing tool. If users remember to check the generated text carefully, Copilot will help many people write better. The downside is the $30/user/month cost for Microsoft 365 Copilot licenses (currently, you can’t buy a Copilot license just for Word).

Microsoft 365 Copilot obviously covers much more than generating better text with Word. That being said, it’s nice that the integration of AI into one of the more venerable parts of Microsoft 365 works so well.

Summarizing Copilot for Word

It seems apt to close with the summary generated by Copilot for this article (Figure 7). Copilot summarizes documents by scanning the text to find the main ideas. What’s surprising in this text is the inclusion of ideas that are not in the document, such as “What Copilot for Word cannot do.” Copilot cites paragraphs five and six as the source, but neither paragraph mentions anything about weather or visuals, or that Copilot for Word is limited to outputting text in bullet points or paragraphs. This information must have come from the foundational LLMs used by Copilot.

Copilot summary of a document's content.
Figure 7: Copilot summary of a document’s content

I’m sure Copilot included the information to be helpful but it’s jarring to find the AI introducing new ideas in summaries. Oh well, this kind of stuff gives people like me stuff to write about…


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft Details Compliance Support for Microsoft 365 Copilot
Published 9 November 2023 – https://office365itpros.com/2023/11/09/microsoft-365-copilot-compliance/

Compliance through Sensitivity Labels, Audit Events, and Compliance Records

Now that the fuss around the general availability of Microsoft 365 Copilot (November 1) is fading, organizations face the harsh reality of deciding whether investing a minimum of $108,000 (300 Copilot licenses for a year) to test the effectiveness of an AI-based digital assistant is worthwhile. Before deploying any software, companies usually have a checklist to validate that the software is suitable for their users. The checklist might contain entries such as:

In MC686593 (updated 6 November 2023), Microsoft addresses the last point by laying out how Purview compliance solutions support the deployment of Microsoft 365 Copilot. Rollout of the capabilities is due between now and mid-December 2023.

Sensitivity Labels Stop Microsoft 365 Copilot Using Content

Microsoft 365 Copilot depends on an abundance of user information stored in Microsoft 365 repositories like SharePoint Online and Exchange Online. Without information to set context and provide the source for answering user prompts, Copilot cannot work. The possibility that Copilot might include sensitive information in its output is real, and it’s good to know that Copilot respects the protection afforded by sensitivity labels. The rule is that if a sensitivity label applied to an item allows a user at least read access, its content is available to Copilot to use when responding to prompts from that user. If the label blocks access, Copilot can’t use the item’s content.

If the Confidential label allows Microsoft 365 Copilot to access the information, it can be used in responses
Figure 1: If the Confidential label allows Microsoft 365 Copilot to access the information, it can be used in responses

Audit Events Record Microsoft 365 Copilot Interactions

Recent changes in the Microsoft 365 unified audit log and the surrounding ecosystem have not been good. The Search-UnifiedAuditLog cmdlet doesn’t work as it once did, a factor that might impact the way organizations extract audit data for storage in their preferred SIEM. Some will not like the removal of the classic audit search from the Purview compliance portal in favor of the asynchronous background search feature. Both changes seem to be an attempt by Microsoft to reduce the resources consumed by audit searches. This tactic is perfectly acceptable if communicated to customers. The problem is the deafening silence from Microsoft.

On a positive note, the audit log will capture events for Copilot prompts from users and the responses generated by Copilot in a new Interacted with Copilot category. These events can be searched for and analyzed using the normal audit retrieval facilities.
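As a sketch of how an administrator might retrieve these events with the Exchange Online management module (the CopilotInteraction record type name is an assumption based on Microsoft’s description of the new category; check the actual value in your tenant):

# Connect to Exchange Online first: Connect-ExchangeOnline
# Search the unified audit log for Copilot interaction events from the last week
$Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) -RecordType CopilotInteraction -ResultSize 1000 -Formatted
# Each record's AuditData property holds JSON describing the prompt and response context
$Records | Select-Object CreationDate, UserIds, Operations

From there, the AuditData payload can be parsed with ConvertFrom-Json and fed to whatever SIEM or reporting process the organization already uses for audit data.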

Compliance Records for Microsoft 365 Copilot

The Microsoft 365 substrate captures Copilot prompts and responses and stores this information as compliance records in user mailboxes, just like the substrate captures compliance records for Teams chats. Microsoft 365 retention policies for Teams chats have been expanded to process the Copilot records. If you already have a policy set up for Teams chat, it processes Copilot records too (Figure 2).

Retention processing handles Microsoft 365 Copilot interactions along with Teams chats
Figure 2: Retention processing handles Microsoft 365 Copilot interactions along with Teams chats

Although it’s easier for Microsoft to combine processing for Teams chats and Copilot interactions, I can see some problems. For example, some organizations like to have very short retention periods for Teams chat messages (one day is the minimum). Will the same retention period work for Copilot interactions? It would obviously be better if separate policies processed the different data types. Perhaps this will happen in the future.

Because the substrate captures Copilot interactions, the interactions are available for analysis by Communication Compliance policies. It should therefore be possible to discover if someone is using Copilot in an objectionable manner.

Block and Tackle Support for Microsoft 365 Copilot

None of this is earthshattering. SharePoint Online stores protected documents in the clear to support indexing, but it would be silly if Microsoft 365 Copilot could use protected documents in its responses. Gathering audit events treats Copilot like all the other workloads, and compliance records make sure that eDiscovery investigations can include Copilot interactions in their work. However, it’s nice that Microsoft has done the work to make sure that organizations can mark the compliance item on deployment checklists as complete.


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365.

Lessons About AI to Learn from Bing Chat Enterprise
Published 5 October 2023 – https://office365itpros.com/2023/10/05/bing-chat-enterprise-ai/

Bing Chat Enterprise and its Place in the Copilot Spectrum

Microsoft published message center notification MC649341 in late August to inform eligible customers (with Microsoft 365 E3 and E5; A3 and A5 (faculty only); and Business Standard and Business Premium licenses) that they had enabled Bing Chat Enterprise in preview for their tenants. On September 21, Bing Chat Enterprise then featured in the announcement of General Availability for Microsoft 365 Copilot, when Microsoft listed Bing Chat Enterprise as one of the commercial SKU (product) line-up for Microsoft Copilot (Figure 1).

Bing Chat Enterprise within the Microsoft Copilot line-up
Figure 1: Bing Chat Enterprise within the Microsoft Copilot line-up

Bing Chat Enterprise is available to the same Microsoft 365 product SKUs as Microsoft 365 Copilot is (Microsoft 365 E3 and E5, Microsoft 365 Business Standard and Premium). When formally available, other customers can buy a Bing Chat Enterprise license for $5/user/month.

I didn’t pay too much attention to Bing Chat Enterprise when Microsoft made their big announcement because the details about Microsoft 365 Copilot are more interesting. Since then, we’ve learned that Microsoft will require eligible customers to buy a minimum of 300 Copilot licenses and that all transactions must be approved by Microsoft sales. In other words, a Microsoft partner can’t go ahead and order 300 licenses for one of their customers without approval. Although unpopular with partners, this restriction and the minimum purchase requirement are likely short-term measures to allow Microsoft to ramp up support and other capabilities for Copilot, but they might frustrate smaller organizations.

For instance, Microsoft 365 Business Premium is an eligible SKU for Copilot but it tops out at 300 users. The current rule means that a customer running Microsoft 365 Business Premium must buy Copilot for everyone in their organization (costing $108,000 annually). I guess many organizations will wait for the initial rush to work through Microsoft systems before considering a Copilot deployment.

Managing Bing Chat Enterprise

Which brings me back to Bing Chat Enterprise (BCE). While you’re waiting for the mists to clear around Microsoft 365 Copilot, BCE is a good tool to educate users about how to interact with generative AI. BCE is like the Microsoft 365 Chat app that comes with Copilot. The big difference is that Microsoft 365 Chat has access to user data stored in Microsoft 365 repositories like SharePoint Online, Exchange Online, and Teams. BCE must make do with whatever Bing Search can find. However, the same kind of interactive prompting to find and refine information happens.

Microsoft has deployed a Bing Chat Enterprise service plan to user accounts with eligible licenses. This action is described in message center notification MC665935 (updated September 11) and replaces the original tenant on/off switch previously deployed (MC677230) through an online page. Microsoft plans to remove the tenant on/off switch soon and base user access to BCE exclusively on the service plan from November 2023.

The advantage of using a service plan is that administrators can selectively enable or disable BCE for accounts by either editing accounts through the Microsoft 365 admin center or with PowerShell by removing service plan identifier 0d0c0d31-fae7-41f2-b909-eaf4d7f26dba from accounts using the Set-MgUserLicense cmdlet. For example, this command removes the BCE service plan from an account with a Microsoft 365 E5 license:

# Service plan identifier for Bing Chat Enterprise
$DisabledServicePlan = @("0d0c0d31-fae7-41f2-b909-eaf4d7f26dba")
# Reassign the Microsoft 365 E5 SKU with the BCE service plan disabled
Set-MgUserLicense -UserId Sean.Landy@Office365itpros.com -AddLicenses @{SkuId = "06ebc4ee-1bb5-47dd-8120-11324bc54e06"; DisabledPlans = $DisabledServicePlan} -RemoveLicenses @()
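To confirm the change took effect, a quick check of the account’s service plans works. This is a sketch using the same example account; the service plan should report a disabled provisioning status after the update:

# List the BCE service plan entry for the account after the update
Get-MgUserLicenseDetail -UserId Sean.Landy@Office365itpros.com |
    Select-Object -ExpandProperty ServicePlans |
    Where-Object { $_.ServicePlanId -eq '0d0c0d31-fae7-41f2-b909-eaf4d7f26dba' }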

Learning from Bing Chat Enterprise

Learning how to prompt AI tools for answers is a key skill for users to acquire. Microsoft has a nice write-up on the subject where executives give examples of how they use Copilot and include the Copilot Lab to help users acquire knowledge about prompting. However, as we know from queries given to search engines, many never move past the simplest query, and if that happens with Copilot, there’s little chance that people will be satisfied with the results.

Interacting with BCE to find and refine answers to questions is good practice for Copilot. Sure, Copilot prompts will be different because they can reference documents and other items stored in Microsoft 365 and direct that the output should be in a specific form, like an email, but the principle behind conversational interrogation remains the same.

For example, I asked BCE to generate a PowerShell script to check that an account already had a specific license before attempting to assign the license. The first response used cmdlets from the now-deprecated and non-functioning Microsoft Online Services module. I asked BCE to try again, this time using cmdlets from the Microsoft Graph PowerShell SDK. Figure 2 shows the response.

Using Bing Chat Enterprise to write PowerShell
Figure 2: Using Bing Chat Enterprise to write PowerShell

The script code looks like it should work except that it won’t. The command to pipe a variable to the Update-MgUser cmdlet will fail horribly because the SDK does not currently support piping. It’s one of the SDK foibles that Microsoft is working to fix.
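For anyone who hits the same problem, the workaround is straightforward: capture the object in a variable and pass its identifier explicitly instead of piping. A sketch, using a hypothetical account and property:

# This fails because the SDK doesn't support piping into Update-MgUser:
#   Get-MgUser -UserId 'jane@contoso.com' | Update-MgUser -JobTitle 'Senior Engineer'

# Workaround: fetch the user, then pass -UserId explicitly
$User = Get-MgUser -UserId 'jane@contoso.com'
Update-MgUser -UserId $User.Id -JobTitle 'Senior Engineer'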

AI can make things up (“hallucinations”), but in this instance BCE based its answer on Microsoft documentation and contributions to the well-respected and chock-full-of-knowledge StackOverflow site.

The learning for users is to never accept what AI produces without first checking that the generated answer is correct and addresses the original question, even if the cited sources seem impeccable. Maintaining a healthy level of scepticism about AI-generated text is essential. It’s all too easy for someone to prompt Copilot for information, see what looks like a good response, and email it without considering that it could be wrong, contain sensitive content, or be otherwise inappropriate to share.

Learning with AI

We’re at the start of what could be a transformational phase in how we deal with Office information. Good as the technology might be at the start, it’s going to take time for people to master driving AI to do the right things. Rubbish in equals rubbish out. AI just makes rubbish generation faster, if you allow it to happen.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft Makes Microsoft 365 Copilot Generally Available
Published 22 September 2023 – https://office365itpros.com/2023/09/22/microsoft-365-copilot-ga/

Enterprise Customers Can Buy Microsoft 365 Copilot on November 1, 2023


Originally unveiled last March and then put through a testing program involving 600 customers (who paid a substantial amount for the privilege), Microsoft announced (September 21) that Microsoft 365 Copilot will be generally available for enterprise customers on November 1, 2023. Although they didn’t expand on what they mean by “enterprise customers,” I’m sure that Copilot will be available for tenants running the two “eligible” SKUs targeted at small businesses (Microsoft 365 Business Standard and Business Premium). This page covers Copilot for the SME segment.

Time to Prepare Budgets

After checking their IT budgets to see if they can find the funds necessary to upgrade to one of the eligible products and then pay the hefty $30/user per month charge for Copilot, interested customers can contact Microsoft sales to buy licenses.

The agenda for this week’s The Experts Conference (TEC) event included several sessions about using artificial intelligence with Microsoft 365. Interestingly, when polled, none of the conference attendees indicated that their companies were interested in deploying Copilot. Cost is a big issue, but so is the work necessary to prepare tenants for Copilot, including user training and support. For more information, see the Microsoft 365 Copilot overview page.

The lack of interest at TEC might be misleading. For instance, software is more interesting when it’s available and companies can learn about real-life scenarios from other customers to understand how to justify the spend. It’s also true that the Microsoft sales force hasn’t yet gone into high gear to sell Copilot. Now that a general availability date is known, that pressure can be expected to increase.

Copilot Lab the Most Interesting Part of Announcement

When I talk about Copilot, I refer to it as an inexperienced artificial assistant that needs a lot of coaching to achieve good results. Users provide coaching through the prompts they input to tell Copilot what to do. Good prompts that are concise and provide context are much more likely to generate what the user wants than fuzzy requests for help.

The average user is not an expert in prompt formulation. Even after 25 years of using Google search, many struggle to construct focused search terms. The same is true for people searching for information within a tenant using Microsoft Search. Some know how to use document metadata to find exactly what they want. Others rely on being able to find items using document titles.

Without good prompts, Microsoft 365 Copilot will fail utterly. The AI cannot read user minds to understand what someone really wants. It’s got to be told, and it’s got to be told with a level of precision that might surprise.

All of which means that the announcement of Copilot Lab is a really good idea. Essentially, Copilot Lab is a learning ground for people to discover how to construct effective prompts (Figure 1), including being able to share prompts that they create.

Copilot Lab (from Microsoft video)
Figure 1: Copilot Lab (from Microsoft video)

The implementation seems very like the way that Power Apps allows users to create apps from a library of templates. Anyone facing into a new technology appreciates some help to get over the initial learning hurdle, and that’s what I expect Copilot Lab will do.

Microsoft Copilot Chat

The other new part of the Microsoft 365 Copilot ecosystem is a chat application that looks very much like Bing Chat Enterprise (Figure 2). The big difference is that Microsoft 365 Chat has access to information stored in Microsoft 365 repositories like SharePoint Online that are available to the signed-in user. Microsoft 365 Chat is available through https://www.microsoft365.com/copilot and in Teams chat.

Microsoft 365 Chat (from Microsoft video)
Figure 2: Microsoft 365 Chat (from Microsoft video)

The Monarch Issue

Another issue raised at TEC was the insistence Microsoft has that the Outlook Monarch client is the only version that will support Copilot. While it’s true that Microsoft wants customers to move to the new Outlook, user resistance is palpable and could become a barrier to adoption. Although there’s value to be gained by Copilot summarizing notes from a Teams meeting or creating a Word document or PowerPoint presentation based on existing content, many people still organize their working life around Outlook. And that’s Outlook classic, not a web-based version that’s still missing functionality like offline access (coming soon, or so I hear).

If Microsoft really wanted to, I think they could create an OWA Powered Experience (OPX)-based plug-in for Outlook classic (like the Room Finder) to integrate Copilot. Where there’s a will, there’s a way. In this instance, the will seems to be missing. And that’s just a little sad.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft Removes Reuse Files Feature from Word
Published 31 August 2023 – https://office365itpros.com/2023/08/31/reuse-files-word/

Perhaps an Indication that Copilot Does a Better Job?

When I read message center notification MC668802 (18 Aug 2023), the thought went through my mind that Microsoft’s intention to retire the Reuse Files in Word feature might be a reflection of their focus on Copilot for Microsoft 365.

Starting in August 2023, users won’t see the Reuse Files option in the Word ribbon. However, you can still search for and use the feature. When you launch Reuse Files, Word uses Graph API calls to find documents that it thinks you might want to copy content from or include a link to in your current file (Figure 1).

Reuse Files feature in Word
Figure 1: Reuse Files feature in Word

Introduced in late 2020, the Reuse Files feature seemed like a good idea: building new documents by reusing work previously done. However, Microsoft says that by January 2024 they will remove all traces of the Reuse Files feature from Word. Microsoft didn’t say anything about the availability of Reuse Files in Outlook (for Windows), nor whether the Reuse Slides feature in PowerPoint will disappear sometime in the future.

Improving Your Subscription by Removing Reuse Files

In MC668802, Microsoft says that they are “committed to improving your Microsoft 365 subscription” and “we occasionally remove features and benefits that duplicate equivalent offerings.”

The comment about duplicating equivalent offerings is what brings me to Copilot. It can be argued that the Reuse Files feature could be replicated by simply opening a Word document and copying text from it into your file. The difference is intelligence. The Reuse Files feature uses Graph API requests to find files that the app thinks might be of use. Unfortunately, the initial set of files that it lists is usually just the last set of files that you’ve worked on, and the files found when you enter a search term don’t always seem to match the request.

At $30/user/month (plus an eligible Microsoft 365 subscription), Microsoft 365 Copilot is expensive. The required investment makes it imperative that organizations select those allowed to use Copilot with care, even if you believe the hype that users only need to get a couple of dollars value from using Copilot to offset its cost. But what we know of Copilot to date is that it applies a lot of artificial intelligence technology to find information to respond to user prompts (queries). In addition, tenants that use Copilot have a semantic index to help find appropriate information. That’s something which doesn’t exist in normal tenants.

Perhaps Microsoft is removing “AI Lite” features like Reuse Files from the playing field to give Copilot a clear run. Put another way, not having features like Reuse Files in the Microsoft 365 apps emphasizes the usefulness and capabilities of Copilot for Microsoft 365.

Maybe an Innocuous Decision

It’s entirely possible that I am reading too much into an innocuous decision by Microsoft to remove a feature that isn’t used very much. Microsoft might have decided that the engineering effort required to maintain and support the Reuse Files feature isn’t worth it because of low usage (or because the feature really isn’t very good). After all, if users don’t know about a feature, they won’t use it (OWA search refiners might be another example).

Only Microsoft knows, and they cloud the decision in words that make it seem that the removal of the Reuse Files feature is for our own good. Maybe it is. Who knows?

Clearing the Deck

Microsoft removes relatively few features from Microsoft 365. Clutter is one example, replaced by Outlook’s Focused Inbox. It’s nice to think that Microsoft removes items to improve our subscriptions. I suspect that the truth is that feature removals clear the deck and make it easier for Microsoft rather than users.


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365.

Microsoft Prepares Partners for Microsoft 365 Copilot
Published 25 August 2023 – https://office365itpros.com/2023/08/25/microsoft-365-copilot-partners/

Get Software, Prompts, and Content Right to Make Microsoft 365 Copilot Work

Ever since Microsoft announced Copilot for Microsoft 365 last March, I’ve spent time learning about concepts like generative AI to better understand the technology. I’ve also tracked Microsoft’s announcements to interpret their messaging about Copilot and analyzed the costs organizations face to adopt Copilot. Given the hefty licensing costs, I’ve reflected on how organizations might go about deciding who should get Copilot. You could say that I’ve thought about the topic.

Which brings me to a Microsoft partner session delivered yesterday about preparing for Microsoft 365 Copilot. I wrote on this theme last June, so wanted to hear the public messages Microsoft gives to its partners to use in customer engagements.

Get the Right Software

Mostly, I didn’t learn anything new, but I did hear three messages receive considerable emphasis. The first is that customers need the right software to run Microsoft 365 Copilot. Tenants need:

  • Microsoft 365 apps for enterprise.
  • Outlook Monarch.
  • Microsoft Loop.
  • Microsoft 365 Business Standard, Business Premium, E3, or E5.

Apart from mentioning the semantic index, nothing was said to explain the focus on Microsoft 365 SKUs. The semantic index preprocesses information in a tenant to make it more consumable by Copilot. For instance, the semantic index creates a custom dictionary of terms used in the organization and document excerpts to help answer queries. The idea is that the semantic index helps to refine (“ground”) user queries (“prompts”) before they are processed by the LLM.

Nice as the semantic index is, there’s nothing in the selected Microsoft 365 SKUs to make those SKUs amenable to the semantic index. Microsoft has simply selected those SKUs as the ones to support Copilot. It’s a way to drive customers to upgrade from Office 365 to Microsoft 365, just like Microsoft insists that customers use Outlook Monarch instead of the traditional Outlook desktop client.

Mastering Prompts

Quite a lot of time was spent discussing the interaction between users and Copilot. Like searching with Google or Bing, the prompts given to Copilot should be as specific as possible (Figure 1).

Figure 1: Constructing a Copilot prompt in Word (source: Microsoft)

It’s rather like assigning a task to a human assistant. Prompts are written in natural language and should:

  • Be precise and detailed.
  • Include context (for instance, documents that Copilot should include in its processing).
  • Define what output is expected (and what format – like a presentation or document).

The aim is to avoid the need for Copilot to interpret (guess) what the user wants. A human assistant might know what their boss wants based on previous experience and insight gained over time, but Copilot needs those precise instructions to know what to do.
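To make the three guidelines concrete, here is a minimal sketch of how a prompt might be assembled from a precise goal, grounding context, and an expected output format. The helper function, file names, and wording are purely illustrative, not part of any Copilot API.

```python
# Hypothetical helper that assembles a Copilot-style prompt from the three
# elements discussed above: a precise goal, grounding context, and the
# expected output format. All names here are illustrative assumptions.

def build_prompt(goal: str, context_files: list, output_format: str) -> str:
    """Combine goal, context documents, and output expectations into one prompt."""
    context = ", ".join(context_files) if context_files else "no specific documents"
    return (
        f"{goal} "
        f"Base your answer on the following documents: {context}. "
        f"Return the result as {output_format}."
    )

prompt = build_prompt(
    goal="Summarize the Q3 sales review and list the three main risks.",
    context_files=["Q3-Sales-Review.docx", "Pipeline-Forecast.xlsx"],
    output_format="a one-page Word document with a bulleted risk list",
)
print(prompt)
```

The point of structuring prompts this way is that every element Copilot would otherwise have to guess (scope, sources, output shape) is stated explicitly.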

Constructing good prompts is a skill that users will need to build. Given that many people today struggle with Google searches twenty years after Google became synonymous with looking for something, it’s not hard to understand how people might find it difficult to coax Copilot to do their bidding, even if Copilot is patient and willing to accept and process iterative instructions until it gets things right.

Microsoft 365 Copilot is different from other variants like those for Security and GitHub that are targeted at specific professionals. A programmer, for instance, has a good idea of the kind of assistance they want to write code, and the acid test of what GitHub Copilot generates is whether the code works (or even compiles). It’s harder to apply such a black-and-white test to documents.

The Quality of Content

Microsoft talks about Copilot consuming “rich data sets.” This is code for the information that users store in Microsoft 365 workloads like Exchange Online, Teams, SharePoint Online, OneDrive for Business, and Loop. Essentially, if you don’t have information that Microsoft Search can find, Copilot won’t be able to use it. Documents stored on local or shared network drives are inaccessible, for instance.

All of this makes sense. Between the semantic index and Graph queries to retrieve information from workloads, Copilot has a sporting chance of being able to answer user prompts. Of course, if the information stored in SharePoint Online and other workloads is inaccurate or misleading, the results will be too. But if the information is accurate and precise, you can expect good results.

This leads me to think about the quality of information stored in Microsoft 365 workloads. I store everything in Microsoft 365 and wonder how many flaws Copilot will reveal. I look at how coworkers store information and wonder even more. Remember, Copilot can use any information it can find through Microsoft Search (including external data enabled through Graph connectors), which underlines the need to provide good guidance in the prompts given to Copilot. Letting Copilot do its own thing based on anything it can find might not be a great strategy to follow.
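The retrieval described above surfaces through the Microsoft Graph search API (POST /v1.0/search/query), which can query content across workloads, including external data brought in through Graph connectors. This sketch builds the documented request body; actually sending it needs an authenticated Graph client, which is omitted here, and the query string is a made-up example.

```python
import json

# Sketch of a Microsoft Graph search request body, following the shape of
# the documented POST https://graph.microsoft.com/v1.0/search/query API.
# Authentication and the HTTP call itself are deliberately omitted.

GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"

def build_search_request(query_string: str, entity_types=("driveItem",), size=5):
    """Build the JSON body for a Graph search across Microsoft 365 content."""
    return {
        "requests": [
            {
                "entityTypes": list(entity_types),   # e.g. driveItem, listItem, message
                "query": {"queryString": query_string},
                "size": size,                         # hits to return per page
            }
        ]
    }

body = build_search_request("contoso pricing proposal", ("driveItem", "listItem"))
print(json.dumps(body, indent=2))
```

Anything this kind of query cannot reach, such as files on local or network drives, is equally invisible to Copilot.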

Lots Still to Learn

Microsoft 365 Copilot is still in private preview (at a stunning $100K fee charged to participating customers). Until the software gets much closer to general availability, I suspect that we’ll have more questions than answers when it comes to figuring out how to deploy, use, manage, and control Copilot in the wild. We still have lots to learn.

If you’re in Atlanta for The Experts Conference (September 19-20), be sure to attend my session on Making Generative AI Work for Microsoft 365, where I’ll debate the issues mentioned here along with others. TEC includes lots of other great sessions, including a Mary-Jo Foley keynote about “Microsoft’s Priorities vs. Customer Priorities: Will the Two Ever Meet?” TEC is always a great conference. Come along and be amused (or is that educated?).


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft Launches Simplified Sharing for Microsoft 365 Apps – https://office365itpros.com/2023/08/04/simplified-sharing-experience/ – Fri, 04 Aug 2023

Making Sharing of Files and Folders Easier

Apart from Microsoft 365 roadmap item 124933, I can’t find a formal announcement about the Simplified Sharing Experience, but Microsoft has recently updated the share dialog used by Microsoft 365 apps to make it easier to use. According to a LinkedIn post (Figure 1), Microsoft ran an A/B experiment to test the new dialog. I guess I was one of the testers! In any case, the new sharing dialog is now available in all Microsoft 365 tenants. Users of OneDrive consumer will see the upgraded dialog in the second half of 2023.

Figure 1: Microsoft spreads the news about the simplified sharing experience

The Role of the Share Dialog

The share dialog is what people see when they share a document or folder with others inside or outside their organization. According to Microsoft, the dialog is used over 800 million times monthly across 52 different Microsoft 365 experiences (desktop, browser, and mobile). In other words, Microsoft 365 apps offer users the opportunity to share in 52 different places across the suite. The most common of the experiences are likely in SharePoint Online, OneDrive for Windows, and Teams.

Microsoft says that they focused on creating a dialog that makes it simpler for users to perform core sharing tasks. When someone invokes the new screen (Figure 2) to share a file or folder, they see a simpler layout pre-populated with the default sharing link as specified by the tenant or site policy (in this case, the sharing link allows access to people within the organization). The name of the sensitivity label assigned to the document is also shown to provide a visual indicator about its relative confidentiality.

Figure 2: The revamped sharing link dialog

To complete the process, add the people to notify, enter a note telling them what to do, and click Send to deliver the message by email, or Copy link to copy the sharing link to the clipboard.

If you need to change the type of sharing link, select the cogwheel to expose the link settings (Figure 3). Again, everything is very straightforward and simple. If you choose a link that allows external sharing, I’m told that the new design “makes users more comfortable with sharing.” I’m not quite sure what this means, but any of the sharing that I’ve done with people outside the organization has worked smoothly.
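For context, the link settings exposed in the dialog map onto the same options available programmatically through the Microsoft Graph createLink API (POST /drive/items/{item-id}/createLink). This sketch builds the request body only; the item id, authentication, and the HTTP call are omitted, and the validation helper is my own illustration rather than part of Graph.

```python
# Sketch of a request body for the Microsoft Graph createLink API
# (POST /drive/items/{item-id}/createLink). The validation sets mirror the
# documented values; the helper itself is illustrative, not a Graph SDK call.

VALID_TYPES = {"view", "edit", "embed"}
VALID_SCOPES = {"anonymous", "organization", "users"}

def sharing_link_body(link_type: str = "view", scope: str = "organization") -> dict:
    """Build a createLink body matching a chosen sharing link type and scope."""
    if link_type not in VALID_TYPES:
        raise ValueError(f"unsupported link type: {link_type}")
    if scope not in VALID_SCOPES:
        raise ValueError(f"unsupported scope: {scope}")
    return {"type": link_type, "scope": scope}

# Default: a view link for people inside the organization, matching the
# tenant-default link pre-populated in the dialog shown in Figure 2.
body = sharing_link_body()
print(body)
```

The dialog is effectively choosing between these same type and scope combinations, constrained by tenant and site policy.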

Figure 3: Editing the setting for a sharing link

Microsoft has also overhauled the Manage access dialog to help people manage the set of users and groups that have access to a file or folder (Figure 4).

Figure 4: The revamped manage access dialog

Microsoft says that customer feedback about the new dialog is very positive. It’s worth noting that this is not the first time Microsoft has revamped the sharing dialog. The last major overhaul was in 2020-21, when Microsoft standardized on a common sharing dialog for all apps, notably Teams.

The Importance of Sharing

Getting sharing right is clearly important. When Microsoft launched the Delve app in 2015, it resulted in a crescendo of protest from tenants who suddenly found that Delve suggested documents to users when the organization thought that Delve should not. Of course, the software did nothing wrong. Delve respected the access rights given to users when it computed the set of interesting documents to suggest (using an early version of Graph document insights). The problem was entirely down to poor management and access control, often at the level of complete SharePoint Online sites. Users might not have realized that they had access to the documents in poorly-protected sites, but software can’t be blamed if it goes looking for documents to suggest to a user and finds some that are available.

We’re heading for a similar situation with Microsoft 365 Copilot. Copilot depends on finding information with Graph queries to help satisfy user prompts. Like Delve, Copilot will find files that are available to the user who prompts for help, and the results generated for the user might include confidential information. And if the user doesn’t check the content generated by Copilot, that information might then be shared with people who shouldn’t have it. This is the danger of oversharing, and it’s an issue that organizations contemplating Microsoft 365 Copilot need to resolve before implementation.

Simplified Sharing Experience One Step Along the Path

The new sharing dialog won’t solve oversharing. It’s just one step along the path to help users share information with the right people in the right way.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
