Borrowing a tad from Slack (which has had the ability to upload custom emojis for years), message center notification MC795750 (updated 31 May 2024, Microsoft 365 roadmap item 80659) announces that Teams users will soon be able to add custom emojis and reactions by uploading image (PNG) or GIF files. Once uploaded, custom emojis are accessible to everyone in the tenant, which can support a maximum of 5,000 custom emojis.
Microsoft plans to make the feature available to targeted release tenants in late June 2024. General availability will follow in early July 2024 with GCC High and DoD tenants getting custom emojis in August 2024.
The ability to upload custom emojis is controlled by the CreateCustomEmojis setting in Teams messaging policies. Microsoft plans to ship the feature enabled, meaning that the setting should be True in all messaging policies. There will also be a setting in the Teams admin cenrer to disable or enable custom emojis tenant-wide.
Here’s how to use the Get-CsTeamsMessagingPolicy cmdlet from the MicrosoftTeams PowerShell module to check the values for the CreateCustomEmojis (create and upload new emojis) and DeleteCustomEmojis (delete custom emojis) settings.
Get-CsTeamsMessagingPolicy | Format-Table identity, *emojis* Identity CreateCustomEmojis DeleteCustomEmojis -------- ------------------ ------------------ Global False False Tag:Advanced True False Tag:Advanced Users True False Tag:Restricted - No Chat True False
You need the latest version of the MicrosoftTeams module to manage custom emojis.
To turn custom emojis off, run the Set-CsTeamsMessagingPolicy cmdlet to update messaging policies. In this example, custom emojis are disabled for any account assigned the Advanced messaging policy.
Set-CsTeamsMessagingPolicy -Identity Advanced -CreateCustomEmojis $false -DeleteCustomEmojis $false
Teams admin center receives an update in June (Figure 1) to allow administrators to manage the emoji settings in messaging policies without using PowerShell. Global and Teams administrators can delete custom emojis no matter what the messaging policy assigned to their account dictates.
To add a custom emoji, open the emoji and reactions menu and select the custom category (to the far right side of the other categories). If your account is allowed to add a custom emoji, you’ll see a plus sign. Click the plus sign to select the file for the new emoji. Only PNG and GIF files are supported. I took a photo from a recent trip to Disney World featuring a certain mouse and edited it to isolate the mouse character. I then saved the file as a PNG. Microsoft doesn’t say if the file should be under a certain size, but I took no chances and made sure that it was less than a megabyte. I uploaded the file and Teams invited me to name the emoji (Figure 2). You can see in the preview how the emoji will look in different situations.
Guest accounts cannot add a custom emoji. However, they can use the custom emojis created by tenant members. Seeing the custom emojis in a host tenant gives an interesting insight into the culture of that organization (Figure 3).
Once uploaded, custom emojis become available to all users and show up in the custom section. Users granted the ability to remove custom emojis can select and delete emojis from the same place (Figure 4).
Microsoft says that it can take up to 24 hours for a deleted emoji to disappear.
On May 31, 2024, Microsoft updated MC795750 to say that the custom emojis feature will not come to organizations with education licenses. I think this is a reasonable decision. There’s no doubt that teachers have better things to do than keep an eye out for inapproptiate emojis appearing in chats and channels.
In the corporate world, based on experience with Slack, it’s probable that organizations will see a surprising array of custom emojis appear after users discover that this capability exists (and they will, and fast). Some custom emojis will be marvelously witty; others will be scandalous and offensive. With up to five thousand custom emojis per tenant, there’s lots of room to experiment with all sorts of images. Let the games commence.
So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.
]]>No sooner had I published the article about creating dynamic administrative units with PowerShell, the first email arrived asking if the same was possible for dynamic Microsoft 365 groups. The answer is “of course,” but with the caveat that it’s not just a matter of some minor updates to the script.
That being said, the outline for the script to create dynamic groups is broadly the same:
Let’s examine some of the steps.
Here’s an example of creating a new dynamic Microsoft 365 group for the department whose name is stored in the $Dept variable:
Write-Host ("Checking groups for department {0}" -f $Dept) $Description = ("Dynamic Microsoft 365 group created for the {0} department on {1}" -f $Dept, (Get-Date)) $DisplayName = ("{0} Dynamic group" -f $Dept) $MailNickName = ("Dynamic.{0}.Group" -f ($Dept -replace " ","")) $MembershipRule = '(User.Department -eq "' + $Dept +'")' If ($DisplayName -in $Groups.DisplayName) { Write-Host ("Group already exists for {0}" -f $Dept) -ForegroundColor Red } Else { # Create the new dynamic Microsoft 365 Group $NewGroup = New-MgGroup -DisplayName $DisplayName -Description $Description ` -MailEnabled:$True -SecurityEnabled:$False ` -MailNickname $MailNickName -GroupTypes "DynamicMembership", "Unified" ` -MembershipRule $MembershipRule -MembershipRuleProcessingState "On" }
Flushed with the successful creation, you might want to rush to team-enable the new group. However, it’s best to wait 10-15 seconds before proceeding to allow Teams to learn about the new group from Entra ID. If you attempt to team-enable a group immediately after creation, you’ll probably see an error like this:
Failed to execute Templates backend request CreateTeamFromGroupWithTemplateRequest. Request Url: https://teams.microsoft.com/fabric/emea/templates/api/groups/bab7a3a8-2e30-4996-9405-48ca395b99c6/team, Request Method: PUT, Response Status Code: NotFound, Response Headers: Strict-Transport-Security: max-age=2592000 x-operationid: a228258204c3466dbd64c4d88373a416 x-telemetryid: 00-a228258204c3466dbd64c4d88373a416-82a9b5015f332574-01 X-MSEdge-Ref: Ref A: FC01DAADBD0D4A1A9ECBB9826707CC17 Ref B: DB3EDGE2518 Ref C: 2023-10-04T15:00:51Z Date: Wed, 04 Oct 2023 15:00:52 GMT ErrorMessage : {"errors":[{"message":"Failed to execute GetGroupMembersMezzoCountAsync.","errorCode":"Unknown"}],"operationId":"a228258204c3466dbd64c4d88373a416"}
To team-enable a group, run the New-MgTeam cmdlet and provide a hash table containing information to allow Teams to find the new group (the Graph URI for the group) plus the Teams template to use. This code does the trick.
$GroupUri = "https://graph.microsoft.com/v1.0/groups('" + $NewGroup.Id + "')" $NewTeamParams = @{ "template@odata.bind"="https://graph.microsoft.com/v1.0/teamsTemplates('standard')" "group@odata.bind"="$($GroupUri)" } $NewTeam = New-MgTeam -BodyParameter $NewTeamParams If ($NewTeam) { Write-Host ("Successfully team-enabled the {0}" -f $NewGroup.DisplayName) }
Figure 1 shows some of the dynamic Microsoft 365 groups created in my tenant. Note the groups for “Information Technology” and the “IT Department.” Obviously my checking of user departments was deficient prior to running the script. The fix is easy though. Decide on which department name to use and update user accounts to have that. Then remove the now-obsolete group. Entra ID will make sure that the accounts with reassigned departments show up in the correct group membership.
In this case, only one account had “IT Department,” so I quickly updated its department property with:
Update-MgUser -UserId Jack.Smith@office365itpros.com -Department "Information Technology"
I then removed the IT Department dynamic group:
$Group = Get-MgGroup -Filter "displayName eq 'IT Department Dynamic Group'" Remove-MgGroup -GroupId $Group.Id
Soon afterwards, the membership of the Information Department Dynamic group was correct (Figure 2) and all was well.
You can download the complete script from GitHub. It would be easy to adapt the code to run as an Azure Automation runbook to scan for new departments and create groups as necessary.
Scripting the creation of dynamic Microsoft 365 groups for each department in a tenant isn’t too difficult. The membership rule is simple but could be expanded to include different criteria. Once the groups are created, they should be self-maintaining. That is, if you make sure that the department property for user accounts is accurate.
Learn how to exploit the data available to Microsoft 365 tenant administrators through the Office 365 for IT Pros eBook. We love figuring out how things like dynamic Microsoft 365 groups work.
]]>I wrote about using dynamic Entra ID administrative units earlier this year. Not much has changed since then as the feature remains in preview, but an interesting question asked about creating dynamic administrative units with PowerShell. I could have referred the questioner to Microsoft’s documentation, but its examples feature cmdlets from the soon-to-be-deprecated Azure AD module. An example using the Microsoft Graph PowerShell SDK seems like a better idea, so that’s what I cover here.
The question asked about using a CSV file containing department names with the idea of creating a separate dynamic administrative unit for each department. Using CSV files is an effective way of driving scripts, but if the tenant directory is accurate and maintained, it’s easy to extract a list of departments from user accounts.
The steps in a script to create a dynamic administrative unit per department are as follows:
Here’s the code used to create a new administrative unit:
$Description = ("Dynamic administrative unit created for the {0} department created {1}" -f $Department, (Get-Date)) $DisplayName = ("{0} dynamic administrative unit" -f $Department) If ($DisplayName -in $CurrentAUs.DisplayName) { Write-Host ("Administrative unit already exists for {0}" -f $DisplayName) } Else { # Create the new AU $NewAUParameters = @{ displayName = $DisplayName description = $Description isMemberManagementRestricted = $false } $NewAdminUnit = (New-MgBetaAdministrativeUnit -BodyParameter $NewAUParameters) }
And here’s the code to transform it into a dynamic administrative unit:
$MembershipRule = '(user.department -eq "' + $Department + '" -and user.usertype -eq "member")' # Create hash table with the parameters $UpdateAUParameters = @{ membershipType = "Dynamic" membershipRuleProcessingState = "On" membershipRule = $MembershipRule } Try { Update-MgBetaAdministrativeUnit -AdministrativeUnitId $NewAdminUnit.Id -BodyParameter $UpdateAUParameters } Catch { Write-Host ("Error updating {0} with dynamie properties" -f $NewAdminUnit.DisplayName ) } Write-Host ("Created dynamic administrative unit for the {0} department called {1}" -f $Department, $NewAdminUnit.DisplayName)
Figure 1 shows the properties of a dynamic administrative unit created by the script, which you can download from GitHub.
The membership rule determines the membership of a dynamic administrative unit. Although you can construct filters to use with the Get-MgUser cmdlet to find licensed user accounts belonging to a department, the same flexibility doesn’t exist for the rules used to interrogate Entra ID to find members for a dynamic administrative unit (or dynamic Microsoft 365 group).
The problem is that membership rules don’t allow you to mix properties of different types. For instance, the rule can find user accounts belonging to a department (a string property), but it can’t combine that clause with a check against the assignedLicenses property to make sure that the account is licensed. That’s because assignedLicenses is a multi-value property and the rule can’t mix checks against strings with checks against multi-value properties. If you try, Entra ID signals a “mixed use of properties from different types of object” error. In effect, because we want to create dynamic administrative units based on department, the membership rule is limited to string properties.
I bet some folks reading this article ask the question “how do I find out what cmdlets to use to interact with Entra ID objects?” It’s a fair question. The SDK modules contain hundreds of cmdlets, some of which have extraordinarily long and complex names. My answer is to use the Graph X-ray add-on to gain insight into what the Entra ID admin center does to manipulate objects. If a method is good enough for the Entra ID admin center, it’s probably good enough for you.
Learn about using Entra ID, the Microsoft Graph PowerShell SDK, and the rest of Microsoft 365 by subscribing to the Office 365 for IT Pros eBook. Use our experience to understand what’s important and how best to protect your tenant.
]]>Microsoft makes a Teams Premium trial license to allow customers test whether the functionality available in Teams Premium is worth the $10/user/month cost. Some of the features, like meeting templates, might be less obviously worth the money. Others, like the advanced webinar functionality (like having a waitlist for webinar participants) might just be what you need. The trial allows you to try before you buy by testing all the features with up to 25 users for 30 days.
Once the 30-day period finishes, Microsoft automatically terminates the license validity and users lose access to the premium features. Even if you decide to go ahead with Teams Premium, it’s a good idea to clean up by removing the licenses from the user accounts that participated in the trial. This is easily done in the Microsoft 365 admin center by selecting the license, selecting all accounts holding the license and choosing Unassign licenses (Figure 1).
Given that we’re all learning how to manage licenses with the Microsoft Graph because of the imminent retirement of the Azure AD and MSOL modules, it’s good to know how to remove licenses. Let’s examine what’s needed to remove the Teams Premium trial licenses.
First, we must know the SKU identifier for the license. To do this, run the Get-MgSubscribedSku cmdlet and look through the set of licenses known to the tenant to find Teams Premium:
Get-MgSubscribedSku | Format-List SkuId, SkuPartNumber, ServicePlans SkuId : 36a0f3b3-adb5-49ea-bf66-762134cf063a SkuPartNumber : Microsoft_Teams_Premium ServicePlans : {MCO_VIRTUAL_APPT, MICROSOFT_ECDN, TEAMSPRO_VIRTUALAPPT, TEAMSPRO_CUST...}
According to the Azure AD list of licenses and identifiers, the SKU identifier for Teams Premium is 989a1621-93bc-4be0-835c-fe30171d6463 rather than the 36a0f3b3-adb5-49ea-bf66-762134cf063a shown here. This is because the first value is for the paid license. The second is for the trial license. Both SKUs have the same part number and display name (which is why the license shown in Figure 1 is called Microsoft Teams Premium). It would be nice if Microsoft added a trial suffix for its trial licenses.
In any case, both SKUs include seven separate service plans. A service plan is a license for a piece of functionality that cannot be bought. Instead, it’s bundled into a product (SKU) like Teams Premium. Service plans allow administrators to selectively disable functionality enabled by a license. For instance, you could disable advanced virtual appointments without affecting the other elements in Teams Premium. Table 1 lists the service plans covered by Teams Premium.
Service plan identifier | Service plan name | Display name |
85704d55-2e73-47ee-93b4-4b8ea14db92b | MICROSOFT_ECDN | Microsoft Content Delivery Network |
0504111f-feb8-4a3c-992a-70280f9a2869 | TEAMSPRO_MGMT | Microsoft Teams Premium Management |
cc8c0802-a325-43df-8cba-995d0c6cb373 | TEAMSPRO_CUST | Microsoft Teams Premium Branded Meetings |
f8b44f54-18bb-46a3-9658-44ab58712968 | TEAMSPRO_PROTECTION | Microsoft Teams Premium Advanced Meeting Protection |
9104f592-f2a7-4f77-904c-ca5a5715883f | TEAMSPRO_VIRTUALAPPT | Microsoft Teams Premium Virtual Appointment |
711413d0-b36e-4cd4-93db-0a50a4ab7ea3 | MCO_VIRTUAL_APPT | Microsoft Teams Premium Virtual Appointments |
78b58230-ec7e-4309-913c-93a45cc4735b | TEAMSPRO_WEBINAR | Microsoft Teams Premium Webinar |
Now that we know the SKU identifier, we can run some PowerShell to:
Connect-MgGraph -Scope User.ReadWrite.All Select-MgProfile Beta # Populate identifier for target product (SKU) $TeamsPremiumSku = "36a0f3b3-adb5-49ea-bf66-762134cf063a" [array]$Users = Get-MgUser -filter "assignedLicenses/any(s:s/skuId eq $TeamsPremiumSku)" -All If (!($Users)) { Write-Host "No Teams Premium Trial licenses found - exiting" ; break } Write-Host ("Removing {0} Teams trial licenses from {1}..." -f $Users.count, ($Users.displayName -join ", ")) ForEach($User in $Users) { Try { $Status = Set-MgUserLicense -UserId $User.Id -RemoveLicenses $TeamsPremiumSku -AddLicenses @{} } Catch { Write-Host "Error removing Teams Premium Trial license from {0}" -f $User.displayName } }
Updated with an appropriate SKU identifier, the code will remove licenses for other Microsoft 365 products.
It doesn’t matter if you leave expired licenses in place. They won’t affect how people use Microsoft 365. However, given that the paid-for and trial versions of the Teams Premium licenses have the same display name, it’s best to remove trial licenses to avoid potential future confusion.
Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
]]>Updated 7 December 2022
With the addition of support for managed identities in V3.0 of the Exchange Online management PowerShell module, developers might be more interested in creating Azure Automation runbooks that use the Exchange Online cmdlets to process data like mailboxes. In this discussion, when I refer to a managed identity, I mean a system-assigned managed identity working within an Azure Automation Account. Essentially, a managed identity is a service principal used to access Azure resources that Azure manages automatically. No access is available to the credentials for the managed identity. Like the service principals for other apps, managed identity service principals can hold permissions to allow them access to resources like apps.
As an example, it’s now easy to connect to Exchange Online in a runbook with a command like:
Connect-ExchangeOnline -ManagedIdentity -Organization office365itpros.onmicrosoft.com
Exchange Online connects using the managed identity owned by the Azure Automation account that’s executing the runbook.
As noted above, before it can do anything interesting after connecting, the managed identity needs permissions. The essential permission for Exchange Online is Exchange.ManageAsApp, which allows an app to run Exchange Online cmdlets as if the app was an administrator account. Service principals for registered apps and managed identities both need this permission to do useful work with Exchange Online cmdlets.
In November 2020, Microsoft announced the deprecation of the Outlook REST API. This was part of a wider effort to move developers away from legacy APIs to the Graph. Microsoft also considers Exchange Web Services (EWS) to be a legacy API, but in this instance, the Exchange team focused on the Outlook REST API, which the Graph Outlook Mail API replaces.
At the same time, Microsoft said that they “removed the Exchange app permission from the Azure portal.” The Exchange.ManageAsApp permission is one of the permissions in the Office 365 Exchange Online API. Microsoft’s action didn’t remove the ability to assign the permission to apps in the Azure AD admin center. It just made the process a little harder.
To assign the Exchange.ManageAsApp permission to a registered app, select the app in the Registered Apps blade. Go to API permissions to add a permission as normal. When Azure AD displays the range of permissions to select from, click the APIs my organization uses tab, and then type Office 365 Exchange Online into the search box. Azure AD will find the Office 365 Exchange Online API (Figure 1). Note the application identifier shown here. We’ll need this later.
Now browse the set of permissions in the Office 365 Exchange Online API and select Exchange.ManageAsApp (Figure 2). Make sure that you’ve selected application permissions and click Add permission. When you return to the app details, consent to the assignment, just like you’d do for a Graph API permission.
The registered app can now run Exchange Online cmdlets as an administrator. That’s all well and good, but what about a managed identity?
Unlike registered apps, managed identities show up under the enterprise apps section of the Azure AD admin center. Open enterprise apps and apply a filter to find managed identities (Figure 3).
Azure AD lists the Azure automation accounts with managed identities. Select the automation account you want to work with. When you access its permissions, Azure AD tells you that: “The ability to consent to this application is disabled as the app does not require consent. Granting consent only applies to applications requiring permissions to access your resources.” In other words, you can’t assign an API to an automation account, or rather the service principal for the managed identity, through the Azure AD admin center.
Instead, you can do the job with PowerShell using cmdlets from the Microsoft Graph PowerShell SDK. Here’s how:
Connect-MgGraph -Scopes AppRoleAssignment.ReadWrite.All Select-MgProfile Beta $ManagedIdentityApp = Get-MgServicePrincipal -Filter "displayName eq 'ExoAutomationAccount'" $ExoApp = Get-MgServicePrincipal -Filter "AppId eq '00000002-0000-0ff1-ce00-000000000000'" $AppPermission = $ExoApp.AppRoles | Where-Object {$_.DisplayName -eq "Manage Exchange As Application"} $AppRoleAssignment = @{ "PrincipalId" = $ManagedIdentityApp.Id "ResourceId" = $ExoApp.Id "AppRoleId" = $AppPermission.Id } New-MgServicePrincipalAppRoleAssignment -ServicePrincipalId $ManagedIdentityApp.Id -BodyParameter $AppRoleAssignment
The new role assignment is effective immediately. If you make a mistake, you can remove the assignment with the Remove-MgServicePrincipalAppRoleAssignment cmdlet. Here’s how:
[Array]$SPPermissions = Get-MgServicePrincipalAppRoleAssignment -ServicePrincipalId $ManagedIdentityApp.Id $Role = $ExoApp.AppRoles | Where-Object {$_.DisplayName -eq "Manage Exchange As Application"} $Assignment = $SpPermissions | Where-Object {$_.AppRoleId -eq $Role.Id} Remove-MgServicePrincipalAppRoleAssignment -AppRoleAssignmentId $Assignment.Id -ServicePrincipalId $ManagedIdentityApp.Id
The final step is to make sure that Exchange Online recognizes the automation account which hosts the managed identity as an Exchange administrator. This is done by assigning the Exchange Administrator role to the automation account’s app in the Azure AD admin center. Figure 4 shows how to add the assignment of the Exchange administrator role to the app owned by an automation account.
If you don’t assign the Exchange administrator role to the automation account’s app, you’ll see an error telling you that the role assigned to the app isn’t supported in this scenario when you execute the runbook. For example:
“The role assigned to application 415e4ba8-635f-4689-b069-22dea1fcfdb3 isn’t supported in this scenario“
Perhaps Microsoft under-estimated the continuing need to assign the Exchange.ManageAsApp permission to apps when they made their November 2020 announcement. Although it’s a pain to have to go to PowerShell to assign the permission, it’s something that only needs to happen once, so it’s not too bad. I have other more serious things to moan about inside Microsoft 365.
Learn more about how the Microsoft 365 ecosystem really works on an ongoing basis by subscribing to the Office 365 for IT Pros eBook. Our monthly updates keep subscribers informed about what’s important across the Office 365 ecosystem.
]]>In October 2021, I wrote about how to use the Microsoft Graph PowerShell SDK to create a licensing report for a Microsoft 365 tenant. That report lists the licenses assigned to each user account together with any disabled service plans for those licenses. It’s a valuable piece of information to help tenants manage license costs.
But we can do better. At least, that’s what some readers think. They’d like to know if people use their assigned licenses so that they can remove expensive licenses from accounts that aren’t active. One way to approach the problem is to use the Microsoft 365 User Activity Report script to identify people who haven’t been active in Exchange Online. SharePoint Online, Teams, OneDrive for Business, and Yammer over the last 180 days. The report already includes an assessment of whether an account is in use, so all you need to do is find those who aren’t active and consider removing their licenses.
Another solution to the problem is to update the licensing report script. To do this, I made several changes to the script (the updated version is available from GitHub).
The first change is to the filter used with the Get-MgUser cmdlet. The new filter selects only member accounts that have licenses. Previously, I selected all member accounts, but now we’re interested in chasing down underused licensed accounts. Here’s the command I used:
[Array]$Users = Get-MgUser -Filter "assignedLicenses/`$count ne 0 and userType eq 'Member'" -ConsistencyLevel eventual -CountVariable Records -All -Property signInActivity | Sort-Object DisplayName
The filter applied to Get-MgUser finds member accounts with at least one license. The command also retrieves the values of the signInActivity property for each account. This property holds the date and time for an account’s last interactive and non-interactive sign-ins. Here’s what the data for an account looks like:
LastNonInteractiveSignInDateTime : 27/09/2022 13:04:58 LastNonInteractiveSignInRequestId : bcd2d562-76f0-4d29-a266-942f7ee31a00 LastSignInDateTime : 11/05/2022 12:19:18 LastSignInRequestId : 3f691116-5e0a-4c4c-a3a9-aecb3ae99800 AdditionalProperties : {}
The last non-interactive sign-in might be something like a synchronization operation performed by the OneDrive sync client or a sign-in using an access token for the user account to another Microsoft 365 app. I’m not too interested in these sign-in activities as I want to know about licensed accounts that aren’t taking full advantage of their expensive licenses. Hence, we focus on the timestamp for the last interactive sign-in.
Update: Microsoft now supports a timestamp for the last successful sign in for Entra ID accounts. The LastSignInDateTime property can capture an unsuccessful sign-in, so using the new lastSuccessfulSignInDateTime property is a better choice in most situations. However, Entra ID only captures data for the property from December 1, 2023.
To detect an underused account, we need to define how to recognize such an account. To keep things simple, I define an underused account as being more that hasn’t signed in interactively for over 60 days. An account in this category costs $23/month if it holds an Office 365 E3 license while one assigned an E5 license costs $38/month. And that’s not taking any add-on licenses into account. At $30/month, we’ve already paid $60 for an underused account when it matches our criterion.
The script checks to see if any Entra ID sign-in information is available for the account (i.e., the account has signed in at least once). If it does, we extract the timestamp for the last interactive sign-in and compute how many days it is since that time. If not, we mark the account appropriately.
# Calculate how long it's been since someone signed in If ([string]::IsNullOrWhiteSpace($User.SignInActivity.LastSignInDateTime) -eq $False) { [datetime]$LastSignInDate = $User.SignInActivity.LastSignInDateTime $DaysSinceLastSignIn = ($CreationDate - $LastSignInDate).Days $LastAccess = Get-Date($User.SignInActivity.LastSignInDateTime) -format g If ($DaysSinceLastSignIn -gt 60) { $UnusedAccountWarning = ("Account unused for {0} days - check!" -f $DaysSinceLastSignIn) } } Else { $DaysSinceLastSignIn = "Unknown" $UnusedAccountWarning = ("Unknown last sign-in for account") $LastAccess = "Unknown" }
Note that it can take a couple of minutes before Entra ID updates the last interactive timestamp for an account. This is likely due to caching and the need to preserve service resources.
The last change is to the output routine where the script now reports the percentage of underused accounts that it finds. Obviously, it’s not ideal if this number is more than a few percent.
I usually pipe the output of reports to the Out-GridView cmdlet to check the data. Figure 1 shows the output from my tenant. Several underused accounts are identified, which is what I expect given the testing and non-production usage pattern within the tenant. Another advantage of Out-GridView is that it’s easy to sort the information to focus in on problem items as seen here.
Seeing that the script is PowerShell, it’s easy to adjust the code to meet the requirements of an organization. Some, for instance, might have a higher tolerance level before they consider an account underutilized and some might be more restrictive. Some might like to split the report up into departments and send the underused accounts found for each department to its manager for review. It’s PowerShell, so go crazy and make the data work for you.
Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
]]>Last week, I wrote about using the Get-AssociatedTeam cmdlet to create a membership report for Microsoft Teams. The report lists all teams that each user belongs to, including their direct membership of shared channels. Some folks got in contact to ask if they could use the cmdlet to generate a report about the membership of Teams private channels. The answer is no, probably because to be a member of a private channel, a user must first be a member of the host team. The host team is in the set returned by Get-AssociatedTeam, so the team is reported for the user anyway.
However, there’s usually another way to attack a problem. In this case, we can leverage the Get-TeamAllChannel cmdlet, another of the new cmdlets Microsoft released in V4.6 of the Microsoft Teams PowerShell module.
Because it contained much of the code I needed (never be too proud to reuse code!), I amended the script I wrote to report all the channels for all teams in a tenant as follows:
Here’s the main loop to process the private channels found in teams and capture details of their owners and members:
$ChannelsList = [System.Collections.Generic.List[Object]]::new() [int]$i = 0 ForEach ($Team in $Teams) { $i++ Write-Host ("Processing {0} ({1}/{2})" -f $Team.DisplayName, $i, $Teams.Count) [array]$Channels = Get-TeamAllChannel -GroupId $Team.Id -MembershipType "Private" ForEach ($Channel in $Channels) { Write-Host ("Found private channel {0} in team {1}" -f $Channel.DisplayName, $Team.DisplayName) [array]$ChannelMembers = Get-TeamChannelUser -GroupId $Team.Id -DisplayName $Channel.DisplayName ForEach ($Member in $ChannelMembers) { $ChannelLine = [PSCustomObject][Ordered]@{ # Write out details of the private channel and its members Team = $Team.DisplayName Channel = $Channel.DisplayName Description = $Channel.Description Member = $Member.Name MemberUPN = $Member.User Role = $Member.Role HostTeam = $Channel.HostTeamId Id = $Channel.Id } $ChannelsList.Add($ChannelLine) } } #End Foreach Member } # End ForEach Team
Figure 1 shows some data from my tenant in the PowerShell list that stores the private channel data as viewed through the Out-GridView cmdlet.
Teams reports minimal data for owners and members. If you wanted to include extra information about users, such as department, office, or title, you'd need to look each user up in Azure AD to retrieve the information.
Even better, we can interrogate the list to calculate basic information about private channel usage in the tenant, such as the number of teams with private channels, the teams that have private channels and their owners (Figure 2).
After generating whatever data you need, you can then create suitable output reports in CSV, HTML, Excel, or PDF formats.
You can download the full script from GitHub.
I'm not sure if Teams private channels have had the impact Microsoft expected when they revealed the capability at the Ignite 2019 conference. However, I like private channels because they are an effective way to share information with a defined set of people, including sharing documents in the dedicated SharePoint Online site that each private channel has. Sure, app support is limited for private channels, but that's no reason to avoid using them when discussions need to be a little more private than regular team collaboration.
Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
About a month ago, I wrote about my experiences of creating files in SharePoint Online using a PowerShell script executing as an Azure Automation runbook. I reported that I used user credentials stored as a resource in the Azure Automation account to authenticate with the SharePoint PnP module. Once authenticated, I could use the Add-PnPFile cmdlet to upload the file created by the script to a SharePoint document library.
I noted that I used the stored credentials to make sure that I could create the file using the identity of a member of the Microsoft 365 group which owned the document library and hadn’t been able to find another way of doing this. I also said that I couldn’t find a way to post to Teams channels because of the way Graph permissions work. Clearly, I was exploring the limits of my knowledge.
Two comments made helpful suggestions. The first noted that the PnP PowerShell module includes a Submit-PnpTeamsChannelMessage cmdlet and suggested that this could be an answer, especially if combined with certificate-based authentication (CBA). The second suggested using the incoming webhook connector to post to a target channel.
My script created the output report in CSV and HTML files and already had a connection to PnP. The Submit-PnPTeamsChannelMessage cmdlet accepts HTML content as the message body for a channel. With the connection and body part in place, I could post the message using this code:
$TargetTeamId = "107fe4dd-809c-4ec9-a3a1-ab88c96e0a5e"
$TargetTeamChannel = "19:6d688803124c48d6bfa796284e641e9d@thread.tacv2"
Submit-PnPTeamsChannelMessage -Team $TargetTeamId -Channel $TargetTeamChannel -Message $Body -ContentType Html -Important
The parameters are the identifiers for the team owning the target channel and the channel. The team identifier is easily found using the Get-Team or Get-MgGroup cmdlets:
Get-Team -DisplayName "Tenant Information" | ft GroupId, DisplayName

GroupId                              DisplayName
-------                              -----------
107fe4dd-809c-4ec9-a3a1-ab88c96e0a5e Tenant Information

Get-MgGroup -Filter "displayName eq 'Tenant Information'" | ft Id, DisplayName

Id                                   DisplayName
--                                   -----------
107fe4dd-809c-4ec9-a3a1-ab88c96e0a5e Tenant Information
Knowing the team identifier, we can fetch the channel identifiers using the Get-TeamChannel cmdlet:
Get-TeamChannel -GroupId 107fe4dd-809c-4ec9-a3a1-ab88c96e0a5e | ft Id, DisplayName

Id                                               DisplayName
--                                               -----------
19:078bef3cfb6c4c519d4f585f099c9c91@thread.tacv2 General
19:6d688803124c48d6bfa796284e641e9d@thread.tacv2 Planning 2021
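As an aside, the channel identifiers shown above follow a recognizable shape: a 19: prefix, a 32-character hex string, and an @thread.tacv2 suffix (older channels use @thread.skype). This is an observation from the examples here rather than a documented contract, but a quick sanity check can catch copy-paste errors before an identifier reaches a cmdlet or Graph call. A sketch in Python:

```python
import re

# Observed shape of Teams channel identifiers; an informal check, not an official format guarantee.
CHANNEL_ID = re.compile(r"^19:[0-9a-f]{32}@thread\.(tacv2|skype)$")

def looks_like_channel_id(value: str) -> bool:
    """Cheap sanity check before passing an identifier to a cmdlet or Graph call."""
    return CHANNEL_ID.match(value) is not None

print(looks_like_channel_id("19:6d688803124c48d6bfa796284e641e9d@thread.tacv2"))  # True
print(looks_like_channel_id("Planning 2021"))                                     # False
```

Passing a display name where an identifier is expected is a common mistake, and a check like this fails fast instead of producing a confusing service-side error.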
The other parameters are self-explanatory. The only other point of interest to note is that the Important switch applies this marking to the message. Figure 1 shows the result.
Great! We can post a message to a Teams channel using content created by a script running in Azure Automation. The only remaining challenge is how to eliminate the use of the stored credentials. I’m still exploring that point.
The Incoming Webhook connector is one of the standard connectors supported by all Teams channels. The function of the connector is to accept JSON-formatted content submitted to a URI identifying the target channel and post it as a new message to that channel. Here's an example of using the webhook connector to post information about new Microsoft 365 roadmap items.

Instead of posting a normal message to a channel, the connector posts message cards. These are intended to be notifications that new information is available and can include directions (like a hyperlink) to tell users where they can find the complete story. In my case, it was impossible to fit the complete report into a message card because the report exceeded the maximum size limit for a card. I therefore ended up creating a card to tell the reader that a new version of the report was available, together with a button to download the report (from SharePoint Online). Here's the code I used:
# Post to Teams channel using an incoming webhook connector
$GroupWebHookData = 'The new report is available in <a href="' + $NewFileUri + '">' + 'Microsoft 365 Groups Expiration Report</a>'
$DateNow = Get-Date -format g
$Notification = @"
{
  "@type": "MessageCard",
  "@context": "https://schema.org/extensions",
  "summary": "Microsoft 365 Groups",
  "themeColor": "0072C6",
  "title": "Notification: New Microsoft 365 Groups Expiration Report is available",
  "sections": [{
      "facts": [
        { "name": "Tenant:", "value": "TENANT" },
        { "name": "Date:", "value": "DATETIME" }],
      "markdown": "true"
  }],
  "potentialAction": [{
      "@type": "OpenUri",
      "name": "Download the report",
      "targets": [{ "os": "default", "uri": "URI" }]
  }]
}
"@
$NotificationBody = $Notification.Replace("TENANT","$TenantName").Replace("DATETIME","$DateNow").Replace("URI","$NewFileUri")
# Make sure you use the URI for your channel here.
$TargetChannelURI = "https://office365itpros.webhook.office.com/webhookb2/107fe4dd-809c-4ec9-a3a1-ab88c96e0a5e@b662313f-14fc-43a2-9a7a-d2e27f4f3478/IncomingWebhook/0a3dea30f595436ead8138334516911a/eff4cd58-1bb8-4899-94de-795f656b4a18"
$Command = (Invoke-RestMethod -Uri $TargetChannelURI -Method Post -Body $NotificationBody -ContentType 'application/json')
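Hand-edited JSON in a here-string is easy to break (one stray comma makes the card invalid, and the connector rejects it). A small Python sketch of the same placeholder-substitution idea that builds the card as a data structure and serializes it instead, so the output is always valid JSON (the tenant name and URL below are illustrative values, not from my tenant):

```python
import json

def build_card(tenant: str, date: str, uri: str) -> str:
    """Build a MessageCard payload equivalent to the here-string in the script above."""
    card = {
        "@type": "MessageCard",
        "@context": "https://schema.org/extensions",
        "summary": "Microsoft 365 Groups",
        "themeColor": "0072C6",
        "title": "Notification: New Microsoft 365 Groups Expiration Report is available",
        "sections": [{
            "facts": [
                {"name": "Tenant:", "value": tenant},
                {"name": "Date:", "value": date},
            ],
            "markdown": "true",
        }],
        "potentialAction": [{
            "@type": "OpenUri",
            "name": "Download the report",
            "targets": [{"os": "default", "uri": uri}],
        }],
    }
    return json.dumps(card)  # serializer guarantees well-formed JSON

# POST this body to the channel's webhook URI with Content-Type application/json
body = build_card("Contoso", "01/01/2024 09:00", "https://example.com/report.xlsx")
```

The same build-then-serialize pattern works in PowerShell too (a hashtable piped to ConvertTo-Json), which avoids the string-replacement step entirely.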
The resulting message card posted to the channel is simple, but it gets the job done (Figure 2).
The conclusion is that it's possible to post messages to Teams channels using either the Submit-PnPTeamsChannelMessage cmdlet or the incoming webhook connector. Both methods have their own limitations, but once you understand what the limitations are, it's easy to decide which approach to take in different circumstances.
The full script I used to create the output in an Azure Automation runbook is available in GitHub.
Keep up with the changing world of the Microsoft 365 ecosystem by subscribing to the Office 365 for IT Pros eBook. Monthly updates mean that our subscribers learn about new developments as they happen.
After writing about the recent revamp of Exchange Online dynamic distribution lists, I was asked if it was possible to create a team from the membership of a dynamic distribution list. The answer is that the steps to create a static Microsoft 365 group are straightforward. Things get more complicated if you contemplate using a dynamic Microsoft 365 group.
Available in both Exchange Online and Exchange Server, dynamic distribution lists are very powerful. That is, if the organization directory is well-maintained with details about people, job titles, department names, offices, country, and so on. The membership of dynamic distribution lists can include any kind of mail-enabled recipient, including other groups. And that's the first challenge to face: the Microsoft 365 groups used by Teams support a flat membership (no nested groups) composed solely of accounts belonging to the host organization (members and guests). Only user mailboxes can migrate to become members of a target Microsoft 365 group.
The second challenge comes into play if you decide that the target Microsoft 365 group should have dynamic membership. The issue here is that dynamic distribution lists use filters executed against Exchange Online’s directory while dynamic Microsoft 365 groups use filters based on Azure AD. Different filters, different syntax, and different properties. More on this later.
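To make the difference concrete, here's roughly what the same membership rule might look like in each system. These are illustrative fragments under the assumption of a simple department-based rule; the property names available depend on your directory:

```
# Exchange Online dynamic distribution list recipient filter (OPATH syntax):
((Department -eq 'Sales') -and (RecipientTypeDetails -eq 'UserMailbox'))

# Entra ID (Azure AD) dynamic Microsoft 365 group membership rule:
(user.department -eq "Sales") and (user.userType -eq "Member")
```

The two rule languages don't map one-to-one: some Exchange recipient properties have no Azure AD equivalent, which is why converting a dynamic distribution list to a dynamic Microsoft 365 group can require rethinking the rule rather than translating it.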
Starting with the simple issue of finding the members of a dynamic distribution list and using this information to create a new Microsoft 365 group, the steps are straightforward:
The script I created is available in GitHub. Normal caveats apply: the code works but it doesn’t have much error checking. It’s there to prove a principle, not be an off-the-shelf solution.
Multiple ways exist to identify a source dynamic distribution list. This example prompts the user for a name. The code could be made more robust by handling mistakes, letting the user select from a numbered list, and so on, but for the purpose of the example all we want is the object identifier for a valid dynamic distribution list:
$InputDDL = Read-Host "Enter the name of the Dynamic Distribution List to convert to a Microsoft 365 Group"
[array]$SourceDDL = Get-DynamicDistributionGroup -Identity $InputDDL -ErrorAction SilentlyContinue
If (!($SourceDDL)) {
   Write-Host ("Sorry! We can't find the {0} dynamic distribution list" -f $InputDDL); break
}
If ($SourceDDL.Count -gt 1) { # More than one matching list found
   CLS
   Write-Host "We found multiple matching dynamic distribution lists"
   Write-Host "-----------------------------------------------------"
   Write-Host " "
   $SourceDDL | Format-Table DisplayName, Alias, PrimarySMTPAddress
   Write-Host " "
   Write-Host "Please try again..."; break
}
[string]$SourceDDLId = $SourceDDL.ExternalDirectoryObjectId
Two methods exist to return the membership of the dynamic distribution list:
The first method resolves the filter against the Exchange directory, so its results are up to date. The second fetches membership data as of the last time Exchange processed the list (more information here). After retrieving the membership using the chosen method, we apply a filter to extract mailboxes.
# Now that we have a source DDL, let's get its membership
[array]$SourceMembers = Get-Recipient -RecipientPreviewFilter (Get-DynamicDistributionGroup -Identity $SourceDDLId).RecipientFilter
# Could also be:
# [array]$SourceMembers = Get-DynamicDistributionGroupMember -Identity $SourceDDL.Id
# Throw away anything but user mailboxes because that's all a Microsoft 365 group supports
[array]$ValidMembers = $SourceMembers | ? {$_.RecipientTypeDetails -eq "UserMailbox"}
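The filtering step matters because anything except a user mailbox (distribution groups, shared mailboxes, mail contacts) must be dropped before the addresses are passed to group membership. The same idea, sketched in Python with a hypothetical recipient list:

```python
# Hypothetical recipients as a dynamic distribution list might resolve them;
# only user mailboxes can become members of a Microsoft 365 group.
recipients = [
    {"name": "Kim Akers",    "type": "UserMailbox"},
    {"name": "Sales DL",     "type": "MailUniversalDistributionGroup"},
    {"name": "Front Desk",   "type": "SharedMailbox"},
    {"name": "Tony Redmond", "type": "UserMailbox"},
]

# Keep only the recipients eligible for group membership
valid_members = [r for r in recipients if r["type"] == "UserMailbox"]
print([r["name"] for r in valid_members])  # ['Kim Akers', 'Tony Redmond']
```

The practical consequence: the new group's membership can be smaller than the source list's, and people who received list traffic through a nested group silently drop out of the converted team.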
The next piece of code establishes the owner of the new group. Microsoft 365 groups must have an owner, so if the ManagedBy property of the source list results in an invalid result (for instance, it’s empty), we need to assign ownership to a default account. One way of doing this is to find the set of Exchange administrators for the organization and select one of them, which is done here using the Get-MgDirectoryRoleMember cmdlet from the Microsoft Graph PowerShell SDK and filtering out any service principals assigned the Exchange administrator role. You could simplify the script by hardcoding a default group member.
# We've got to assign an owner to the new Microsoft 365 group, so we need a default in case the source DDL doesn't have an owner
# Find the set of accounts that are Exchange admins (you can also use Get-AzureADDirectoryRoleMember here)
[array]$ExoAdmins = Get-MgDirectoryRoleMember -DirectoryRoleId "53add08e-5b0c-4276-a582-9ce02fb6c947" | Select Id, AdditionalProperties
# Throw away any service principals which might have the Exchange Admin role
$ExoAdmins = $ExoAdmins | ? {$_.AdditionalProperties.'@odata.type' -eq '#microsoft.graph.user'} | Select -ExpandProperty Id
# Select the first and use them as the default owner
$ExoDefaultAdmin = Get-MgUser -UserId $ExoAdmins[0] | Select -ExpandProperty UserPrincipalName
# Check that the group owner is a mailbox
$GroupOwner = Get-ExoMailbox -Identity $SourceDDL.ManagedBy -ErrorAction SilentlyContinue
# If it's null or something weird like a shared mailbox, use the default owner
If (($GroupOwner -eq $Null) -or ($GroupOwner.RecipientTypeDetails -ne "UserMailbox")) {
   $GroupOwner = $ExoDefaultAdmin
} Else {
   $GroupOwner = $GroupOwner.PrimarySmtpAddress
}
# Populate other group properties
$AliasDDL = $SourceDDL.Alias + "M365"
$GroupDisplayName = $SourceDDL.DisplayName + " (Group)"
With everything ready, we can go ahead and create the new Microsoft 365 Group, add the members, and team-enable the group. All the members can be added with a single Add-UnifiedGroupLinks command because we have an array of email addresses. Exchange processes each item in the array and adds it as a member.
# Create the new Microsoft 365 Group, using the owner established above
Write-Host "Creating the new Microsoft 365 group..."
$Description = "Created from the " + $SourceDDL.DisplayName + " dynamic distribution list on " + (Get-Date -Format g)
$NewGroup = New-UnifiedGroup -DisplayName $GroupDisplayName -AccessType Private -Alias $AliasDDL -RequireSenderAuthenticationEnabled $True -Owner $GroupOwner -AutoSubscribeNewMembers -Notes $Description
# Add the members to the group
Write-Host "Adding members from the dynamic distribution list to the Microsoft 365 group..."
Add-UnifiedGroupLinks -Identity $NewGroup.ExternalDirectoryObjectId -LinkType Members -Links $ValidMembers.PrimarySmtpAddress
Write-Host "Enabling Microsoft Teams for the Microsoft 365 group..."
New-Team -Group $NewGroup.ExternalDirectoryObjectId
The code doesn’t add a sensitivity label, so if you use these to apply container settings to groups and teams, you should add the label when creating the new group by passing the identifier for the selected label in the SensitivityLabel parameter.
That’s it. We have a new team built from the membership of a dynamic distribution list. The code is straightforward and works without a hitch, but if we throw dynamic membership for the Microsoft 365 group/team into the equation, things become much more complex. I’ll cover that subject in another post.
Learn about Teams, Exchange Online, and the rest of Office 365 by subscribing to the Office 365 for IT Pros eBook. Use our experience to understand what's important and how best to protect your tenant.
Updated: January 27, 2023
The Microsoft 365 Groups and Teams Activity Report is a longstanding project of mine. I originally wrote the PowerShell script when Office 365 Groups were quite new and then refreshed it to deal with Microsoft Teams. The idea is to report statistics about the activity of groups such as:
With the data, you can see what groups or teams might be inactive and are candidates for archiving or removal.
The output is a report in HTML (Figure 1) and CSV formats. Administrators can slice and dice the data in the CSV file to present it whatever way they want. Some like to import the data into Power BI and visualize it there.
Note: If your group names include non-ASCII characters like é, use the Export-Excel cmdlet from the ImportExcel module to export the report file to Excel. Exporting to a CSV does not include the non-ASCII characters in group names.
The most recent enhancement discarded many of the calls to “expensive” PowerShell cmdlets like Get-UnifiedGroup and replaced them with Microsoft Graph queries. I did this to increase performance of the script and enable it to run in some large tenants with over 20,000 groups (teams). I’m sure that the script will process more than that number, but I haven’t gone higher. In any case, if you need to process very large numbers of groups, you should probably use a different tool and maybe even split processing up across batches of groups (for instance, A-C, D-E, and so on).
The latest version of the Graph-based script is 5.13. You can download the full script from GitHub. The latest updates include:
I’ll update this post when new versions appear.
Because it’s much slower, I don’t develop the pure PowerShell version anymore. The last version that I worked on is 4.8. The pure PowerShell script lags both the performance and functionality of its Graph counterpart, but you can download it from GitHub.
Update: V5.5 and later versions remove the need to download the Teams usage report from the Teams admin center. The script now does this automatically.
If you’re going to run the report, you can speed things up even more by going to the Analytics & Reports section of the Teams admin center to download a CSV file with Teams usage data. If you don’t download the file, the script will still run. However, instead of being able to check usage data (like the number of channel posts) from the file, the script must check the number of compliance records stored in the team’s group mailbox.
Because checking compliance records uses a call to the Get-ExoMailboxFolderStatistics cmdlet instead of reading a record from a hash table, the operation is much more expensive in performance terms. On average, it takes an extra couple of seconds to process each team-enabled group, which quickly mounts up when the script must process hundreds or thousands of teams. As an example, to process 210 groups (83 teams), the script took 1034 seconds without a teams usage data file. With the file, the elapsed time for the same set reduced to 388 seconds.
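The numbers quoted above work out to roughly three seconds of overhead per group when no usage data file is available, which matches the "extra couple of seconds" estimate. A quick check of the arithmetic:

```python
# Elapsed times reported in the text for a run over 210 groups (83 teams)
without_file = 1034  # seconds, falling back to Get-ExoMailboxFolderStatistics
with_file = 388      # seconds, reading usage data from a hash table
groups = 210

overhead_per_group = (without_file - with_file) / groups
print(round(overhead_per_group, 1))  # ~3.1 seconds of extra work per group
```

At that rate, a tenant with 5,000 groups would spend over four extra hours on compliance-record checks alone, which is why downloading the usage file (or letting the script fetch it, in V5.5 and later) pays off quickly.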
On the upside, checking compliance records returns the count of every channel conversation post since the creation of a team (subject to any retention policies in force) whereas checking against the data file gives a snapshot of activity over the last 90 days. Knowing what happened over the last 90 days is usually sufficient to know if a team is active.
To generate the Teams usage data file, do the following:
If you provide the script with a teams usage data file, the data includes messages posted to private channels. It will soon include messages posted to shared channels. If you don’t use a data file, the script only includes messages posted to standard channels because it doesn’t check the mailboxes of private channel members or the special cloud mailboxes used by shared channels.
I don’t pretend this script is a work of PowerShell art. It could probably do with a complete rewrite. However, it works, and it’s something that tenants can use to create their own version of what they think an activity report should do. After all, it’s just PowerShell, so play with the code and let your imagination run riot!
Learn how to exploit the data available to Microsoft 365 tenant administrators through the Office 365 for IT Pros eBook. We love figuring out how things work.
As you probably know, as part of a major revamp for the application, Whiteboard is moving the storage for its boards from Azure to OneDrive for Business. According to Microsoft 365 roadmap item 66767, general availability happened in December 2021. This refers to tenants who decided to opt in early, or tenants who decide to switch through the Whiteboard settings in the Microsoft 365 admin center.
OneDrive became the default for storage of new boards in January 2022. According to Message center notification MC275235, the updates for Whiteboard clients that can’t yet support OneDrive should be available by the end of March. Once the updated clients are deployed, the transition should complete.
Further good news comes in Microsoft 365 roadmap item 66759, which says that external participants in Teams meetings will be able to share boards. A dependency exists on OneDrive for Business as the new feature only works when the board being shared is in OneDrive. If not, Teams displays the polite but extremely frustrating error message shown in Figure 1. People just love being locked out of collaboration, so it’s good that Microsoft is fixing this problem.
You might not know that Whiteboard supports PowerShell. Well, it does, but only just. A bare-bones module (WhiteboardAdmin) is available in the PowerShell gallery, but it doesn’t contain many cmdlets.
Get-Command -Module WhiteboardAdmin

CommandType Name                          Version Source
----------- ----                          ------- ------
Function    Get-Whiteboard                1.5.0   WhiteboardAdmin
Function    Get-WhiteboardOwners          1.5.0   WhiteboardAdmin
Function    Get-WhiteboardSettings        1.5.0   WhiteboardAdmin
Function    Get-WhiteboardsForTenant      1.5.0   WhiteboardAdmin
Function    Invoke-TransferAllWhiteboards 1.5.0   WhiteboardAdmin
Function    Remove-Whiteboard             1.5.0   WhiteboardAdmin
Function    Set-WhiteboardOwner           1.5.0   WhiteboardAdmin
Function    Set-WhiteboardSettings        1.5.0   WhiteboardAdmin
Not many people have downloaded the module either, possibly because they don’t know of its existence. I’ve used the Invoke-TransferAllWhiteboards cmdlet in the past to transfer ownership of boards from one user account to another (a task sometimes necessary if someone leaves the organization), but I have not played with the other cmdlets.
That is, until I noticed a tweet about a new script available in the PnP Script Samples gallery to create a report about all the boards and their owners in a tenant. The script uses the old Microsoft Online Services (MSOL) module to retrieve user information. Microsoft plans to deprecate the MSOL module at the end of 2022, so it’s a good example of a script that needs to be updated to use either Microsoft Graph queries or cmdlets from the Microsoft Graph PowerShell SDK.
Upgrading the script didn’t take much time because the only calls are to load the module and retrieve details of user accounts. My version of the code is shown below. Apart from using the Microsoft Graph PowerShell SDK, the only changes I made replaced output arrays with PowerShell lists to improve performance.
# ReportWhiteBoardInfo.PS1
# Import the WhiteboardAdmin module
Import-Module WhiteboardAdmin
# Connect to the Microsoft Graph
Connect-MgGraph -TenantId $TenantId -Scope "Directory.Read.All, User.Read.All"
try {
   $dateTime = (Get-Date).toString("dd-MM-yyyy")
   $fileName = "WhiteboardReport-" + $dateTime + ".csv"
   $outputView = "c:\temp\" + $fileName
   # The geographies to look for board owners in. Accepted values are: Europe, Australia, or Worldwide (all boards not in Australia or Europe).
   $supportedGeographies = @("Europe", "Australia", "Worldwide")
   # List to hold Whiteboard owners
   $WhiteboardOwners = [System.Collections.Generic.List[Object]]::new()
   $i = 0
   foreach ($Geography in $supportedGeographies) {
      Write-Host "Getting Whiteboard owners for geography: $($Geography)..."
      $GeographyOwners = Get-WhiteboardOwners -Geography $Geography
      foreach ($UserId in $GeographyOwners.items) {
         $User = Get-MgUser -UserId $UserId
         $i++
         $ReportLine = [PSCustomObject][Ordered]@{
            DisplayName = $User.DisplayName
            UPN         = $User.UserPrincipalName
            Geography   = $Geography
            UserId      = $UserId }
         $WhiteboardOwners.Add($ReportLine)
      } # End ForEach owner
      Write-Host "Total Whiteboard owners found so far $($i)"
   } # End ForEach geography

   # List to hold Whiteboard details
   $Whiteboards = [System.Collections.Generic.List[Object]]::new()
   # Get whiteboards from the Microsoft Whiteboard service by owner
   foreach ($Owner in $WhiteboardOwners) {
      Write-Host "Getting Whiteboards for owner: $($Owner.UPN) ..."
      $WhiteboardInfo = Get-Whiteboard -UserId $Owner.UserId
      foreach ($WhiteboardInstance in $WhiteboardInfo) {
         $ReportLine = [PSCustomObject][Ordered]@{
            User         = $Owner.DisplayName
            UPN          = $Owner.UPN
            WhiteboardId = $WhiteboardInstance.Id
            Title        = $WhiteboardInstance.Title
            IsShared     = $WhiteboardInstance.IsShared
            Created      = Get-Date($WhiteboardInstance.CreatedTime) -format g
            Modified     = Get-Date($WhiteboardInstance.LastModifiedTime) -format g
            Geography    = $Owner.Geography
            UserId       = $Owner.UserId }
         $Whiteboards.Add($ReportLine)
      } # End ForEach whiteboard
      Write-Host "Found $($Whiteboards.Count) Whiteboards owned by: $($Owner.UPN)"
   } # End ForEach owner
   Write-Host "Found $($Whiteboards.Count) Whiteboards in the tenant."
   # Export the results to a CSV file and Out-GridView
   $Whiteboards | Export-CSV -Path $outputView -Force -NoTypeInformation
   $Whiteboards | Out-GridView
   Write-Host "Finished"
}
catch {
   Write-Host -f Red "Error:" $_.Exception.Message
}
You can download the script from GitHub. I’ll update the code there when I see a fix for the problem I’m just about to describe.
All worked well and the script generated a report (Figure 2 shows some of the report data viewed through the Out-GridView cmdlet).
The problem is that the report doesn't include any whiteboards stored in OneDrive for Business. Microsoft released version 1.5 of the WhiteboardAdmin module a month ago, but it's obvious that the cmdlets only work against the Azure storage and ignore the transition to OneDrive.
Microsoft’s documentation doesn’t cover migration of old boards from Azure to OneDrive. However, Microsoft 365 roadmap item 66763 covers migration of previously created boards with general availability in April 2022. The text says: “Tenants in locations that are currently storing new whiteboards in European datacenters will have previously created whiteboards migrated to European datacenters.”
This masterpiece of obfuscation implies that Microsoft plans to migrate old boards currently stored in U.S. datacenters to European datacenters, where hopefully the data will end up in OneDrive for Business. Perhaps this is a pointer to a more widespread migration. Let’s hope that this happens, and that Microsoft upgrades the WhiteboardAdmin module to deal with OneDrive.
Learn more about how the Office 365 applications really work on an ongoing basis by subscribing to the Office 365 for IT Pros eBook. Our monthly updates keep subscribers informed about what’s important across the Office 365 ecosystem.
By now, most people who write PowerShell code to interact with Microsoft 365 workloads understand that sometimes it's necessary to use Microsoft Graph API queries instead of "pure" PowerShell cmdlets. The Graph queries are usually faster and more reliable when retrieving large quantities of data, such as thousands of Microsoft 365 Groups. Over the last few years, as people have become more familiar with the Microsoft Graph, an increased number of scripts have replaced cmdlets with Graph queries. All these scripts use Entra ID (Azure AD) access tokens, as does any utility which interacts with the Microsoft Graph, like the Graph Explorer (Figure 1).
In the remainder of this article, I explore what an Entra ID access token contains.
Graph queries need authentication before they can run, and the Graph API uses modern authentication. Entra ID registered applications bridge the gap between PowerShell and the Graph. The apps hold details used during authentication, such as the app name, its identifier, the tenant identifier, and some credentials (an app secret or certificate). The app also holds the permissions granted to access data through the Graph APIs and other APIs. When the time comes to authenticate, the service principal belonging to an app uses this information to request an access token from Entra ID. Once Entra ID issues the access token, requests issued to the Invoke-RestMethod or Invoke-WebRequest cmdlets can include the access token to prove that the app has permission to access information.
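To make the token request concrete, here's a sketch (Python) of the OAuth 2.0 client credentials request that an app or SDK sends to Entra ID to obtain an access token. The tenant and client identifiers below are placeholders, and the snippet only builds the request body rather than making a live call:

```python
from urllib.parse import urlencode

tenant_id = "b662313f-14fc-43a2-9a7a-d2e27f4f3478"   # placeholder tenant identifier
token_endpoint = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

# Form-encoded body for the client credentials grant
body = urlencode({
    "client_id": "82a2331a-11b2-4670-b061-87a8608128a6",  # app (client) identifier
    "client_secret": "<app secret>",                      # or a certificate assertion for CBA
    "scope": "https://graph.microsoft.com/.default",      # use the app's consented permissions
    "grant_type": "client_credentials",
})
# POSTing this body to token_endpoint returns JSON containing an access_token property
```

The .default scope is what tells Entra ID to issue a token carrying all the application permissions already consented for the app, which is why the roles claim in the resulting token mirrors the app registration.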
At first glance, an access token is a confused mass of text. Here’s how PowerShell reports the content of an access token:
eyJ0eXAiOiJKV1QiLCJub25jZSI6IlFQaVN1ck1VX3gtT2YzdzA1YV9XZzZzNFBZRFUwU2NneHlOeDE0eVctRWciLCJhbGciOiJSUzI1NiIsIng1dCI6Ik1yNS1BVWliZkJpaTdOZDFqQmViYXhib1hXMCIsImtpZCI6Ik1yNS1BVWliZkJpaTdOZDFqQmViYXhib1hXMCJ9.eyJhdWQiOiJodHRwczovL2dyYXBoLm1pY3Jvc29mdC5jb20iLCJpc3MiOiJodHRwczovL3N0cy53aW5kb3dzLm5ldC9iNjYyMzEzZi0xNGZjLTQzYTItOWE3YS1kMmUyN2Y0ZjM0NzgvIiwiaWF0IjoxNjQ0ODQ1MDc3LCJuYmYiOjE2NDQ4NDUwNzcsImV4cCI6MTY0NDg0ODk3NywiYWlvIjoiRTJaZ1lEaW1McEgwTSt5QTk5NmczbWZUUXlYN0FBPT0iLCJhcHBfZGlzcGxheW5hbWUiOiJHZXRUZWFtc0xpc3QiLCJhcHBpZCI6IjgyYTIzMzFhLTExYjItNDY3MC1iMDYxLTg3YTg2MDgxMjhhNiIsImFwcGlkYWNyIjoiMSIsImlkcCI6Imh0dHBzOi8vc3RzLndpbmRvd3MubmV0L2I2NjIzMTNmLTE0ZmMtNDNhMi05YTdhLWQyZTI3ZjRmMzQ3OC8iLCJpZHR5cCI6ImFwcCIsIm9pZCI6IjM4NTRiYjA4LTNjMmMtNGI1Ny05NWZjLTI0ZTA3OGQzODY4NSIsInJoIjoiMC5BVndBUHpGaXR2d1Vva09hZXRMaWYwODBlQU1BQUFBQUFBQUF3QUFBQUFBQUFBQmNBQUEuIiwicm9sZXMiOlsiVGVhbVNldHRpbmdzLlJlYWRXcml0ZS5BbGwiLCJUZWFtTWVtYmVyLlJlYWQuQWxsIiwiR3JvdXAuUmVhZC5BbGwiLCJEaXJlY3RvcnkuUmVhZC5BbGwiLCJUZWFtLlJlYWRCYXNpYy5BbGwiLCJUZWFtU2V0dGluZ3MuUmVhZC5BbGwiLCJPcmdhbml6YXRpb24uUmVhZC5BbGwiLCJBdWRpdExvZy5SZWFkLkFsbCJdLCJzdWIiOiIzODU0YmIwOC0zYzJjLTRiNTctOTVmYy0yNGUwNzhkMzg2ODUiLCJ0ZW5hbnRfcmVnaW9uX3Njb3BlIjoiRVUiLCJ0aWQiOiJiNjYyMzEzZi0xNGZjLTQzYTItOWE3YS1kMmUyN2Y0ZjM0NzgiLCJ1dGkiOiI3RVkyWnVXV2JFYVF0T3piVVlwOUFBIiwidmVyIjoiMS4wIiwid2lkcyI6WyIwOTk3YTFkMC0wZDFkLTRhY2ItYjQwOC1kNWNhNzMxMjFlOTAiXSwieG1zX3RjZHQiOjEzMDI1NDMzMTB9.N9yvmkCedti2fzT44VfBkN7GvuCInrIgiMgNxdyZeAyxnbdZjEhxHmNdU6HLLHQ3J-GonpPdt28dKwYxgLcrSibGzSPVHddh6MDPYutSwfIxh2oRanxhgFOWVJADfbFoCxsRFDhKJNT39bsauIUiRNzGzbb6dvWuZQ8LrgWjZzjae2qxVxj9jvYgjXEypeYZgLvPOzJiBCuluAMH3TjPuS-CuglFK_edn4CS-ztCwM0hmDFD5BLNZqng5P2KqGTEgjkMKoyIJ8yTGBJpASfdqqEFqWzQwcQ9ese924qNC3hJR_5TWHp2Fl73bpdhwBHRL5UwGTPi9_ysYdndKhXwgA
Access tokens issued by Entra ID comply with the OAuth 2.0 bearer token standard (RFC 6750) and are structured as JSON Web Tokens. We can't read the JSON content directly because it is base64Url encoded and signed. However, if you paste the token into a site like https://jwt.ms/, the site decodes the list of claims included in the token, and we see something like the details shown below for the access token featured above:
{
  "typ": "JWT",
  "nonce": "gq3zmJhybfXGDGqt6RO2PX9s0cimmRpSRrTO90sQ4w4",
  "alg": "RS256",
  "x5t": "Mr5-AUibfBii7Nd1jBebaxboXW0",
  "kid": "Mr5-AUibfBii7Nd1jBebaxboXW0"
}.{
  "aud": "https://graph.microsoft.com",
  "iss": "https://sts.windows.net/a662313f-14fc-43a2-9a7a-d2e27f4f3478/",
  "iat": 1644833772,
  "nbf": 1644833772,
  "exp": 1644837672,
  "aio": "E2ZgYJif1+eocevtzqRIrgDGA2V3AQ==",
  "app_displayname": "ReportDLs",
  "appid": "76c31534-ca1f-4d46-959a-6159fcb2f77a",
  "appidacr": "1",
  "idp": "https://sts.windows.net/a662313f-14fc-43a2-9a7a-d2e27f4f3478/",
  "idtyp": "app",
  "oid": "4449ce36-3d83-46fb-9045-2d1721e8f032",
  "rh": "0.AVwAPzFitvwUokOaetLif080eAMAAAAAAAAAwAAAAAAAAABcAAA.",
  "roles": [
    "Group.Read.All",
    "Directory.Read.All",
    "User.Read.All"
  ],
  "sub": "4449ce36-3d83-46fb-9045-2d1721e8f032",
  "tenant_region_scope": "EU",
  "tid": "a662313f-14fc-43a2-9a7a-d2e27f4f3478",
  "uti": "BU1RVc7mHkmBq2FMcZdTAA",
  "ver": "1.0",
  "wids": [
    "0997a1d0-0d1d-4acb-b408-d5ca73121e90"
  ],
  "xms_tcdt": 1302543310
}.[Signature]
The decoded token divides into three parts: header, payload, and signature. The aim of a token is not to hide information, so the payload is not encrypted. Instead, the issuer signs the token using a private key. Details of the algorithm and key used to sign an access token are in its header. An application can validate the signature of an access token if necessary, but this is not usually done when running a PowerShell script. The payload holds the claims made by the token and is the most interesting place to check.
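Decoding the payload needs no special tooling: split the token on the dots and base64url-decode the middle part, re-adding the padding characters that the encoding strips. A minimal Python sketch, using a toy token so no real credentials appear:

```python
import base64
import json

def jwt_payload(token: str) -> dict:
    """Return the claims in a JWT's payload (no signature validation performed)."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)   # restore the stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Build a toy token in header.payload.signature form for demonstration
header = base64.urlsafe_b64encode(b'{"alg":"RS256","typ":"JWT"}').rstrip(b"=").decode()
payload = base64.urlsafe_b64encode(b'{"aud":"https://graph.microsoft.com"}').rstrip(b"=").decode()
token = f"{header}.{payload}.signature"

print(jwt_payload(token)["aud"])  # https://graph.microsoft.com
```

Because no signature check happens here, treat this only as an inspection aid, never as proof that a token is genuine.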
Another way to check what’s in an access token is to use the JWTDetails PowerShell module, which is available in the PowerShell Gallery. To install this (very small) module, run:
Install-Module -Name JWTDetails -RequiredVersion 1.0.0 -Scope AllUsers
Afterward, you can examine a token with the Get-JWTDetails cmdlet. Here’s an example revealing that the access token issued to an app allows it to access Exchange Online using the IMAP4 or POP3 protocols:
Get-JWTDetails -Token $Token

aud             : https://outlook.office.com
iss             : https://sts.windows.net/b662313f-14fc-43a2-9a7a-d2e27f4f3478/
iat             : 1671891468
nbf             : 1671891468
exp             : 1671895368
aio             : E2ZgYDAQS/prW6b0Zsah6KMXtnTEAQA=
app_displayname : POP3 and IMAP4 OAuth 2.0 Authorization
appid           : 6a90af02-6ac1-405a-85e6-fb6ede844d92
appidacr        : 1
idp             : https://sts.windows.net/a662313f-14fc-43a2-9a7a-d2e27f4f3478/
oid             : b7483867-51b6-4fdf-8882-0c43aede8dd5
rh              : 0.AVwAPzFitvwUokOaetLif080eAIAAAAAAPEPzgAAAAAAAABcAAA.
roles           : {POP.AccessAsApp, IMAP.AccessAsApp}
sid             : 1475d8e7-2671-47e9-b538-0ea7b1d43d0c
sub             : b7483867-51b6-4fdf-8882-0c43aede8dd5
tid             : a662313f-14fc-43a2-9a7a-d2e27f4f3478
uti             : COCw22GGpESVXvfdhmEVAQ
ver             : 1.0
wids            : {0997a1d0-0d1d-4acb-b408-d5ca73121e90}
sig             : PdScMpYqwA25qJL1z8q589sz/Ma5CGQ4ea9Bi0lnO2yByrIs530emYPnFPfQNN9EPBIvv4EaAoTLomrw4RMBWYoQSAgkBUXVrYGnCjzAU6a2ZNZgo7+AORHk4iyLO0FpbLEaMJvCvI5vWhP9PHOxnGLcIsCbOmyrCK6lxxIKtBx851EpLrhpyvJ3p05NSw0D/mKzXPRKtcrzQcUwECxOUugbm1zdq8JaE/PmSggBb87VZy7p1S2BXhxQZ5QU17JeIADyhCGm1Ml+avuIHsVS2iat/LPEi/nktbrXMcOzROpUKyZ/7uVhxQ0cscJ6WGxbd+zJm36s25Yp1vMzSHaRxQ==
expiryDateTime  : 24/10/2022 15:22:48
timeToExpiry    : 00:59:34.7611307
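The iat (issued at) and exp (expiry) claims are Unix timestamps, so the token's lifetime falls straight out of the arithmetic. Using the values from the output above:

```python
# Claims from the Get-JWTDetails output above (Unix timestamps in seconds)
iat = 1671891468   # issued at
exp = 1671895368   # expiry

lifetime_minutes = (exp - iat) / 60
print(lifetime_minutes)  # 65.0
```

A lifetime in the 60-90 minute range is typical for an Entra ID access token, which is why long-running scripts need to renew their token before it expires.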
The list of claims in the access token includes simple claims and scopes (groups of claims). A claim is an assertion about something related to the token. In this case, the claims tell us details like:
Get-MgServicePrincipal -Filter "Id eq '4449ce36-3d83-46fb-9045-2d1721e8f032'"

DisplayName Id                                   AppId                                SignInAudience ServicePrincipalType
----------- --                                   -----                                -------------- --------------------
ReportDLs   4449ce36-3d83-46fb-9045-2d1721e8f032 77c31534-ca1f-4d46-959a-6159fcb2f77a AzureADMyOrg   Application
Scopes are a logical grouping of claims, and they can serve as a mechanism to limit access to resources. The roles claim contains a scope of Graph API permissions starting with Group.Read.All and ending with User.Read.All. We therefore know that this app has consent from the organization to use the permissions stated in the scope when it executes Graph API queries. The list of permissions is enough to allow the PowerShell script (in this case, one to generate a report of distribution list memberships) to query the Graph for a list of all groups and read the membership of each group.
From bitter experience, I know how easy it is to get Graph permissions wrong. One way to check is to sign into the Graph Explorer and run the query (here’s an example) to see what permissions the Explorer uses to execute it. However, you can also dump the access token to check that the set of permissions in the access token matches what you expect. It’s possible that you requested some application permissions for the app but failed to gain administrator consent for the request, meaning that the access token issued to the app by Entra ID won’t include the requested permissions.
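To automate that check, you can compare the roles claim against the permissions a script expects. A minimal sketch (the function name and the required list are illustrative, not part of any module):

```powershell
# Sketch: return any required permissions missing from an access token's roles claim.
function Test-TokenRoles {
    param(
        [string[]]$TokenRoles,      # e.g. (Get-JWTDetails -Token $Token).roles
        [string[]]$RequiredScopes   # permissions the script expects to use
    )
    # Anything left over here lacks consent and will cause Graph queries to fail
    $RequiredScopes | Where-Object { $_ -notin $TokenRoles }
}
```

If Test-TokenRoles -TokenRoles (Get-JWTDetails -Token $Token).roles -RequiredScopes @('Group.Read.All','User.Read.All') returns anything, the app is missing consent for those permissions.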
Once we’re happy that we have a good access token, we can use it with Graph queries. Here’s how to fetch the list of distribution groups in a tenant. The access token is included in the $Headers variable passed to the Invoke-RestMethod cmdlet.
$Headers = @{Authorization = "Bearer $token"; ConsistencyLevel = "eventual"}
# The 'not' operator is an advanced query, so the request needs the ConsistencyLevel
# header together with $count=true
$Uri = "https://graph.microsoft.com/V1.0/groups?`$filter=mailEnabled eq true and not groupTypes/any(c:c+eq+'Unified')&`$count=true"
[array]$DLs = (Invoke-RestMethod -Uri $Uri -Headers $Headers -Method Get -ContentType "application/json")
$DLs = $DLs.Value
And if everything goes to plan, we should have a set of distribution lists to process. If not, the problem almost certainly lies with the access token, so it’s time to return to square one and restart the acquisition process.
Learn more about how Office 365 really works on an ongoing basis by subscribing to the Office 365 for IT Pros eBook. Our monthly updates keep subscribers informed about what’s important across the Office 365 ecosystem.
Updated 24 April 2023
A reader question about the Microsoft 365 Groups expiration policy caused me to review some PowerShell code I wrote to report the next renewal dates for the set of groups within the scope of the expiration policy. The question concerned Yammer (now Viva Engage): does the Microsoft 365 Groups expiration policy cover the groups used by Yammer communities and remove inactive groups when necessary? The answer is yes; Microsoft updated policy processing last year to accommodate the Microsoft 365 groups used by Yammer communities when networks run in Microsoft 365 native mode. Microsoft confirmed coverage of Yammer communities by the groups expiration policy in MC324202 (published today). Microsoft 365 roadmap item 82186 also covers the scenario and says that general availability occurred in January 2022.
In 2020, Microsoft changed the way the Microsoft 365 Groups expiration policy works to introduce automatic renewal. Instead of bothering group owners with email to remind them to renew the group, a background job looks for evidence that the group is active. If the evidence exists, Microsoft 365 renews the group automatically. Unfortunately, a limited set of signals govern renewal:
It’s debatable if a group is active if just one group member visits a Teams channel or views a post in a Yammer community. The Microsoft Graph gathers a wide array of signals about user activity and there’s surely a more precise method to determine group activity than the actions cited above. The Groups and Teams activity report is an example of how to code your own assessment of group activity.
In any case, the answer remains that you can add the Microsoft 365 groups used by Viva Engage communities to the Groups expiration policy through the Microsoft Entra admin center (Figure 1).
You can also add groups to the policy with PowerShell.
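For instance, with the Azure AD module, something like the following adds a group to the policy (the community name is an assumption; substitute your own). Adding individual groups only matters when the policy applies to selected groups rather than to all Microsoft 365 groups:

```powershell
# Sketch: add a Viva Engage community's group to the Groups expiration policy.
# A tenant has a single group lifecycle (expiration) policy.
$Policy = Get-AzureADMSGroupLifecyclePolicy
$Group = Get-AzureADMSGroup -SearchString "Engineering Community"
Add-AzureADMSLifecyclePolicyGroup -Id $Policy.Id -GroupId $Group.Id
```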
The Groups expiration policy is applicable to selected groups or to all Microsoft 365 groups in the tenant. Users who are members of the groups covered by the policy must have Azure AD Premium P1 licenses.
Writing code to report the expiration dates for groups isn’t difficult. The date when the group needs to be next renewed is in the ExpirationTime property. The only complication is to find when the group was last renewed. This data isn’t returned by the Get-UnifiedGroup cmdlet, so we need to use the Get-AzureADMSGroup cmdlet. Once we know where to get the dates, we can report what we find. This code runs after connecting to the Exchange Online and Azure AD PowerShell modules.
Write-Host "Finding Microsoft 365 Groups to check…"
[array]$ExpirationPolicyGroups = (Get-UnifiedGroup -ResultSize Unlimited | ? {$_.ExpirationTime -ne $Null} | Select DisplayName, ExternalDirectoryObjectId, WhenCreated, ExpirationTime)
If (!($ExpirationPolicyGroups)) { Write-Host "No groups found subject to the expiration policy - exiting" ; break }
Write-Host $ExpirationPolicyGroups.Count "groups found. Now checking expiration status."
$Report = [System.Collections.Generic.List[Object]]::new(); $Today = (Get-Date)
ForEach ($G in $ExpirationPolicyGroups) {
   $Days = (New-TimeSpan -Start $G.WhenCreated -End $Today).Days # Age of group
   $LastRenewed = (Get-AzureADMSGroup -Id $G.ExternalDirectoryObjectId).RenewedDateTime
   $DaysLeft = (New-TimeSpan -Start $Today -End $G.ExpirationTime).Days
   $ReportLine = [PSCustomObject]@{
      Group       = $G.DisplayName
      Created     = Get-Date($G.WhenCreated) -format g
      AgeinDays   = $Days
      LastRenewed = Get-Date($LastRenewed) -format g
      NextRenewal = Get-Date($G.ExpirationTime) -format g
      DaysLeft    = $DaysLeft }
   $Report.Add($ReportLine)
} # End Foreach
CLS; Write-Host "Total Microsoft 365 Groups covered by expiration policy:" $ExpirationPolicyGroups.Count
Write-Host ""
$Report | Sort DaysLeft | Select Group, @{n="Last Renewed"; e={$_.LastRenewed}}, @{n="Next Renewal Due"; e={$_.NextRenewal}}, @{n="Days before Expiration"; e={$_.DaysLeft}}

Total Microsoft 365 Groups covered by expiration policy: 74

Group                     Last Renewed     Next Renewal Due Days before Expiration
-----                     ------------     ---------------- ----------------------
Potholers (Team)          02/02/2020 07:16 21/02/2022 07:16                     13
Office 365 Questions      19/05/2017 11:12 14/03/2022 15:04                     34
Corona Virus News         10/03/2020 21:56 30/03/2022 22:56                     51
Contract Workers          12/03/2020 08:57 01/04/2022 09:57                     52
Plastic Production (Team) 25/03/2020 08:48 14/04/2022 09:48                     65
The code works, with two caveats:
The solution for both speed and supportability is to use a Microsoft Graph API query to fetch group details.
Essentially, what we need to do is to replace the call to Get-UnifiedGroup with a Graph API query to return the set of groups in the tenant. The bonus is that the query returns the last renewed time, so there’s no need to use Get-AzureADMSGroup.
As with any script that calls Graph queries from PowerShell, you need a registered application in Azure AD to hold the permissions required to run the queries used by the script. In this case, we only need the Group.Read.All permission. After securing an access token, we can fetch the set of groups in the tenant using a lambda filter. The code shown below uses a function (Get-GraphData) to execute the Invoke-RestMethod cmdlet to fetch the data and page until all groups are retrieved. You can see the code for the Get-GraphData function in this script.
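The linked script contains the actual Get-GraphData function; in rough outline, such a helper fetches each page of results and follows the @odata.nextLink property until no link remains. This is a sketch of the idea, not the script's exact code:

```powershell
# Sketch of a Graph paging helper: fetch each page and follow @odata.nextLink.
function Get-GraphData {
    param(
        [Parameter(Mandatory)][string]$AccessToken,
        [Parameter(Mandatory)][string]$Uri
    )
    $Headers = @{ Authorization = "Bearer $AccessToken" }
    $Results = @()
    Do {
        $Response = Invoke-RestMethod -Uri $Uri -Headers $Headers -Method Get -ContentType "application/json"
        $Results += $Response.Value
        $Uri = $Response.'@odata.nextLink'   # $null once the last page is fetched
    } While ($Uri)
    $Results
}
```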
After fetching the set of groups, we create a report detailing the group name, its creation date, the date last renewed, and expiration date. The code used to process the data returned by Get-UnifiedGroup is modified to deal with the property names returned by the Graph query.
$Uri = "https://graph.microsoft.com/beta/groups?`$filter=ExpirationDateTime ge 2014-01-01T00:00:00Z AND groupTypes/any(a:a eq 'unified')&`$count=true"
[array]$Groups = Get-GraphData -AccessToken $Token -Uri $Uri
If (!($Groups)) { Write-Host "No groups found subject to the expiration policy - exiting" ; break }
$Report = [System.Collections.Generic.List[Object]]::new(); $Today = (Get-Date)
ForEach ($G in $Groups) {
   $Days = (New-TimeSpan -Start $G.CreatedDateTime -End $Today).Days # Age of group
   $DaysLeft = (New-TimeSpan -Start $Today -End $G.ExpirationDateTime).Days
   $ReportLine = [PSCustomObject]@{
      Group                    = $G.DisplayName
      Created                  = Get-Date($G.CreatedDateTime) -format g
      "Age in days"            = $Days
      "Last renewed"           = Get-Date($G.RenewedDateTime) -format g
      "Next renewal"           = Get-Date($G.ExpirationDateTime) -format g
      "Days before expiration" = $DaysLeft }
   $Report.Add($ReportLine)
} # End Foreach
CLS; Write-Host "Total Microsoft 365 Groups covered by expiration policy:" $Groups.Count
Write-Host ""
$Report | Sort "Days before expiration" | Select Group, "Last renewed", "Next renewal", "Days before expiration" | Out-GridView
As you’d expect, things run much faster. Retrieving data through a Graph query is always quicker than using a PowerShell cmdlet and eliminating the call to Get-AzureADMSGroup for each group helps speed things up even further. Figure 2 shows the output.
An even easier solution is to replace the calls to Get-UnifiedGroup and Get-AzureADMSGroup with the Get-MgGroup cmdlet from the Microsoft Graph PowerShell SDK. Here’s the code, which is almost as fast as using the Graph API:
Write-Host "Finding Microsoft 365 Groups to check…"
[array]$ExpirationPolicyGroups = Get-MgGroup -Filter "groupTypes/any(c:c eq 'unified')" -All | ? {$_.ExpirationDateTime -ne $Null}
If (!($ExpirationPolicyGroups)) { Write-Host "No groups found subject to the expiration policy - exiting" ; break }
Write-Host $ExpirationPolicyGroups.Count "groups found. Now checking expiration status."
$Report = [System.Collections.Generic.List[Object]]::new(); $Today = (Get-Date)
ForEach ($G in $ExpirationPolicyGroups) {
   $Days = (New-TimeSpan -Start $G.CreatedDateTime -End $Today).Days # Age of group
   $DaysLeft = (New-TimeSpan -Start $Today -End $G.ExpirationDateTime).Days
   $ReportLine = [PSCustomObject]@{
      Group       = $G.DisplayName
      Created     = Get-Date($G.CreatedDateTime) -format g
      AgeinDays   = $Days
      LastRenewed = Get-Date($G.RenewedDateTime) -format g
      NextRenewal = Get-Date($G.ExpirationDateTime) -format g
      DaysLeft    = $DaysLeft }
   $Report.Add($ReportLine)
} # End Foreach
CLS; Write-Host "Total Microsoft 365 Groups covered by expiration policy:" $ExpirationPolicyGroups.Count
Write-Host ""
$Report | Sort DaysLeft | Select Group, @{n="Last Renewed"; e={$_.LastRenewed}}, @{n="Next Renewal Due"; e={$_.NextRenewal}}, @{n="Days before Expiration"; e={$_.DaysLeft}}
The question about Yammer communities forced me to look at code and find an instance where I needed to replace a cmdlet before its deprecation next June. At the same time, I managed to speed up the code by introducing a Graph query. Things worked out for the best, but it does illustrate the need to check and update old scripts on an ongoing basis.
Learn how to exploit the data available to Microsoft 365 tenant administrators through the Office 365 for IT Pros eBook. We love figuring out how things work.
Last year, I wrote about the need to review and clean up Entra ID integrated applications. That article describes how to extract information from Entra ID to a CSV file and use the CSV to create a Microsoft List. To make it easy to access the list, we create a channel tab in Teams. Everything works to identify suspect apps that might need removal. I think you should perform such a review periodically. It just makes sense.
Another way to monitor potentially suspicious app activity is to review sign-in data for service principals. The intention is to identify unrecognized service principals signing into the tenant and figure out what apps are involved. Sign-ins can originate from well-known service principals used by Microsoft apps, third-party apps, or the service principals automatically created by Entra ID when tenants register apps to interact with the Graph (for instance, to authenticate calls made to Graph APIs in PowerShell scripts). Sign-in data for service principals is available through the Entra admin center (Figure 1) and now it’s accessible using the Microsoft Graph List SignIns API.
The reason why this update is important is that access to sign-in data via the Graph makes it possible to download the information for analysis or store it for long-term retention in an external repository. Although you can download sign-in data as a CSV file from the Entra admin center, it’s more flexible to access the information via Graph queries, especially when you want to probe the activity patterns of certain service principals.
Any application that wants to interact with the Graph requires consent for permissions to access data. In this instance, consent is needed for the Directory.Read.All and AuditLog.Read.All application permissions. Delegated permissions can also be used, in which case the account used must hold an administrative role capable of accessing the Entra ID sign-in logs.
A suitably permissioned application can issue queries against the SignIns API. To fetch service principal sign-in data, the query executed by the application must use a lambda qualifier to filter data. Apart from setting a date range to search for sign-in data, the important point is to filter against the signInEventTypes property to select sign-in events for service principals. Here’s an example of a query to fetch sign-in data between 17:30 and 22:30 on 19 January.
https://graph.microsoft.com/beta/auditLogs/signIns?&$filter=createdDateTime ge 2022-01-19T17:30:00Z and createdDateTime le 2022-01-19T22:30:00Z and signInEventTypes/any(x:x eq 'servicePrincipal')
To test the query (or one which suits your purposes), use the Graph Explorer to see what the query returns.
I wrote a simple PowerShell script (downloadable from GitHub) to fetch service principal sign-in data for the last seven days. A quick summary of the data revealed that many sign-ins came from an app named Office 365 Reports. Curiously, an app used by a PowerShell script that I had posted on GitHub also showed up with 22 sign-ins. The Information Barrier Processor is the app used by Microsoft 365 to check user accounts against information barrier policies to ensure that no one is communicating with anyone when they shouldn’t.
$Report | Group SpName | Sort Count -Descending | Select Name, Count

Name                                         Count
----                                         -----
Office 365 Reports                             369
Graph Microsoft 365 Groups Membership Report    22
Information Barrier Processor                   21
Security and Audit                               5
PS-Graph                                         1
Resolving the large set of sign-ins was easy. The data stored in the list (Figure 2) revealed the service principal to belong to an Office 365 Reporting app originally published by Cogmotive (acquired by Quadrotech and then by Quest Software). I haven’t used the app in years, but the sign-ins kept on coming.
Over time, it’s easy to accumulate crud in the form of service principals installed for one reason or another. Testing an ISV product is a classic example, which is a good reason to always conduct tests in a test tenant instead of the production tenant. Or if you stop using an app, remember to clean up by removing service principals and other app debris that the app vendor might leave behind.
The sign-ins for the app used by the PowerShell script probably exist because I shared a copy of the script with my tenant identifier, the app identifier, and the app secret in place. I quickly replaced the script with a copy containing obfuscated credentials, but failed to change the app secret, meaning that anyone with an original copy could run the code. Now alerted, I removed the app secret. My suspicions were confirmed when a batch of failed sign-ins subsequently occurred for the app. This goes to prove how easy it is to create a potential compromise if you’re not careful.
You can clean up unwanted service principals with either the Entra admin center or PowerShell. I always have a PowerShell session open, so I chose that route. In this example, we find the object identifier for a service principal using its display name as a filter for the Get-MgServicePrincipal cmdlet. When sure that this is the right service principal to remove, we use the object identifier to remove the service principal with the Remove-MgServicePrincipal cmdlet.
$SP = Get-MgServicePrincipal -Filter "displayName eq 'Office 365 Reports'"
$SP

DisplayName        Id                                   AppId                                SignInAudience
-----------        --                                   -----                                --------------
Office 365 Reports 9ac957ae-160b-48d3-9a6f-f4c27acca040 507bc9da-c4e2-40cb-96a7-ac90df92685c AzureADMultipleOrgs

Remove-MgServicePrincipal -ServicePrincipalId $SP.Id
A list of service principals known to the tenant is a valuable input to a review for unwanted or unnecessary apps holding some form of consent (permissions) to organization data. Adding context to the data by knowing which service principals are actively signing into the tenant makes it easier to prioritize action. The data is there, it’s available, and it’s up to you to decide what to do with it.
Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
Vasil Michev, the Technical Editor of the Office 365 for IT Pros eBook, comes up with all sorts of weird and wonderful insights into Microsoft 365. A recent question he discussed on his blog was how to find the creation date for a tenant. It’s a good question because it forces respondents to know where to look for this information and is exactly the kind of poser we like to tease out as we write content for the book.
As Vasil points out, the obvious answer is to fire up the Teams admin center because the tenant creation date appears on a card displayed on its home screen (Figure 1). The Teams admin center is the only Microsoft 365 portal which shows this information. Why the Teams developers thought that it was useful to highlight the tenant creation date is unknown. After all, the date won’t change over time and static information is not usually featured by workload dashboards.
Opening an administrative portal is no challenge. Vasil suggests several alternate methods to retrieve the tenant creation date. It seemed like fun to try some of these methods against my tenant. Here’s what I found.
If you’ve used Exchange Online from the start, you can check the creation date of the Exchange organization configuration object, created when an administrator enables Exchange Online for the first time.
(Get-OrganizationConfig).WhenCreated

Monday 27 January 2014 20:28:45
It’s an interesting result. Exchange Online reports its initiation in January 2014 while Teams is quite sure that the tenant existed in April 2011. I’ve used Exchange Online for email ever since I had a tenant, so the disconnect between Exchange Online and the tenant creation date is interesting.
Another way of checking Exchange data is to look at the creation dates for mailboxes. This PowerShell snippet finds all user mailboxes and sorts them by creation date. The first mailbox in the sorted array is the oldest, so we can report its creation date:
[array]$Mbx = Get-ExoMailbox -ResultSize Unlimited -Properties WhenCreated -RecipientTypeDetails UserMailbox | Sort {$_.WhenCreated -as [datetime]}
Write-Host ("The oldest mailbox found in this tenant is {0} created on {1}" -f $Mbx[0].DisplayName, $Mbx[0].WhenCreated)

The oldest mailbox found in this tenant is Tony Redmond created on 27/01/2014 20:36:38
(Dates shown are in Ireland local format. The equivalent U.S. format date is 01/27/2014).
Grabbing all mailboxes to check their creation dates is not a fast operation. Even using the REST-based Get-ExoMailbox cmdlet from the Exchange Online management module, it takes time to retrieve all the user mailboxes in even a medium-sized tenant.
As it turns out, the oldest mailbox is my own, created about eight minutes after the initiation of Exchange Online. However, we’re still in 2014 when the tenant proclaims its creation in 2011, so what happened?
A search through old notes revealed that Microsoft upgraded my original Office 365 tenant created in 2011 to an enterprise version in 2014. It seems that during the tenant upgrade, Microsoft recreated the instance of Exchange Online. That explanation seems plausible.
Another method is to examine the creation dates of administrator accounts to find the oldest account. This is usually the administrator account created during tenant setup. In other words, when you create a new tenant, you’re asked to provide the name for an account which becomes the first global administrator. If we look at the administrator accounts in the tenant and find the oldest, it should be close to the tenant creation date shown in the Teams admin center. That is, unless someone deleted the original administrator account.
Azure AD is the directory of record for every Microsoft 365 tenant, so we should check Azure AD for this information. The steps are:
Here’s the code I used:
# Find the identifier for the Azure AD Global Administrator role
$TenantAdminRole = Get-AzureADDirectoryRole | Where-Object {$_.DisplayName -eq 'Global Administrator'} | Select ObjectId
# Get the set of accounts holding the global admin role. We omit the account used by
# the Microsoft Rights Management Service
$TenantAdmins = Get-AzureADDirectoryRoleMember -ObjectId $TenantAdminRole.ObjectId | ? {$_.ObjectId -ne "25cbf210-02e5-4a82-9f5c-f41befd2681a"} | Select-Object ObjectId, UserPrincipalName
# Get the creation date for each of the accounts
$TenantAdmins | ForEach-Object { $_ | Add-Member -MemberType NoteProperty -Name "Creation Date" -Value (Get-AzureADUserExtension -ObjectId $_.ObjectId).Get_Item("createdDateTime") }
# Find the oldest account
$FirstAdmin = ($TenantAdmins | Sort-Object {$_."Creation Date" -as [datetime]} | Select -First 1)
Write-Host ("First administrative account created on {0}" -f $FirstAdmin."Creation Date")
The older Microsoft Online (MSOnline) PowerShell module doesn’t require such a complicated approach to retrieve account creation data. Taking the code shown above and replacing the Get-AzureADUserExtension cmdlet with Get-MsolUser, we get:
$TenantAdmins | ForEach-Object { $_ | Add-Member -MemberType NoteProperty -Name "Creation Date" -Value ((Get-MsolUser -ObjectId $_.ObjectId).WhenCreated) }
Using either cmdlet, the result is:
First administrative account created on 11/04/2011 17:35:11
The Teams admin center also reports April 11, 2011, so using administrator accounts might be a viable way to determine tenant age.
Microsoft 365 stores information for each tenant in the Microsoft Graph, and it’s the Graph which is the source for the Teams admin center. We can retrieve the same information by running the https://graph.microsoft.com/V1.0/organization Graph query. The createdDateTime property returned in the organization settings is what we need.
Here’s the PowerShell code to run after obtaining the necessary access token for a registered app, which must have consent to use the Organization.Read.All Graph permission. Vasil used the beta endpoint when he showed how to fetch tenant organization settings using the Graph Explorer (which saves the need to write any code), but the V1.0 endpoint works too.
$Uri = "https://graph.microsoft.com/V1.0/organization"
$OrgData = Invoke-RestMethod -Method GET -Uri $Uri -ContentType "application/json" -Headers $Headers
If ($OrgData) {
   Write-Host ("The {0} tenant was created on {1}" -f $OrgData.Value.DisplayName, (Get-Date($OrgData.Value.createdDateTime) -format g))
}

The Redmond & Associates tenant was created on 11/04/2011 18:35
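If the Microsoft Graph PowerShell SDK is available, the same check works without composing a REST call. A sketch, assuming a Connect-MgGraph session with consent for the Organization.Read.All scope:

```powershell
# Sketch: read the tenant creation date through the Graph PowerShell SDK.
Connect-MgGraph -Scopes 'Organization.Read.All'
$Org = Get-MgOrganization
Write-Host ("The {0} tenant was created on {1}" -f $Org.DisplayName, (Get-Date $Org.CreatedDateTime -Format g))
```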
The first administrator account appears to date from 17:35 while the tenant creation time is an hour later. This is easily explained because all dates stored in the Graph are in UTC whereas the dates extracted from Azure AD and reported by PowerShell reflect local time. In April 2011, local time in Ireland was an hour ahead of UTC.
After all the checks, it’s clear that I created my tenant in the early evening of April 11, 2011. Given that this was ahead of Microsoft’s formal launch of Office 365 in July 2011, I can claim to use an old tenant, for what that’s worth.
The fastest way to fetch the list of Teams in a Microsoft 365 tenant programmatically is to use the Graph API. PowerShell is fast enough in small tenants, but once there’s more than a couple of hundred teams (groups) to process, the Graph is usually a better choice. Unless you’ve got time to wait, of course.
I cover the topic in an article explaining how to fetch a list of Teams using the Groups endpoint. Because the Groups API returns all types of groups, you apply a filter to find the set of Microsoft 365 Groups which are team-enabled. For PowerShell, the commands needed to execute the Graph API call are:
$Uri = "https://graph.microsoft.com/V1.0/groups?`$filter=resourceProvisioningOptions/Any(x:x eq 'Team')"
[array]$Teams = Invoke-WebRequest -Method GET -Uri $Uri -ContentType "application/json" -Headers $Headers | ConvertFrom-Json
The filter used with the query is a Lambda operator. In this case, it requests the Graph to return any Groups it finds where the provisioning option is set to “Team.” Using a filter to find Teams is a well-known technique exploited by developers: we use it in the Graph-based version of the Teams and Groups Activity Report script.
In December, Microsoft published a new List Teams API in the beta version of the Graph. The API accesses the Teams endpoint to fetch a list of teams without using a filter and works with both delegated and application permissions. Apps need consent for one of the Team.ReadBasic.All, TeamSettings.Read.All, or TeamSettings.ReadWrite.All permissions to use the API. You can apply filters to the List Teams API to find specific teams if you don’t want the full set. The equivalent code to fetch all the teams in the tenant is:
$Uri = "https://graph.microsoft.com/beta/teams"
[array]$Teams = Invoke-WebRequest -Method GET -Uri $Uri -ContentType "application/json" -Headers $Headers | ConvertFrom-Json
Remember that the Graph uses paging to return data, so code needs to be prepared to process multiple pages of data to acquire the full set of teams.
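A sketch of such a paging loop, reusing the $Headers variable from the examples above:

```powershell
# Sketch: follow @odata.nextLink to gather every page of teams.
$Uri = "https://graph.microsoft.com/beta/teams"
[array]$AllTeams = @()
Do {
    $Response = Invoke-RestMethod -Method GET -Uri $Uri -ContentType "application/json" -Headers $Headers
    $AllTeams += $Response.Value
    $Uri = $Response.'@odata.nextLink'   # absent ($null) on the final page
} While ($Uri)
Write-Host ("Fetched {0} teams" -f $AllTeams.Count)
```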
In both cases, the set of Teams returned by the query is in an array called Value. In other words, to see the details of individual teams, you access the array using $Teams.Value. For example, the first team is available using $Teams.Value[0], the second with $Teams.Value[1], and so on. The count of teams returned in the array is available with $Teams.Value.Count. In the case of the List Teams API, it’s also available as $Teams.’@odata.count’.
The List Teams API currently returns just three properties for a team. Here’s what you get for a team:
$Teams.Value[0]

id                          : 109ae7e9-1f94-48d1-9972-64abab87b89a
createdDateTime             :
displayName                 : French Customers
description                 : Delivering great service to French customers
internalId                  :
classification              :
specialization              :
visibility                  :
webUrl                      :
isArchived                  :
isMembershipLimitedToOwners :
memberSettings              :
guestSettings               :
messagingSettings           :
funSettings                 :
discoverySettings           :
To get the other team properties (for instance, the archive status for a team), you must query the properties of an individual team. This isn’t difficult, and the same downside exists if you use the Groups endpoint to fetch a list of Teams. That endpoint returns many properties for a group, but not the teams properties.
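A sketch of fetching the full properties for one team, again assuming the $Headers variable holds a valid bearer token:

```powershell
# Sketch: the list call returns basic properties; request a single team for the full
# set, including the archive status.
$TeamId = $Teams.Value[0].id
$Uri = "https://graph.microsoft.com/beta/teams/$TeamId"
$Team = Invoke-RestMethod -Method GET -Uri $Uri -ContentType "application/json" -Headers $Headers
Write-Host ("{0} archived: {1}" -f $Team.displayName, $Team.isArchived)
```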
The Graph Explorer is a good way to try out the new API. Make sure that you sign into your tenant and have consent to use at least the Team.ReadBasic.All permission. Then input a query and see what happens. Figure 1 shows the result of running the query https://graph.microsoft.com/beta/teams?$filter=startswith(displayName, 'Office 365') to return the set of teams whose display name starts with Office 365. As noted above, along with the properties of the individual teams, the query also returns the count in @odata.count.
If your programs or scripts often need to retrieve a list of teams for processing, you should keep an eye on the List Teams API to track its development. For now, there’s no need to update existing code. In time, it might be the case that the new API delivers better performance than going through the Groups endpoint or that the query returns all team properties without having to retrieve individual teams. Better performance and functionality are always welcome. Let’s see what happens as the new API makes its way from beta to production.
Learn how to exploit the Office 365 data available to tenant administrators through PowerShell and the Graph with the help of the many examples in the Office 365 for IT Pros eBook. We love figuring out how things work.
A reader asked if it’s possible to stop Teams displaying a system-generated message when someone joins a team. It’s a reasonable question. In the past, I have pointed out the dangers of adding someone to a group too early as people can then discover that a new employee is joining the company. Conversely, it’s not good when people learn about the departure of a valued colleague through an informational message posted in Teams to say that the person has left a team.
Things used to be worse. Before May 2020, Teams posted messages about members joining and leaving a team in the team’s General channel. The introduction of the channel information pane gave these system messages a new home. Unless people open the information pane, they don’t see messages about membership changes, new owners and channels, and other developments, so there’s a fair chance that the addition of a new employee to a team will go unnoticed.
To check whether any of the methods to add a new member avoid generating a system message in the information pane, I tested by adding a new member through:
I didn’t test using the Microsoft Graph API. The Add-TeamUser cmdlet is a wrapper around the Graph API call, so the results observed for that cmdlet are likely the same for a Graph call. System messages are retrievable using Graph API calls.
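For example, here's a hedged sketch of listing the system event messages in a channel with the Graph (the team and channel identifiers are placeholders, and the app needs consent for a permission such as ChannelMessage.Read.All):

```powershell
# Sketch: read channel messages and keep only the system event messages, which
# record events like membership changes. $TeamId and $ChannelId are placeholders,
# and $Headers is assumed to hold a valid bearer token.
$Uri = "https://graph.microsoft.com/v1.0/teams/$TeamId/channels/$ChannelId/messages"
$Messages = (Invoke-RestMethod -Method GET -Uri $Uri -ContentType "application/json" -Headers $Headers).Value
$Messages | Where-Object { $_.messageType -eq 'systemEventMessage' } |
    Select-Object createdDateTime, @{n='Event'; e={$_.eventDetail.'@odata.type'}}
```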
Azure AD is the directory of record. Add-AzureADGroupMember updates the Azure AD group object used by the team. Add-UnifiedGroupLinks updates the Azure AD group object and the group in the Exchange Online directory using a dual write. Add-TeamUser is like adding a new member through the client because the action updates both the team roster (to make the new team member immediately available) and Azure AD. Rosters (lists of members and owners) are how Teams organizes and manages membership.
Changes made to Azure AD or by other Microsoft 365 workloads synchronize with Teams through a background process called Microsoft Teams Aad Sync, introduced in 2020 to make the synchronization process between Teams and Azure AD more efficient and effective. Note that it can take several hours before a system message about a new member shows up. Apart from the need to run background synchronization, clients also need to refresh their cache.
In a nutshell, no matter how you add or remove a tenant or guest account, the change synchronizes back to Teams and the system message appears in the information pane (Figure 1).
Different system messages in the information pane appear depending on the method used to add an account. If you see that someone added a member (like “Tony Redmond has added Niamh Smith to the team”), it’s an indication that the action occurred through the Teams client, the Add-TeamUser cmdlet, or the Graph API (all of which execute the same code). On the other hand, if you see that someone joined the team, the source is Azure AD or Exchange Online PowerShell.
There’s no system or team setting to tweak to turn off system messages about member updates. Granular control would be best, but I guess Microsoft ignored me when I previously complained about the lack of control over system message publication, so I’ve submitted it again to the new Teams Feedback portal. Please vote there if you support the idea of having a team-level setting to control the publication of system messages.
In the interim, if you don’t want other users to discover that someone has joined a team, either wait until an appropriate time before adding them as a member or consider assigning a new display name to that person’s account until you’re ready to reveal their presence. For instance, I changed the name of a new user as follows:
Set-AzureADUser -ObjectId James.Baker@office365itpros.com -DisplayName "The Maestro of Office 365"
After waiting for a few hours to allow Teams to pick up details of the user account, I added them to a team. Sometime later, the information pane duly displays the system message for the addition (Figure 2):
This technique works if you want to pre-add new users to teams before they join the organization if you use suitably obscured display names, like UserAXXAD19948. Naturally, you should update their display name after they’re active in the organization. However, it’s not a great approach for people who already work there as other workloads pick up and use the changed display name.
The answer to the original question is that you can’t stop Teams posting system messages to inform team members about membership changes. No control is available at a system or individual team level, which is a pity. But life isn’t perfect, and this is a small detail in the overall scheme of things – unless you inadvertently reveal the name of a new employee before they join the company.
Learn how to exploit the Office 365 data available to tenant administrators through the Office 365 for IT Pros eBook. We love figuring out how things work.
]]>Microsoft is fond of equipping its administrative consoles with cards containing insights which administrators might action. Yesterday, I noticed that the SharePoint Online admin center highlighted that many sites in my tenant had no sensitivity label (Figure 1).
As you might recall, Microsoft 365 uses sensitivity labels to apply settings to “containers” (teams, groups, and sites). Controlling the external sharing capability of SharePoint Online sites is a good example of the power of this approach. By default, I assign sensitivity labels when creating new Microsoft 365 groups and teams, so it surprised me to discover the unlabeled state of so many sites.
Using the Manage unlabeled sites link, I examined the sites. Because I use sensitivity labels for the sites used for groups and teams, I expected to find that some sites in the tenant had no labels. These include:
Knowing that teams created using templates didn’t ask team owners to assign a sensitivity label until Microsoft fixed the problem in October 2021 (MC281936, Microsoft 365 roadmap item 84232), I could account for some other unlabeled sites. However, stripping all the explainable sites from the 126 noted by SharePoint still left a bunch that I couldn’t explain except by concluding that at some points in the past, the synchronization of sensitivity labels didn’t work as well as it should between SharePoint Online and the other workloads. This is an important thing to fix because if SharePoint Online doesn’t know about a sensitivity label assigned to a site, it can’t apply the management controls defined in that label.
For the record, the synchronization of sensitivity labels for new groups works well. This might be the vestige of a long-solved problem.
To address the problem, I decided to write some PowerShell. The first stage was to find all the sites created for teams and Microsoft 365 Groups that didn’t have a label. To do this, the code:
Here’s the code I used:
[array]$Sites = Get-SPOSite -Limit All -Template Group#0
If (!($Sites)) {
   Write-Error "No sites for Microsoft 365 Groups found... exiting!" ; break
} Else {
   Write-Host ("Processing {0} sites" -f $Sites.Count)
}
$SitesNoLabels = [System.Collections.Generic.List[Object]]::new()
ForEach ($Site in $Sites) { #Check each site to see if it has a sensitivity label
   $SiteData = Get-SPOSite -Identity $Site.Url
   If ([string]::IsNullOrWhiteSpace(($SiteData.SensitivityLabel)) -eq $True) {
      Write-Host ("Site {0} has no label" -f $SiteData.Url)
      $SiteInfo = [PSCustomObject][Ordered]@{
         URL   = $SiteData.Url
         Title = $SiteData.Title }
      $SitesNoLabels.Add($SiteInfo) }
} #End ForEach Sites
The properties of a Microsoft 365 group store the GUID of the sensitivity label, if one is assigned to the group/team. The next step is to retrieve the sensitivity label information for all groups. It’s possible to match a group with a site because the group properties include the site URL. I therefore:
Here’s the code for this segment:
Write-Host "Retrieving sensitivity label information for Microsoft 365 Groups"
[array]$Groups = Get-UnifiedGroup -ResultSize Unlimited
$Groups = $Groups | ? {$_.SharePointSiteUrl -ne $Null}
$GroupsTable = @{}
$Groups.ForEach( {
   $GroupsTable.Add([String]$_.SharePointSiteUrl, $_.SensitivityLabel) } )
We now have a list of sites without labels and a table with the labels assigned to the underlying groups. The next step is to check each site against the groups table to see if we can find what label the site should have. If we find a match, we can update the site. The next code segment does the following:
This code applies sensitivity labels to sites using the information from Microsoft 365 Groups:
[int]$Updates = 0; [int]$NoUpdates = 0
ForEach ($Site in $SitesNoLabels) {
   $Label = $Null
   $Label = $GroupsTable.Item($Site.Url)
   If ($Label) { # Update the site with the label we find
      Write-Host ("Updating site {0} with label {1}" -f $Site.Url, $Label.Guid)
      Set-SPOSite -Identity $Site.Url -SensitivityLabel $Label.Guid
      $Updates++ }
   Else {
      Write-Host ("Can't find sensitivity label for site {0} - group might be deleted" -f $Site.Url)
      $NoUpdates++ }
} #End ForEach Sites
The complete script is available from GitHub.
Of the 126 unlabeled sites reported by SharePoint Online, 116 were team sites. The technique described above managed to apply sensitivity labels to 103 sites. The remaining 13 are deleted sites kept by SharePoint Online because of a retention policy (the associated Microsoft 365 group is gone). The card displayed in the SharePoint Online admin center looks better (Figure 2) and all the sites belonging to Microsoft 365 groups and teams have their correct labels. All is well.
Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.
]]>It’s been a busy week for anyone following the Microsoft 365 ecosystem as Microsoft released a slew of blog posts and announcements to support keynotes and other sessions at the Microsoft Ignite Fall event. You could spend hours reading about new features and functionality and wonder when the code will appear in your Office 365 tenant and if any additional licenses are necessary.
This post captures notes about several features available now that I noticed as I perused Microsoft’s coverage. By themselves, each is not enough to warrant a separate post, but they’re interesting all the same. These changes are examples of the stuff we track to maintain the content of the Office 365 for IT Pros eBook. All our chapter authors have been busy this week.
Sharing links show who you’ve shared a document with. This feature was announced in June but seems to have taken its time to roll out. The idea is simple. When you send a new sharing link, SharePoint Online and OneDrive for Business tell you who the document is already shared with (Figure 1), including a thumbnail of each person (if available in Azure AD). You can hover over a thumbnail to see who the person is. The number of active sharing links also appears. It’s a small but useful change.
Easy to overlook, the SharePoint Online admin center now displays connected channel sites when a site used by Teams creates private channels (Figure 2). If you can’t remember which sites have private channel sites, connect to SharePoint Online PowerShell and run:
Get-SPOSite -Limit All -Template TeamChannel#0 | ? {$_.TeamsChannelType -eq "PrivateChannel"}
If you click the channel sites link, the admin center displays details of those sites. Teams manages the settings for these sites, but it’s nice to be able to have easy access to the information. Shared channels, which are delayed until early 2022, also use channel sites.
OneDrive for Business supports Known Folder Move (KFM) and Files on Demand on macOS, which is nice if you’ve invested in a brand-new M1-powered Mac.
If your tenant uses sensitivity labels and has SharePoint Syntex, you can apply sensitivity labels to protect the document understanding models. The application of a label in this manner flows through to protect individual documents identified by models. It’s another way of automatically applying labels to sensitive content.
Sensitivity label control over sharing capabilities of SharePoint Online sites is now generally available. In addition, co-authoring and autosave of protected documents is generally available in the Microsoft 365 apps for enterprise (Word, Excel, and PowerPoint). We use protected documents heavily to store chapter files for the Office 365 for IT Pros eBook, so this is a welcome advance.
Microsoft Scheduler can now dynamically adjust the scheduling of recurring meetings. This is message center notification MC295855 (November 2) and it’s a great idea. Static recurring meetings are all too often cancelled or rescheduled because someone is sick or otherwise unavailable. After a recurring meeting finishes, Scheduler looks for the best time slot for the next instance and books that time.
Everyone’s probably familiar with the Exchange Online campaign to remove basic authentication for email connection protocols (that October 2022 date is getting nearer!). PowerShell is on the list of protocols to be blocked for basic authentication, but the Exchange Online management PowerShell module still uses basic authentication to communicate with WinRM on a local workstation. Work is under way to remove the need to use WinRM. Microsoft has released a preview version (2.0.6-3preview) of the module to demonstrate how they will remove the dependency by using a REST API in the background. Exchange Online has many cmdlets, not all of which have been converted to use the new mechanism, but you can test the preview now.
On the downside, Microsoft didn’t say anything at Ignite about the next version of on-premises Exchange. This is strange given the September 2020 announcement said the next version of Exchange Server would be available in the second half of 2021.
Microsoft says that Visio web app is rolling out to Microsoft 365 commercial tenants (all tenants with Office 365 enterprise plans). The rollout goes through to the end of January 2022, so keep an eye on the app launcher to see when Visio web app (aka Visio in Microsoft 365) shows up in your tenant.
Microsoft Cloud App Security (MCAS) is now Microsoft Defender for Cloud Apps (surely MDCA?). The app governance add-on is now generally available. It’s a good way to chase down apps registered in Azure AD that are over-permissioned or not being used. If you don’t have MDCA or don’t want to pay for the add-on, use our DIY audit method for Azure AD apps.
Access to the knowledge available in topic cards created by Viva Topics has been restricted to some lesser-used applications up to now. Things will change when topic cards appear in OWA and Teams. Apparently, this will happen soon and should be a game changer for the organizations who have invested in the work needed to harvest organizational knowledge through Viva Topics.
Microsoft prioritized Teams at Ignite as the center of a new way to work (see my practical365.com article), so there were lots of Teams-related developments discussed, most of which can be left until they appear in a tenant near you. One snippet in a blog post about improving meeting quality is that noise suppression in Teams meetings will be available for iOS soon. Microsoft claims that they saw a “31% decline in comments about background noise distractions” after the launch of noise suppression. This sounds like a good thing, but a single statistic provided without any further context or detail is worthless. We don’t know the sample size, whether the clients were Windows or Mac, what kinds of meetings were involved, or what is meant by “comments” (good, bad, or indifferent). Like many Microsoft statistics, there’s plenty of room for fudging an issue.
So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what’s happening.
]]>Microsoft’s decision to enforce a new 1.5 TB limit for auto-expanding archives from November 1, 2021, caused more interest than I thought would happen. Although the idea of having a “bottomless archive” seems like a nice capability, the real situation is that relatively few of the 300-odd million Office 365 licensed users have archive mailboxes anywhere close to a terabyte.
In the note I published on Practical365.com, I included a PowerShell script to report the status of archive-enabled mailboxes. Afterwards, I was asked whether it would be easy to adapt the script to report mailboxes which might be in danger of approaching the new 1.5 TB limit.
Good idea, I thought, and set to work. The full script is downloadable from GitHub, and here’s the flow of the processing.
Here’s the main processing loop for the mailboxes:
$Report = [System.Collections.Generic.List[Object]]::new()
ForEach ($M in $ExMbx) {
   $Status = $Null
   Write-Host "Processing mailbox" $M.DisplayName
   [int]$DaysSinceCreation = ((New-TimeSpan -Start ($M.WhenCreated) -End ($Now)).Days)
   $Stats = Get-ExoMailboxStatistics -Archive -Identity $M.UserPrincipalName
   [string]$ArchiveSize = $Stats.TotalItemSize.Value
   [string]$DeletedArchiveItems = $Stats.TotalDeletedItemSize.Value
   [long]$BytesInArchive = $Stats.TotalItemSize.Value.ToBytes()
   [long]$BytesInRecoverableItems = $Stats.TotalDeletedItemSize.Value.ToBytes()
   [long]$TotalBytesInArchive = $BytesInArchive + $BytesInRecoverableItems
   # Check if archive size is within 10% of the 1.5 TB limit - the size that counts is the combination of Recoverable Items and normal folders
   If ($TotalBytesInArchive -ge $TBBytesWarning) {
      Write-Host ("Archive size {0} for {1} is within 10% of 1.5 TB limit" -f $ArchiveSize, $M.DisplayName )
      $Status = "Archive within 10% of 1.5 TB limit" }
   [long]$BytesPerDay = $TotalBytesInArchive/$DaysSinceCreation
   [long]$NumberDaysLeft = (($TBBytes - $TotalBytesInArchive)/$BytesPerDay)
   $BytesPerDayMB = $BytesPerDay/1MB
   $GrowthRateDay = [math]::Round($BytesPerDayMB,4)
   $TotalArchiveSizeGB = [math]::Round(($TotalBytesInArchive/1GB),2)
   $ReportLine = [PSCustomObject][Ordered]@{
      Mailbox                      = $M.DisplayName
      UPN                          = $M.UserPrincipalName
      Created                      = $M.WhenCreated
      Days                         = $DaysSinceCreation
      Type                         = $M.RecipientTypeDetails
      "Archive Quota"              = $M.ArchiveQuota.Split("(")[0]
      "Archive Status"             = $M.ArchiveStatus
      "Archive Size"               = $ArchiveSize.Split("(")[0]
      "Archive Items"              = $Stats.ItemCount
      "Deleted Archive Items Size" = $DeletedArchiveItems.Split("(")[0]
      "Deleted Items"              = $Stats.DeletedItemCount
      "Total Archive Size (GB)"    = $TotalArchiveSizeGB
      "Daily Growth Rate (MB)"     = $GrowthRateDay
      "Days Left to Limit"         = $NumberDaysLeft
      Status                       = $Status }
   $Report.Add($ReportLine)
} #End ForEach
The script generates a CSV file containing the mailbox data. I also like to output the data on screen using the Out-GridView cmdlet to get some insight into the results. For example, Figure 1 shows some output from mailboxes in my tenant. As you can see, at an 18.07 MB/day growth rate, it will take my archive 84,228 days to get from its current 9.129 GB to 1.5 TB. What a relief!
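The projection is simple linear extrapolation: average the archive’s growth over its lifetime and divide the remaining headroom by that rate. Here’s a sketch of the same arithmetic (in Python for brevity, with illustrative figures; the script computes the real values from mailbox statistics):

```python
TB = 1024 ** 4
GB = 1024 ** 3

def days_until_limit(total_archive_bytes: int, days_since_creation: int,
                     limit_bytes: int = int(1.5 * TB)) -> float:
    """Project how many days remain before an archive hits the limit,
    assuming the historical average daily growth rate continues."""
    bytes_per_day = total_archive_bytes / days_since_creation
    return (limit_bytes - total_archive_bytes) / bytes_per_day

# A 512 GB archive accumulated over 512 days grows 1 GB/day,
# leaving 1,024 days before it reaches the 1.5 TB (1,536 GB) limit.
print(days_until_limit(512 * GB, 512))
```

Because the rate is a lifetime average, a mailbox whose growth accelerated recently will hit the limit sooner than the projection suggests.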
The script works as an example of how PowerShell delivers insight for Microsoft 365 tenant administrators, which is why every tenant administrator should be familiar with PowerShell and be able to run scripts and make simple code changes. Because most archives are less than 100 GB and won’t get near the new 1.5 TB limit in their lifetime, I suspect that few tenants will find the script valuable in an operational sense. However, it’s always nice to be able to answer questions with a few lines of code.
Learn more about how Office 365 really works (including PowerShell and archive mailboxes) on an ongoing basis by subscribing to the Office 365 for IT Pros eBook. Our monthly updates keep subscribers informed about what’s important across the Office 365 ecosystem.
]]>Tracking the availability of other people in your organization has been a problem for calendar management systems since the early 1980s. Microsoft’s solution since the days of Schedule+ has been to publish free and busy information from user calendars to allow other people to see if someone is available when setting up a meeting. The information is presented in time slots by apps like Outlook’s scheduling assistant (Figure 1). Good as it is to see time slots, this gives only limited access to calendar information.
As you can see from Figure 1, more insight is available about the availability of some people than others. The ability to view details of someone’s availability depends on the permission you have to their calendar (private items like the one shown are exceptions to the rule).
The default permission used within Exchange Online is AvailabilityOnly. This is one of two special permissions available for the calendar folder and it allows other users to see a graphic representation of when someone is available. However, this permission doesn’t allow you to see details of what other people are up to, like the Procurement call highlighted in Figure 1. The ability to see this information is governed by the other special permission (LimitedDetails), which allows people to see the time slot reserved, the title, location, and its time status (busy, tentative, out of office, etc.).
To make sure that everyone within an organization has at least limited visibility of each other’s calendar, Exchange Online assigns the AvailabilityOnly permission to a special user called Default. People see this assignment in a slightly different manner because clients present the information in a more user-friendly manner. For instance, OWA refers to the Default user as People in my organization and interprets the permission as Can view when I’m busy (Figure 2).
Figure 2 also shows that individuals can assign specific permissions to different users to allow them to have custom access to a calendar. This is what happens when people need to manage the calendar for other users.
You’re all set and there’s no more to do if you’re happy with everyone seeing calendar slots instead of appointment details. Things become more complicated if you decide that it would be better if everyone could see more information. There will always be exceptions where you want to protect calendars against casual browsing (“I wonder what the CEO is doing today…”), but the idea is to allow general access as a default.
It sounds like this is something that should be handled by a setting in Exchange’s organization configuration to control the default permission created for new calendars. Unhappily, no such setting exists, and anyway, if it did, you’d still have the problem of retrofitting a new default permission on existing calendars, including the need to respect customized permissions set for some calendars.
Which brings us to PowerShell. The Set-MailboxFolderPermission cmdlet in the Exchange Online management module is the key to assigning a new default permission to existing mailboxes and for subsequently-created mailboxes. To set a new default calendar permission, we need a script to:
Custom mailbox attributes are a good choice for storing an indication that a mailbox has updated permissions. It’s easier and faster to check a custom attribute because these attributes support server-side filtering and don’t need to check the actual permissions in place on mailboxes. With that in mind, I elected to use CustomAttribute13 to store “Open” if the mailbox is updated with a new default permission and “Blocked” if a mailbox is to be ignored.
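The selection logic is a simple tri-state check. Sketched in Python purely to illustrate the filter (the real check runs server-side in an Exchange Online mailbox filter, and the sample mailbox records are invented):

```python
def needs_update(custom_attribute13):
    """Skip mailboxes already processed ("Open") or explicitly
    excluded ("Blocked"); everything else gets the new permission."""
    return custom_attribute13 not in ("Open", "Blocked")

# Hypothetical mailbox records: only the unmarked mailbox qualifies.
mailboxes = [
    {"upn": "ceo@contoso.com", "CustomAttribute13": "Blocked"},
    {"upn": "done@contoso.com", "CustomAttribute13": "Open"},
    {"upn": "new.hire@contoso.com", "CustomAttribute13": ""},
]
to_process = [m["upn"] for m in mailboxes if needs_update(m["CustomAttribute13"])]
```

An unset attribute counts as “needs processing,” which is what makes the approach safe to re-run: each pass touches only mailboxes not yet marked.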
An admin (or the user if they know how to run Set-Mailbox in PowerShell) would set the attribute to Blocked if they don’t want their availability setting updated. You’d probably have a set of well-known mailboxes to block and would process them at one time. For instance, let’s assume you have a CSV file containing the user principal names of mailboxes to block. The code would be something like:
$Mbx = Import-CSV c:\Temp\SomeTempFile.csv
ForEach ($M in $Mbx) {
   Set-Mailbox -Identity $M.UserPrincipalName -CustomAttribute13 "Blocked" }
The prototype code to find and update the availability setting for mailboxes (I’ve opted to process user and room mailboxes) with the new default calendar permission is shown below. You’ll note that I have some lines to deal with local language values of the name for the calendar folder. You will have to uncomment and use this line (it’s reasonably expensive to run Get-ExoMailboxFolderStatistics to find the calendar folder and extract its name) if your organization includes users who run Outlook or OWA in non-English languages. Some experimentation is required!
# Find mailboxes that we have not yet reset the default sharing view
[array]$Mbx = Get-ExoMailbox -RecipientTypeDetails UserMailbox, RoomMailbox -ResultSize Unlimited -Filter {CustomAttribute13 -ne "Open" -and CustomAttribute13 -ne "Blocked"}
$CalendarName = "Calendar" # English language calendar folder
ForEach ($M in $Mbx) {
   Write-Host "Processing" $M.DisplayName
   # You can hard-code the calendar name (above) or try and find a local language value. This is one way to look for local values...
   # $CalendarName = (Get-ExoMailboxFolderStatistics -Identity $M.UserPrincipalName -FolderScope Calendar | ? {$_.FolderType -eq "Calendar"}).Name
   # Either way, you need to end up with a valid calendar folder reference - like Tony.Redmond@office365itpros.com:\Calendar
   $CalendarFolder = $M.UserPrincipalName + ":\" + $CalendarName
   Set-MailboxFolderPermission -Identity $CalendarFolder -User Default -AccessRights LimitedDetails
   Set-Mailbox -Identity $M.ExternalDirectoryObjectId -CustomAttribute13 "Open" }
The first time you run the script, it will take plenty of time to process mailboxes (expect each mailbox to take between 2-3 seconds to be updated). Later, fewer mailboxes will need updating and the script will complete faster. To be sure that new mailboxes get the new default permission, you can run the script periodically, perhaps as a scheduled task or using Azure Automation.
Figure 3 shows the effect of the change. Availability information is visible for all participants.
The updated access to free and busy information will be picked up by any client which consumes this data, such as the scheduling assistant in the Teams calendar app (Figure 4).
You can argue that Microsoft should make it easier for organizations to select and apply a default calendar permission. However, you’d still probably have to run some PowerShell to adjust the permission for selected mailboxes, like those who need to preserve confidentiality. Still, it would be nice if Microsoft added the default calendar permission to mailbox plans so that new mailboxes would receive whatever permission deemed suitable by an organization. That would be a good step forward.
Learn more about how Office 365 really works on an ongoing basis by subscribing to the Office 365 for IT Pros eBook. Our monthly updates keep subscribers informed about what’s important across the Office 365 ecosystem.
]]>A reader asked about the meaning of x:x in a Graph API query included in the article about upgrading Office 365 PowerShell scripts to use the Graph. You see this construct (a Lambda operator) in queries like those necessary to find the set of accounts assigned a certain license. For example, to search for accounts assigned Office 365 E3 (its SKU or product identifier is always 6fd2c87f-b296-42f0-b197-1e91e994b900):
https://graph.microsoft.com/beta/users?$filter=assignedLicenses/any(s:s/skuId eq 6fd2c87f-b296-42f0-b197-1e91e994b900)
Find the set of Microsoft 365 Groups in the tenant:
https://graph.microsoft.com/v1.0/groups?$filter=groupTypes/any(a:a eq 'unified')
Find the set of Teams in the tenant:
https://graph.microsoft.com/beta/groups?$filter=resourceProvisioningOptions/Any(x:x eq 'Team')
As you might expect, because the cmdlets in the Microsoft Graph SDK for PowerShell essentially are wrappers around Graph API calls, these cmdlets use the same kind of filters. For example, here’s how to find accounts with the Office 365 licenses using the Get-MgUser cmdlet:
[array]$Users = Get-MgUser -Filter "assignedLicenses/any(x:x/skuId eq 6fd2c87f-b296-42f0-b197-1e91e994b900)" -all
All these queries use lambda operators to filter objects using values applied to multi-valued properties. For example, the query to find users based on an assigned license depends on the data held in the assignedLicenses property of Azure AD accounts, while discovering the set of Teams in a tenant relies on checking the resourceProvisioningOptions property for Microsoft 365 groups. These properties hold multiple values or multiple sets of values rather than simple strings or numbers. Because this is a query against a multi-valued property of an Azure AD directory object, it’s called an advanced query.
Accessing license information is a good example to discuss because Microsoft is deprecating the Azure AD cmdlets for license management at the end of 2022, forcing tenants to upgrade scripts which include these cmdlets to replace them with cmdlets from the Microsoft Graph SDK for PowerShell or Graph API calls. This Practical365.com article explains an example of upgrading a script to use the SDK cmdlets.
If we look at the value of the assignedLicenses property for an account, we might see something like this, showing that the account holds three licenses, one of which has a disabled service plan.
disabledPlans                          skuId
-------------                          -----
{33c4f319-9bdd-48d6-9c4d-410b750a4a5a} 6fd2c87f-b296-42f0-b197-1e91e994b900
{}                                     1f2f344a-700d-42c9-9427-5cea1d5d7ba6
{}                                     8c4ce438-32a7-4ac5-91a6-e22ae08d9c8b
It’s obvious that assignedLicenses is a more complex property than a single-value property like an account’s display name, which can be retrieved in several ways. For instance, here’s the query with a filter to find users whose display name starts with Tony.
https://graph.microsoft.com/v1.0/users?$filter=startswith(displayName,'Tony')
As we’re discussing PowerShell here, remember that you must escape the dollar character in filters. Taking the example above, here’s how it is passed in PowerShell:
$Uri = "https://graph.microsoft.com/v1.0/users?`$filter=startswith(displayName,'Tony')"
[array]$Users = (Invoke-WebRequest -Method GET -Uri $Uri -ContentType "application/json" -Headers $Headers).Content | ConvertFrom-Json
The data returned by the query is in the $Users array and can be processed like other PowerShell objects.
Getting back to the lambda operators, while OData defines two (any and all), it seems like the all operator, which “applies a Boolean expression to each member of a collection and returns true if the expression is true for all members of the collection (otherwise it returns false)” is not used. At least, Microsoft’s documentation says it “is not supported by any property.”
As we’ve seen from the examples cited above, the any operator is used often. This operator “iteratively applies a Boolean expression to each member of a collection and returns true if the expression is true for any member of the collection, otherwise it returns false.”
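To make the semantics concrete, here’s the any operator mimicked with Python’s built-in any() over sample account data (the user names are invented; the SKU identifier is the Office 365 E3 GUID used throughout this post):

```python
E3_SKU = "6fd2c87f-b296-42f0-b197-1e91e994b900"

# Each account carries a multi-valued assignedLicenses property.
users = [
    {"name": "Alice", "assignedLicenses": [{"skuId": E3_SKU}, {"skuId": "some-other-sku"}]},
    {"name": "Bob",   "assignedLicenses": [{"skuId": "some-other-sku"}]},
    {"name": "Cara",  "assignedLicenses": []},
]

# assignedLicenses/any(s:s/skuId eq <E3>): true when ANY member matches.
e3_users = [u["name"] for u in users
            if any(lic["skuId"] == E3_SKU for lic in u["assignedLicenses"])]

# The unused 'all' operator would require EVERY member to match.
only_e3 = [u["name"] for u in users
           if u["assignedLicenses"]
           and all(lic["skuId"] == E3_SKU for lic in u["assignedLicenses"])]
```

Only Alice passes the any test; nobody passes the all test because Alice also holds a second SKU.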
If we look at the filter used to find accounts assigned a specific license:
filter=assignedLicenses/any(s:s/skuId eq 6fd2c87f-b296-42f0-b197-1e91e994b900)
My interpretation of the component parts (based on Microsoft documentation) of the filter is:
All of this is second nature to professional developers but not so much to tenant administrators who want to develop some PowerShell scripts to automate operations. This then poses the question about how to discover when lambda qualifiers are needed. I don’t have a great answer except to look for examples in:
And when you find something which might seem like it could work, remember that the Graph Explorer is a great way to test queries against live data in your organization. Figure 1 shows the results of a query for license information.
One complaint often extended about Microsoft’s documentation for the Graph APIs is that it pays little attention to suitable PowerShell examples. The Graph SDK developers say that they understand this situation must change and they plan to improve their documentation for PowerShell over the next year. Although understandable that languages like Java and C# have been priorities up to now, Microsoft can’t expect the PowerShell community to embrace the Graph and learn its mysteries (like lambda qualifiers) without help. Let’s hope that the Graph SDK developers live up to their promise!
Learn how to exploit the Office 365 data available to tenant administrators through the Office 365 for IT Pros eBook. We love figuring out how things work.
]]>The Microsoft Graph Insights API provides different views of users and documents:
Insights are consumed by many apps and Microsoft 365 components such as MyAnalytics, Workplace Analytics, Viva Insights for Teams, and the Office 365 profile card. Figure 1 is a Microsoft graphic to explain the use of the Insights API and its value to “drive productivity and creativity in businesses.”
Delve was the first app to surface insights, but used Office Graph settings to allow users to decide if they wanted to reveal information about their document-centric activities. Some users never want details of their work exposed, even to people who have access to documents, because they either don’t see the need or because they wish to preserve the confidential nature of the information they work with. They can protect content by assigning a sensitivity label with encryption to confidential documents, but this won’t stop document metadata like titles showing up in insights. The feature settings for Delve therefore have a slider to control showing documents in Delve (trending, used, and shared). When the slider is Off, Delve blocks insights based on documents (Figure 2).
In April, I wrote about how Microsoft is replacing Office Graph controls over Item Insights with Microsoft Graph controls. The change is now effective in Microsoft 365 tenants and means that instead of user-driven control over how the Insights API reveals information, a tenant has:
Access to these settings is available through the Search & Intelligence section of the Microsoft 365 admin center (Figure 3).
The question arises of how to find the current set of accounts with the option disabled in Delve so that you can add those accounts to the Azure AD group. As it happens, I was asked this question by a Microsoft customer engineer who wanted to help their customer move to the new Microsoft Graph controls.
The first step is to find the set of accounts with Delve insights disabled. This cannot be done with PowerShell only because no cmdlet exists to retrieve the value of the Delve setting. Instead, we can combine PowerShell with a call to the Graph Users API. Here are the steps:
You can download the script I used to report users with Delve insights disabled from GitHub.
The next step is to review the report and decide which accounts to add to the Azure AD group used to control item insights. To review the data, open the CSV file generated by the script (Figure 4), and remove any accounts which should not be added to the control group.
We can then use the updated CSV file as the input for a script which:
The essential code to fetch the settings from the Graph and update the membership of the control group looks like this:
$InputCSV = "c:\temp\DelveDisabledAccounts.csv"
$TenantDetails = Get-AzureADTenantDetail
$TenantId = $TenantDetails.ObjectId
$TenantName = $TenantDetails.DisplayName
$Uri = "https://graph.microsoft.com/beta/organization/" + $TenantId + "/settings/iteminsights"
$Settings = Invoke-RestMethod -Uri $Uri -Method Get -ContentType "application/JSON" -Headers $Headers -UseBasicParsing
If ($Settings.isEnabledInOrganization -ne $True) {
   Write-Host "Insights control setting not set for" $TenantName ; break }
Else {
   $DisabledGraphInsightsGroup = $Settings.disabledForGroup }
[array]$CurrentMembers = Get-AzureADGroupMember -ObjectId $DisabledGraphInsightsGroup | Select-Object -ExpandProperty ObjectId
Write-Host "Adding users to the Disabled Graph Insights Group"
$Users = Import-CSV $InputCSV
ForEach ($User in $Users) {
   If ($User.ObjectId -notin $CurrentMembers) {
      Write-Host "Adding" $User.Name
      Add-AzureADGroupMember -ObjectId $DisabledGraphInsightsGroup -RefObjectId $User.ObjectId }
}
I haven’t published a script to GitHub for this purpose because the code is straightforward and simple to plug into an existing script (or add to the bottom of the script mentioned above). Happy Insights!
Learn more about how Office 365 really works on an ongoing basis by subscribing to the Office 365 for IT Pros eBook. Our monthly updates keep subscribers informed about what’s important across the Office 365 ecosystem.
Note: This post is now obsolete. Please see this article for an updated approach to the problem.
Service plans are non-saleable elements of a Microsoft licensable product (SKU or stock keeping unit). SKUs are what people often think of when they discuss licenses. Individual Microsoft 365 accounts can have multiple SKUs, such as TOPIC_EXPERIENCES, ENTERPRISEPACK, and EMSPREMIUM. The product names for these SKUs are Viva Topics, Office 365 E3, and Enterprise Mobility and Security E5. Product names appear in places like the Billing section of the Microsoft 365 admin center (Figure 1).
At a more granular level, a “bundled” SKU like Office 365 E3 includes multiple service plans, each of which enables access to some functionality like an app. This page details the connections between SKUs and service plans.
At the time of writing, Office 365 E3 covers 28 service plans and Office 365 E5 has 53. Office 365 E5 includes service plans to license capabilities like advanced compliance features, customer lockbox, advanced auditing, content explorer, server-based auto-labeling for sensitivity labels and retention labels, records management, and information barriers.
Microsoft introduces new service plans to enhance its ability to license new features to different user communities or to provide control over user access to a new feature. Teams is a good example. The Teams service plan (TEAMS1) is in many Office 365 and Microsoft 365 SKUs. In April, Microsoft announced they would add the Teams Pro service plan to some SKUs and will use the Teams Pro service plan to allow accounts licensed with those SKUs to access new features. To date, Microsoft has not added the Teams Pro service plan to any SKU in my tenant nor have they described what features the new service plan will cover.
In some cases, tenant administrators might not want users to be able to access a licensed app or capability. Perhaps the feature is obsolete, or the organization has different software to do the same thing, or maybe a delay is necessary to enable preparation of training, documentation, and support. Some years ago, Microsoft made a big thing about Kaizala and its impending integration into Teams. Kaizala is now an obsolete feature that’s still available in Office 365 E3 and E5. Sway is in the same category. Microsoft Bookings is an optional feature which isn’t often used by enterprise users, but it’s also part of Office 365 E3 and E5. In short, when you review the set of service plans bundled into Office 365 and Microsoft 365 SKUs, you might be surprised at the amount of unwanted debris in the mix.
Let’s say that we want to remove individual service plans from SKUs assigned to users. This post describes how to report the accounts assigned individual service plans (licenses) and explains how Azure AD stores the service plan information in user accounts. We want to go further by removing access to selected service plans, and as it turns out, we must use cmdlets from the older Microsoft Online Services module to get the job done. It’s possible to use the Set-AzureADUserLicense cmdlet to remove a service plan from an account. Laziness and the availability of some existing code to do the job stopped me using this cmdlet.
In any case, I wrote a script to demonstrate the principle of the steps to remove an individual service plan from multiple Microsoft 365 accounts. Three versions are available.
Given that Microsoft deprecated the licensing management cmdlets in the MSOL and Azure AD modules in 2023, it makes sense to focus on the version based on the Microsoft Graph PowerShell SDK.
The major steps to remove a service plan from Azure AD licenses with PowerShell are:
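As a sketch of the Graph SDK approach (the cmdlets are real; the account name and the choice of Sway as the target plan are placeholders), disabling a service plan means re-submitting the license assignment with that plan added to the DisabledPlans collection:

```powershell
# Sketch: disable the Sway service plan within Office 365 E3 for one account
# Assumes a Microsoft Graph PowerShell SDK session; the account name is a placeholder
Connect-MgGraph -Scopes User.ReadWrite.All
$User = Get-MgUser -UserId "Kim.Akers@office365itpros.com"
$Sku = Get-MgSubscribedSku | Where-Object SkuPartNumber -eq "ENTERPRISEPACK"
$SwayPlanId = ($Sku.ServicePlans | Where-Object ServicePlanName -eq "SWAY").ServicePlanId
# Preserve plans already disabled for this SKU and add Sway to the set
$License = Get-MgUserLicenseDetail -UserId $User.Id | Where-Object SkuId -eq $Sku.SkuId
[array]$DisabledPlans = $License.ServicePlans |
   Where-Object ProvisioningStatus -eq "Disabled" | Select-Object -ExpandProperty ServicePlanId
$DisabledPlans += $SwayPlanId
Set-MgUserLicense -UserId $User.Id -AddLicenses @(@{SkuId = $Sku.SkuId; DisabledPlans = $DisabledPlans}) -RemoveLicenses @()
```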
Figure 2 shows the MSOL version of the script in action. You can see the selection of the service domain, SKU, and service plan and processing of user accounts. In this case, the selected options remove the Sway service plan from the ENTERPRISEPACK (Office 365 E3) SKU.
The report output is a CSV file. Figure 3 shows the information captured in the report as viewed through the Out-GridView cmdlet.
I’m sure others will have different ways to solve the problem of removing service plans from SKUs, which is just fine. What’s obvious here (once again) is that PowerShell is a very flexible tool for automating administrative operations. Which is why I am so surprised when tenant administrators admit that they have never taken the time to become acquainted with the basics of PowerShell scripting. It’s not difficult; there are tons of available examples to learn from; and it gets work done. All good stuff!
Learn more about how Office 365 really works on an ongoing basis by subscribing to the Office 365 for IT Pros eBook. Our monthly updates keep subscribers informed about what’s important across the Office 365 ecosystem.
Microsoft posts notifications to the message center in the Microsoft 365 admin center to inform tenant administrators about a variety of different updates made to its service. MC272885, posted on Jul 24, 2021, has the title Attachments for messages with Data Privacy Tag, which might leave you scratching your head about what Microsoft means. At first glance, the combination of attachments and messages points to email, and tag could mean a sensitivity or retention label. But that’s not what it means.
Reading the detail reveals that Microsoft is introducing a new tag for service update messages. Let’s explore what this means.
When Microsoft publishes a service update message, it applies tags to help tenant administrators understand the importance and potential impact of the change (Figure 1).
The tags shown in the message center include:
Many updates have multiple tags. For instance, MC264095 has the major update, feature update, and user impact tags.
Using the Graph API for Service Communications, we can fetch the messages currently available in the Microsoft 365 admin center to see what tags are in use. As you’ll recall, this API spans both incidents (outages) reported in the admin center and service updates. I took the example script I created for service updates and used some of the code to pull all update messages into an array.
$Uri = "https://graph.microsoft.com/beta/admin/serviceAnnouncement/messages"
[array]$Messages = Get-GraphData -AccessToken $Token -Uri $Uri
I then used some simple code to analyze the tags placed on each message.
$TagAdmin = 0; $TagUpdate = 0; $TagMajor = 0; $TagNew = 0; $TagRetirement = 0; $TagUser = 0; $TagUpdatedMessage = 0; $TagDataPrivacy = 0
ForEach ($Message in $Messages) {
   ForEach ($Tag in $Message.Tags) {
      Switch ($Tag) {
        "Admin impact"    {$TagAdmin++}
        "Feature update"  {$TagUpdate++}
        "New feature"     {$TagNew++}
        "Retirement"      {$TagRetirement++}
        "User impact"     {$TagUser++}
        "Updated message" {$TagUpdatedMessage++}
        "Data privacy"    {$TagDataPrivacy++}
      } # End Switch
   } # End ForEach tag
   If ($Message.IsMajorChange -eq $True) {$TagMajor++}
} # End ForEach message

Write-Host "Admin impact messages:   " $TagAdmin
Write-Host "Feature update messages: " $TagUpdate
Write-Host "Major update messages:   " $TagMajor
Write-Host "New feature messages:    " $TagNew
Write-Host "Retirement messages:     " $TagRetirement
Write-Host "User impact messages:    " $TagUser
Write-Host "Updated messages:        " $TagUpdatedMessage
Write-Host "Data privacy messages:   " $TagDataPrivacy

Admin impact messages:    165
Feature update messages:  65
Major update messages:    76
New feature messages:     119
Retirement messages:      31
User impact messages:     191
Updated messages:         96
Data privacy messages:    0
The total count of messages was 266. You can see that:
Your mileage might vary because Microsoft issues service updates to tenants based on the feature set licensed by the tenant.
Microsoft is introducing a new Data Privacy tag to indicate messages which need administrator attention because they potentially impact sensitive data. The change is due to roll out by the end of July.
Microsoft says that messages might also contain one or more downloadable attachments (if multiple, the attachments are in a zip file) to help administrators “gain additional insight into the described scenario.” For instance, an attachment might be a PowerShell script to report data or users affected by a service update.
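Assuming the attachments surface through the same Service Communications API as the message text (this endpoint and the ServiceMessage.Read.All permission are my assumptions here, not something MC272885 confirms), fetching them might look like this sketch:

```powershell
# Sketch: list attachments on a service update message
# Assumption: attachments are exposed at /attachments under each message
$MessageId = "MC272885"
$Uri = "https://graph.microsoft.com/v1.0/admin/serviceAnnouncement/messages/$MessageId/attachments"
$Headers = @{ Authorization = "Bearer $Token" }   # token with ServiceMessage.Read.All
$Attachments = (Invoke-RestMethod -Uri $Uri -Method Get -Headers $Headers).Value
$Attachments | Select-Object Name, ContentType, Size
```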
Only accounts holding the Global administrator and Privacy reader roles can access the downloadable attachments.
It’s hard to be certain about how Microsoft will use the new Data Privacy tag and what kind of service update messages they will tag. I guess we will see when some messages appear with the tag (none are found in the messages in my tenant) and the kind of attachments available for the messages.
So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what’s happening.
Without a doubt, using Graph API calls to retrieve Office 365 data is much faster than using PowerShell cmdlets. It’s most obvious in complex scripts like the Groups and Teams activity script, which is much faster in its Graph variant (5.1) than its pure PowerShell counterpart (4.8) because Graph API calls replace cmdlets from the Exchange Online and SharePoint Online modules. Speed matters, especially when a tenant supports thousands of groups, and the Graph API version of the script can process large quantities of groups where the pure PowerShell version struggles to cope.
Good as it is to have a speedier script, the complexity of the activity report is possibly not a good test case to illustrate the decision process that you should go through to decide if it’s a good idea to upgrade a script to use the Graph API, and then measure the improvement. Simplicity is better, so let’s explore what needs to be done to upgrade the Microsoft 365 Groups Membership report script. This script uses the following cmdlets:
Two cmdlets are from the Exchange Online Management module, one is from the Azure AD module. A bunch of other processing is done to filter, sort, and process the data fetched by these cmdlets, but essentially the conversation involves replacing these cmdlets with Graph API calls. Sounds easy.
Before any script can use the Graph API, it needs to use an Azure AD registered app. The app serves as the holder for permissions to allow the script to access data. Every registered app has a unique identifier. When you create a registered app, you can generate an app secret. When running the app, we need to know the secret to prove we have permission to use the app.
Before making any calls, you need to know your tenant identifier. This is the GUID for a tenant as returned by the Get-AzureADTenantDetail cmdlet. The Connect-MicrosoftTeams cmdlet also returns this information. Bringing everything together, before we can use an app to access Graph APIs, we need to know the app identifier, app secret, and tenant identifier. You’ll often see code like this in scripts:
$AppId = "a09cf913-5ff9-48a2-8015-f28f2854df26"
$AppSecret = "u6X7_i8K-yhh-b4-z5FEmj_wH_M~nIOz4n"
$TenantId = "22e90715-3da6-4a78-9ec6-b3282389492b"
It’s an important part of working with Graph API apps to understand how permissions work and to ensure that apps receive only the permissions necessary to work with the data they process. In the case of our app, we need:
Like all programming tasks, it soon becomes second nature to assign the correct permissions, sometimes after browsing Microsoft’s documentation to find the correct permission.
An administrator gives consent to allow the app to use its assigned permissions. Hackers can use OAuth consents as a method to gain permissions to access data, so it’s wise to keep an eye on consents given within an organization, with or without Microsoft’s new App governance add-on for MCAS.
Equipped with a suitably permissioned app, app secret, and tenant identifier, the app can request an access token by posting a request to the token endpoint. For Graph API calls to Office 365 data, that’s going to be something like:
https://login.microsoftonline.com/tenant-identifier/oauth2/v2.0/token
The bearer token issued in response confirms that the app has the necessary permissions to access data like Users, Groups, and Sites and whether access is read-only or read-write. The token is included in the authorization header of the requests made to the Graph APIs. Access tokens expire after an hour, so long-running programs need to renew their token to continue processing.
All of this sounds complicated, but once you do it for one script, it becomes second nature to acquire an access token and be ready to start using Graph API calls.
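To make that concrete, here’s a minimal sketch of the token request using the standard client credentials flow (the variable names match the earlier example):

```powershell
# Sketch: request an access token using the client credentials flow
# $AppId, $AppSecret, and $TenantId come from the app registration shown earlier
$Uri = "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token"
$Body = @{
   client_id     = $AppId
   client_secret = $AppSecret
   scope         = "https://graph.microsoft.com/.default"
   grant_type    = "client_credentials"
}
$TokenRequest = Invoke-RestMethod -Method Post -Uri $Uri -Body $Body -ContentType "application/x-www-form-urlencoded"
$Token = $TokenRequest.access_token
```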
Like anything else, it takes a little while to become used to fetching data using Graph API calls. You must pay attention to the data that’s fetched. First, to limit the demand on resources, the Graph fetches limited data at one time and you must iterate until no more data is available (a process called pagination). Second, in some cases, the property names used by cmdlets differ from those used by the Graph API. For example, the Get-UnifiedGroup cmdlet returns the description of a group in the Notes property whereas the Groups API uses description.
The Graph Explorer is an online Microsoft tool to help developers become accustomed to Graph API syntax and data. You should use the Explorer to test calls before including them in a script. Debugging calls using the Graph Explorer saves a lot of time and heartache. Figure 1 shows the Graph Explorer being used to examine the transitive set of groups returned for a user.
The Graph Explorer is a Graph app. Like any other app, it needs permissions to access data. If a call fails, the Explorer tells you which permissions are missing and you can then consent to the assignment.
The first cmdlet we need to replace in the script is the one to fetch the list of Azure AD users in the tenant. In PowerShell, this is:
$Users = Get-AzureADUser -All:$true
The Graph Users API fetches the same data. Unlike the Get-AzureADUser cmdlet, the API returns a default set of properties, and we must be specific if we want others. Here’s the call used:
$Uri = "https://graph.microsoft.com/v1.0/users?`$select=displayName,userType,assignedLicenses,id,mail,userPrincipalName"
$Users = Get-GraphData -AccessToken $Token -Uri $Uri
Get-GraphData is a wrapper function to take care of pagination and extraction of data returned by Graph API calls in a form like the PowerShell objects returned by cmdlets. You can make your life easier by including a function like this in any script which interacts with Graph API calls. To see an example of the function I use, download the Graph version of the group membership report script from GitHub.
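The function in the GitHub script does more, but a simplified sketch of such a pagination wrapper might look like this:

```powershell
# Sketch: simplified pagination wrapper for Graph GET requests
Function Get-GraphData {
   param ([string]$AccessToken, [string]$Uri)
   $Headers = @{ Authorization = "Bearer $AccessToken" }
   $Results = @()
   Do { # Keep fetching until the response has no @odata.nextLink
      $Response = Invoke-RestMethod -Uri $Uri -Method Get -Headers $Headers
      If ($Response.Value) { $Results += $Response.Value } Else { $Results += $Response }
      $Uri = $Response.'@odata.nextLink'
   } While ($Uri)
   Return $Results
}
```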
The second call fetches the list of team-enabled groups. The script then creates a hash table to store the teams so that it can be used as a lookup to see if a group is team-enabled when reporting its properties. The PowerShell code uses a server-side filter with the Get-UnifiedGroup cmdlet to return the teams.
$Teams = Get-UnifiedGroup -Filter {ResourceProvisioningOptions -eq "Team"} -ResultSize Unlimited | Select ExternalDirectoryObjectId, DisplayName
The Graph Groups API equivalent uses the same kind of filter:
$Uri = "https://graph.microsoft.com/beta/groups?`$filter=resourceProvisioningOptions/Any(x:x eq 'Team')"
$Teams = Get-GraphData -AccessToken $Token -Uri $Uri
The last call we make is to find the set of groups a user account is a member of. The Get-Recipient cmdlet is very fast at returning the list of groups based on the distinguished name of an account. The user data returned by Get-AzureADUser doesn’t give us the distinguished name (it is an Exchange Online property), so we must run Get-Recipient twice: once to get the distinguished name, and then use the distinguished name to find the groups.
$DN = (Get-Recipient -Identity $User.UserPrincipalName).DistinguishedName
$Groups = (Get-Recipient -ResultSize Unlimited -RecipientTypeDetails GroupMailbox -Filter "Members -eq '$DN'" | Select-Object DisplayName, Notes, ExternalDirectoryObjectId, ManagedBy, PrimarySmtpAddress)
The Graph Users API can resolve a transitive lookup against groups to find membership information for an account. We can therefore use a call like this:
$Uri = "https://graph.microsoft.com/v1.0/users/" + $User.Id + "/transitiveMemberOf"
$Groups = Get-GraphData -AccessToken $Token -Uri $Uri
That’s it. All the other commands in the script process data fetched from Azure AD or Exchange Online. Apart from the changes detailed above, the same code is used for both the PowerShell and Graph versions of the script.
Your mileage may vary depending on the backend server you connect to, the load on the service, and other factors. My tests, which are surely as reliable as an EPA mileage figure, revealed that the PowerShell version processed accounts at about 0.6 seconds each. The Graph version cut that to about 0.4 seconds per account: in other words, roughly a 50% improvement in throughput.
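If you want to run your own comparison, the Measure-Command cmdlet makes timing easy (the script file names here are placeholders for the two versions):

```powershell
# Sketch: time the two versions of a script; the file names are placeholders
$PSTime    = Measure-Command { & ".\ReportGroupMembers.ps1" }
$GraphTime = Measure-Command { & ".\ReportGroupMembersGraph.ps1" }
"PowerShell: {0:N1}s  Graph: {1:N1}s" -f $PSTime.TotalSeconds, $GraphTime.TotalSeconds
```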
Not every script will benefit from such a speed boost and other scripts will need more work to install Graph turbocharging. But the point is that Graph-powered PowerShell is much faster at processing Office 365 data than pure PowerShell is. Keep that fact in mind the next time you consider how to approach building a PowerShell-based solution for Office 365.
Learn more about how Office 365 really works on an ongoing basis by subscribing to the Office 365 for IT Pros eBook. Our monthly updates keep subscribers informed about what’s important across the Office 365 ecosystem.
Microsoft’s announcement of Windows 365 on July 14 created a great deal of excitement in some organizations seeking a way to deploy and manage PC assets more easily (here’s an independent view on the topic). Five days later, Microsoft notified Office 365 tenants in MC271483 that end users will be able to buy Windows 365 licenses through the self-purchase license mechanism in the Microsoft 365 admin center. By default, Microsoft enables self-service purchases of Windows 365 licenses, so if you don’t want this to happen, you must disable the self-purchase option for Windows 365 using PowerShell.
Windows 365 comes in two versions. Microsoft’s definitions for the two are:
According to Microsoft, self-service purchases are integrated into the two versions as follows:
Three Windows 365 options are available for self-purchase. Microsoft won’t confirm prices until August 1.
Self-service purchases are unavailable for government and academic tenants.
Control over Windows 365 self-service license purchases uses the same mechanism as Power Apps, Power Automate, Power BI, Visio, Project Online, and (most recently) Power BI Premium and Power Automate with RPA. Here’s what you need to do:
First, if your workstation doesn’t already have version 1.6 of the MSCommerce PowerShell module, download and install the module. After the installation finishes, run the Connect-MSCommerce cmdlet to connect to the Commerce endpoint, authenticating using a global tenant administrator account.
Connect-MSCommerce
You can disable each Windows 365 option separately. For instance, here’s how to disable Windows 365 Business:
Update-MSCommerceProductPolicy -PolicyId AllowSelfServicePurchase -ProductId CFQ7TTC0J203 -Enabled $False
To disable the three Windows 365 self-service purchase options, use this code:
$Windows365Options = @("CFQ7TTC0HHS9", "CFQ7TTC0HX99", "CFQ7TTC0J203")
ForEach ($Option in $Windows365Options) {
   Update-MSCommerceProductPolicy -PolicyId AllowSelfServicePurchase -ProductId $Option -Enabled $False }
Finally, check the current enablement status for each product available for self-purchase with:
Get-MSCommerceProductPolicies -PolicyId AllowSelfServicePurchase

ProductName                                      ProductId    PolicyId                 PolicyValue
-----------                                      ---------    --------                 -----------
Windows 365 Enterprise                           CFQ7TTC0HHS9 AllowSelfServicePurchase Disabled
Windows 365 Business with Windows Hybrid Benefit CFQ7TTC0HX99 AllowSelfServicePurchase Disabled
Windows 365 Business                             CFQ7TTC0J203 AllowSelfServicePurchase Disabled
Power Automate per user                          CFQ7TTC0KP0N AllowSelfServicePurchase Disabled
Power Apps per user                              CFQ7TTC0KP0P AllowSelfServicePurchase Disabled
Power Automate RPA                               CFQ7TTC0KXG6 AllowSelfServicePurchase Disabled
Power BI Premium (standalone)                    CFQ7TTC0KXG7 AllowSelfServicePurchase Disabled
Visio Plan 2                                     CFQ7TTC0KXN8 AllowSelfServicePurchase Disabled
Visio Plan 1                                     CFQ7TTC0KXN9 AllowSelfServicePurchase Disabled
Project Plan 3                                   CFQ7TTC0KXNC AllowSelfServicePurchase Disabled
Project Plan 1                                   CFQ7TTC0KXND AllowSelfServicePurchase Disabled
Power BI Pro                                     CFQ7TTC0L3PB AllowSelfServicePurchase Disabled
Self-service licensing has its place in some organizations. Others consider it inappropriate and unhelpful to allow end users to drive what they consider should be organization-led purchasing. If you’re in the latter category, go ahead and run the couple of lines of PowerShell given above to block users. If not, consider how to educate people about how self-service licensing works and when it should be used.
So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what’s happening.
Over the last few months, I’ve written many times about using Microsoft Graph API calls in PowerShell scripts to get real work done. Among the many examples are:
In addition, Microsoft has stirred the pot by announcing that they won’t support the Azure AD Graph from June 2022. This affects the Azure AD PowerShell module, one of the most heavily used modules for Office 365 tenant management. And we have examples where Microsoft introduces new features, like tenant privacy controls, which can be controlled only through Graph API calls.
As a result of this activity, I’ve received several questions about how to decide when to use Graph API calls in scripts. And as I am due to speak about combining PowerShell and the Graph at the (free) TEC 2021 event in September, it seemed like a good idea to formulate some thoughts about how I approach the issue.
I use a simple four-step process when writing scripts to automate some aspect of Office 365:
Sketch out the solution: Understand what source data is available and how to access it. Define the expected output and the processing needed to achieve the result. Make an initial selection of PowerShell modules and Graph APIs which might be useful, understanding that some data is only accessible to the Graph (and might need a beta API). Do an internet search to see if anyone has already written code to do what you want or something similar. Never reinvent the wheel if someone else has one to use.
Code in PowerShell first: It’s often wise to write the initial code in PowerShell before introducing any Graph APIs. The code you write might work well enough to be the solution you need without doing any further work. This is often the case when a small amount of data is involved, in which case you don’t need the additional overhead necessary to introduce Graph API calls.
Speed Things Up: Usually, the biggest advantage gained through using Graph APIs is speed, especially when fetching large numbers of objects like user accounts or groups. The next step is to find places in your code where large delays occur to run calls like Get-UnifiedGroup and replace those cmdlets with Graph APIs.
Adjust for Production: Every tenant has their own idea of how to run PowerShell scripts in production. After developing a script which can run interactively, you might need to change it to run as a background process and deal with issues like certificate-based authentication (never store passwords in scripts). Because of the need to adjust scripts for production usage, the code I write for books and articles is to illustrate principles rather than being fully worked-out answers.
The most important point in the checklist is the internet search for code. If you don’t find a suitable script to remove the need to create anything new, you’ll probably find the basis or starting point for what you want to do. It astounds me that people post questions in forums when it is perfectly obvious that they haven’t done the basic research to uncover details which can help solve their problem. Unfortunately, too many people expect answers to be handed to them on a plate and aren’t prepared to learn through experimentation and failure. I spend most of my time in the latter state.
The Microsoft Graph isn’t scary. It’s there to be used and like any other tool, it should be used at the right time. PowerShell gets most things done really well when it comes to tenant management. It has its limitations, some of which the Graph can fill in. Starting with simple tasks and moving forward to more complex issues is a great way to learn how to use the Graph with PowerShell. Your task is to provide the brainpower to combine the two to get things done most effectively.
Learn how to exploit the Office 365 data available to tenant administrators through the Office 365 for IT Pros eBook. We love figuring out how things work.
Updated 14 September 2023
Soon after Microsoft launched Teams in 2017, a question appeared in the Microsoft Technical Community to ask how to report Teams and their associated SharePoint Online sites.
Several years later, a response appeared in the thread advocating the technique of fetching the set of Teams in the tenant using these steps:
One particular joy of PowerShell is that there are usually several answers to any question. Add in the number of PowerShell modules available within Office 365, and you end up with multiple approaches to explore to find the best answer to a relatively simple question.
Another complication is that Microsoft updates cmdlets over time, usually to good effect. They also update objects in the background to add new properties, remove old properties, and improve how things work, often to support the introduction of new features.
Which brings me to my two-line answer to the question. Use the Get-UnifiedGroup cmdlet to return the set of team-enabled groups and then list the set of teams and their SharePoint Online sites. Apart from anything else, this uses one module (Exchange Online Management) instead of two. The code is:
[array]$Teams = Get-UnifiedGroup -Filter {ResourceProvisioningOptions -eq "Team"}
$Teams | Sort-Object DisplayName | Select-Object DisplayName, SharePointSiteUrl | Export-CSV -NoTypeInformation "C:\Temp\TeamsSPOList.CSV"
To be fair to the folks who responded in the thread going back to 2017, this answer wasn’t possible then. Microsoft 365 Groups had an odd provisioning flag that was never reliable and the Get-UnifiedGroup cmdlet didn’t support filtering for teams. Even in the answer cited above, a group’s ResourceProvisioningOptions property, which first appeared a couple of years ago and should have the value “Team” for team-enabled groups, wasn’t always populated. That problem appears to have gone away. At least, in my tenant, the count of objects and the actual objects returned by Get-Team and Get-UnifiedGroup (with the filter) are identical.
Having support for a filter is also important because it means that the server (Exchange Online) does the work to find team-enabled groups and only returns those objects. This is much faster than finding all Microsoft 365 Groups and then filtering them on the workstation to figure out which are team-enabled.
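To illustrate the difference, compare the server-side filter with its client-side equivalent, where every group crosses the wire before being discarded:

```powershell
# Server-side filter: Exchange Online returns only team-enabled groups
[array]$Teams = Get-UnifiedGroup -Filter {ResourceProvisioningOptions -eq "Team"} -ResultSize Unlimited

# Client-side filter: fetch every group, then discard the non-teams locally (much slower)
[array]$Teams = Get-UnifiedGroup -ResultSize Unlimited |
   Where-Object {$_.ResourceProvisioningOptions -contains "Team"}
```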
A two-line answer does the job, but a more complete answer is to create a nice report. Here’s a simple script to do just that:
# Check that we are connected to Exchange Online
$ModulesLoaded = Get-Module | Select-Object -ExpandProperty Name
If (!($ModulesLoaded -match "ExchangeOnlineManagement")) {
   Write-Host "Please connect to the Exchange Online Management module and then restart the script"; break }

Write-Host "Finding Teams..."
[array]$Teams = Get-UnifiedGroup -Filter {ResourceProvisioningOptions -eq "Team"}
If (!($Teams)) {
   Write-Host "Can't find any Teams for some reason..." }
Else {
   Write-Host ("Processing {0} Teams..." -f $Teams.count)
   $TeamsList = [System.Collections.Generic.List[Object]]::new()
   ForEach ($Team in $Teams) {
      $ManagedBy = $Team.ManagedBy; [string]$MemberDisplayName = $Null; [array]$DisplayNames = $Null
      ForEach ($Member in $ManagedBy) {
         $MemberDisplayName = (Get-ExoRecipient -Identity $Member -RecipientTypeDetails UserMailbox).DisplayName
         $DisplayNames += $MemberDisplayName }
      $TeamLine = [PSCustomObject][Ordered]@{
         Team      = $Team.DisplayName
         SPOSite   = $Team.SharePointSiteURL
         ManagedBy = $DisplayNames -join ", " }
      $TeamsList.Add($TeamLine) }
   $TeamsList | Out-GridView
   $TeamsList | Export-CSV -NoTypeInformation c:\temp\TeamsSPOList.CSV }
The report output is to the screen using Out-GridView (Figure 1) and a CSV file. But as it’s PowerShell, you can change the code to do whatever you want.
Apart from anything else, this exercise proves that if you write PowerShell scripts to manage an Office 365 tenant, you need to keep an eye on changes introduced in updated modules.
The Office 365 for IT Pros eBook has a complete chapter on using PowerShell to create innovative solutions to system administration issues. Subscribe now and benefit from monthly updates.
My article about how to create an auto-label policy to apply retention labels to Teams meeting recordings resulted in several questions. As I noted in the article, tracking the progress of auto-labeling can be challenging due to the black-box nature of the background processes which search for recording files to label. One suggestion was to use the technique explained in this blog post to use the SharePoint Online PnP PowerShell module to connect to sites and retrieve information about retention job activity. For example:
$SiteURL = "https://office365itpros.sharepoint.com/sites/Office365Adoption/"
Connect-PnPOnline -Url $SiteURL -Interactive
Get-PnPPropertyBag -Key "dlc_policyupdatelastrun"
Get-PnPPropertyBag -Key "dlc_expirationlastrunv2"

2/23/2021 11:18:42 PM
2/2/2021 8:02:41 PM
The first value (dlc_PolicyUpdateLastRun) is the date when the background job to evaluate retention dates for items last ran. The second (dlc_ExpirationLastRunv2) tells you the last time the background job ran to execute the retention action defined in labels when retention periods expire.
The background jobs which evaluate retention dates and execute actions are not directly connected to auto-label processing, but they give an insight into how SharePoint Online processes sites. In a nutshell, if a site is active, the background jobs process its content. If not, the site is ignored. This makes a lot of sense because it avoids SharePoint doing a bunch of work to check items in sites where no work is necessary. I don’t know if another value stores a date when action must be taken to process expired items, but it would make sense if it did.
These values date back to legacy management processing in SharePoint on-premises and while they still work, Microsoft introduced a new retention processing engine last year which might eventually nullify their use.
Interesting as these values are, they don’t tell us anything about the application of labels. In the last article, I mentioned that the Office 365 audit log captures the TagApplied event when a person or policy applies a retention label to an item. The audit events are available roughly 15 minutes after they occur, so this source seemed like a good place to investigate.
I ended up writing a script to search the audit log for TagApplied events, filter the results down to the events generated by the auto-label policy for the retention label, calculate how long each recording waited to receive its label, and report what it found.
Here’s the main loop of the code to process the audit records. You can download the complete script from the Office 365 for IT Pros GitHub repository.
[array]$Records = (Search-UnifiedAuditLog -Operations TagApplied -StartDate $StartDate -EndDate $EndDate -Formatted -ResultSize 2000)
$TaggedRecordings = [System.Collections.Generic.List[Object]]::new()
ForEach ($Rec in $Records) {
   $AuditData = $Rec.AuditData | ConvertFrom-Json
   If (($AuditData.DestinationLabel -eq $RetentionLabel) -and ($AuditData.UserType -eq "CustomPolicy")) {
      $RecordingFileName = $AuditData.DestinationFileName
      $DateLoc = ($RecordingFileName.IndexOf("-202")+1)
      $RDate = $RecordingFileName.SubString($DateLoc,8)
      $TimeLoc = $DateLoc + 9
      $RTime = $RecordingFileName.SubString($TimeLoc,4)
      $RecordingDateTime = $RDate + $RTime
      [datetime]$RecordingDate = [datetime]::ParseExact($RecordingDateTime,"yyyyMMddHHmm",$null)
      [datetime]$TaggingDate = Get-Date($AuditData.CreationTime)
      $TimeToTag = ($TaggingDate - $RecordingDate)
      $TotalSeconds = $TotalSeconds + $TimeToTag.TotalSeconds
      $TimeToTagFormatted = "{0:dd}d:{0:hh}h:{0:mm}m" -f $TimeToTag
      # Add the data about our record
      $DataLine = [PSCustomObject] @{
         Workload          = $AuditData.Workload
         Recording         = $AuditData.DestinationFileName
         "Retention Label" = $AuditData.DestinationLabel
         "Tagging Date"    = Get-Date($AuditData.CreationTime) -format g
         "Recording date"  = Get-Date($RecordingDate) -format g
         "Days to label"   = $TimeToTagFormatted
         Site              = $AuditData.SiteURL
         FullURL           = $AuditData.ObjectId
      }
      $TaggedRecordings.Add($DataLine)
   } # End if
} # End ForEach
After processing all the audit records, I know what Teams meeting recordings the auto-label policy has labelled and how long it took on average for an item to receive a label.
25 audit records found for auto-applying the Teams recordings retention label between 09/06/2021 19:36:43 and 23/06/2021 19:36:43
Average elapsed time to auto-label recordings: 02d:13h:28m
The report file is available in C:\temp\TaggedTeamsRecordings.csv.
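The loop accumulates $TotalSeconds but the average itself is calculated afterwards. A minimal sketch of that final step, reconstructed from the accumulator (so treat the exact variable names as assumptions):

```powershell
# Sketch: turn the accumulated seconds into the average elapsed time shown in the output.
# $TotalSeconds and $TaggedRecordings come from the processing loop above.
$AverageTimeToTag = New-TimeSpan -Seconds ([int]($TotalSeconds / $TaggedRecordings.Count))
Write-Host ("Average elapsed time to auto-label recordings: {0:dd}d:{0:hh}h:{0:mm}m" -f $AverageTimeToTag)
```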
The average time between creation and labeling depends on the gap between the meeting and when the labeling job runs. This job seems to be on a weekly work cycle and usually runs over the weekend, so labeling a recording can take anything up to a week. An average of between two and four days is normal given that Teams captures new meeting recordings over the work week.
The same technique can be applied to track the progress of any auto-label policy.
Keep up with the changing world of the Microsoft 365 ecosystem by subscribing to the Office 365 for IT Pros eBook. Monthly updates mean that our subscribers learn about new developments as they happen.
]]>Sometimes I hate PowerShell. Not the language itself, just my ineptitude, or my inability to remember how to do things, or the speed of some cmdlets which deal with objects like mailboxes and groups. It’s not that the cmdlets are inefficient. They do a lot of work to retrieve information about objects, so they are slow.
This is fine for ad-hoc queries or where you only need to process a couple of hundred mailboxes or groups. The problem is accentuated as numbers grow, and once the need exists to process thousands of objects, some significant time is spent waiting for cmdlets to complete, meaning that scripts can take hours to run.
Microsoft has made significant progress in the Exchange Online PowerShell module to introduce faster cmdlets like Get-ExoMailbox and Get-ExoMailboxStatistics. These REST-based cmdlets are faster and more robust than their remote PowerShell cousins and these improvements are ample justification for the work needed to revisit and upgrade scripts. The module also supports automatic renewal of sessions to Exchange Online and the Security and Compliance endpoints, so it’s all good.
Things aren’t so impressive with Get-UnifiedGroup, which retrieves details about Microsoft 365 Groups. Reflecting the use of Microsoft 365 groups, Get-UnifiedGroup is a complex cmdlet which assembles details from Azure AD, Exchange Online, and SharePoint Online to give a full picture of group settings. Running Get-UnifiedGroup to fetch details of 200 groups is a slow business; running the cmdlet to fetch details of 10,000 groups is a day-long task. The Get-Team cmdlet is no speedster either. In their defense, Microsoft designed these cmdlets for general-purpose interaction with Groups and Teams and not to be the foundation for reporting thousands of objects over a short period.
If you only need a list of Microsoft 365 Groups, it’s also possible to create the list using the Get-Recipient cmdlet.
Get-Recipient -RecipientTypeDetails GroupMailbox -ResultSize Unlimited
Creating a list of groups with Get-Recipient is usually much faster than creating it with Get-UnifiedGroup. However, although you end up with a list of groups, Get-Recipient doesn’t return any group-related properties, so you usually end up running Get-UnifiedGroup to retrieve settings for an individual group before you can process it. Still, that overhead can be spread out over the processing of a script and might only be needed for some but not all groups.
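One hedged way to spread that overhead out is to fetch the lightweight list first and only pay the Get-UnifiedGroup cost for the groups that actually need deeper processing. The filter below is purely illustrative:

```powershell
# Sketch: fast list with Get-Recipient, full settings only where needed.
[array]$Groups = Get-Recipient -RecipientTypeDetails GroupMailbox -ResultSize Unlimited
ForEach ($Group in $Groups) {
   If ($Group.DisplayName -like "*Project*") {   # hypothetical filter - only some groups need detail
      $GroupDetail = Get-UnifiedGroup -Identity $Group.ExternalDirectoryObjectId
      Write-Host ("{0}: hidden from address lists = {1}" -f $GroupDetail.DisplayName, $GroupDetail.HiddenFromAddressListsEnabled)
   }
}
```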
Which brings me to the Microsoft Graph API for Groups. As I’ve pointed out for some years, using Graph APIs with PowerShell is a nice way to leverage the approachability of PowerShell and the power of the Graph. The script to create a user activity report from Graph data covering Exchange, SharePoint, OneDrive, Teams, and Yammer is a good example of how accessible the Graph is when you get over the initial learning curve.
Three years ago, I wrote about a script to find obsolete Teams and Groups based on the amount of activity observed in a group across Exchange Online, SharePoint Online, and Teams. In turn, that script was based on an earlier script which processed only Office 365 Groups. Since then, I have tweaked the script in response to comments and feedback and everything worked well. Except that is, once the script ran in large environments supporting thousands of groups. The code worked, but it was slow, and prone to time-outs and failures.
The solution was to dump as many PowerShell cmdlets as possible and replace them with Graph calls. The script (downloadable from GitHub) now uses the Graph to retrieve the list of groups plus the group settings and activity data previously fetched with cmdlets.
The result is that the script is much faster than before and can deal with thousands of groups in a reasonable period. Fetching the group list still takes time as does fetching all the bits that Get-UnifiedGroup returns automatically. On a good day when the service is lightly loaded, the script takes about six seconds per group. On a bad day, it could be eight seconds. Even so, the report (Figure 1) is generated about three times faster.
Results - Teams and Microsoft 365 Groups Activity Report V5.1
--------------------------------------------------------------
Number of Microsoft 365 Groups scanned :                          199
Potentially obsolete groups (based on document library activity): 121
Potentially obsolete groups (based on conversation activity) :    130
Number of Teams-enabled groups :                                  72
Percentage of Teams-enabled groups :                              36.18%

Total Elapsed time: 1257.03 seconds
Summary report in c:\temp\GroupsActivityReport.html and CSV in c:\temp\GroupsActivityReport.csv
The only remaining use of an “expensive” cmdlet in the script is when Get-ExoMailboxFolderStatistics fetches information about compliance items for Teams stored in Exchange Online mailboxes. The need for this call might disappear soon when Microsoft eventually ships the Teams usage report described in message center notification MC234381 (no sign so far despite a promised delivery of late February). Hopefully, that report will include an update to the Teams usage report API to allow fetching of team activity data like the number of conversations over a period. If this happens, I can eliminate calling Get-ExoMailboxFolderStatistics and gain a further speed boost.
The downsides of using the Graph with PowerShell are that you need to register an app in Azure Active Directory and make sure that the app has the required permissions to access the data. This soon becomes second nature, and once done, being able to process data faster than is possible using the general-purpose Get-UnifiedGroup and Get-Team cmdlets is a big benefit when the time comes to process more than a few groups at one time.
]]>The basics of Office 365 licensing are well known. Users access services through service plans bundled in composite plans like Office 365 E3 or E5 or individual offerings like Azure AD Premium P1. Users must have the relevant licenses to access a service like Exchange Online or Teams. Information about the licenses assigned to users is stored in their Azure AD accounts. This context helps us understand how to begin answering licensing questions that the Microsoft 365 admin center can’t answer (Figure 1).
The admin center tells you what licenses you have, the licenses assigned and available, and the accounts with assigned licenses. You can export lists of users with a selected license to a CSV file for reporting purposes or to import into Power BI for analysis. But one thing you can’t do is to find out what users have licenses for applications assigned through a composite license.
Take the example of Teams, Exchange Online, and SharePoint Online. These are core services bundled into the Office 365 E3 and E5 plans. You could assume that everyone with an E3 or E5 license can use these applications, but that’s not true because administrators can remove the service plans for applications from individual user accounts (a service plan is effectively a license for a specific application bundled into a plan; you can’t buy a service plan). Take the example shown in Figure 2. The user has an Office 365 E3 license, but the service plans for Bookings, Forms, and Kaizala have been removed.
It’s relatively common to find that organizations remove individual service plans from users until they are ready to deploy an application. For instance, you might want to use Exchange, SharePoint, and OneDrive for Business immediately but want to block user access to Teams, Forms, Stream, and other applications bundled in Office 365 E3 or E5 until local support is ready and user training is available.
While the admin center doesn’t support reporting of service plans for individual applications, it’s possible to do this with some straightforward PowerShell. The key is to discover how to retrieve the licensing information from Azure AD accounts.
Licensing information is in the AssignedPlans property of an Azure AD account. If you examine the property, you’ll see a bunch of assignments and deletions as licenses are added to and removed from the account.
(Get-AzureADUser -ObjectId Andy.Ruth@office365itpros.com).AssignedPlans

AssignedTimestamp   CapabilityStatus Service          ServicePlanId
-----------------   ---------------- -------          -------------
28/01/2021 22:11:05 Deleted          OfficeForms      2789c901-c14e-48ab-a76a-be334d9d793a
28/01/2021 22:11:05 Deleted          MicrosoftKaizala aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1
28/01/2021 22:11:05 Enabled          CRM              95b76021-6a53-4741-ab8b-1d1f3d66a95a
The ServicePlanId is the important piece of information because it stores the unique identifier (a GUID) for the plan. Microsoft publishes an online list of application service plan identifiers for reference. The point to remember is that the same service plan identifier is always used. For instance, 2789c901-c14e-48ab-a76a-be334d9d793a is always Forms Plan E3 (the license for the Forms application included in Office 365 E3).
To confirm this, let’s use the Get-AzureADSubscribedSku cmdlet to retrieve the set of licenses known in a tenant.
$Licenses = (Get-AzureADSubscribedSku)
$Licenses | Select -Property SkuPartNumber, ConsumedUnits -ExpandProperty PrepaidUnits | Format-Table

SkuPartNumber                ConsumedUnits Enabled Suspended Warning
-------------                ------------- ------- --------- -------
STREAM                       4             10000   0         0
EMSPREMIUM                   5             5       0         0
ENTERPRISEPACK               22            25      0         0
FLOW_FREE                    3             10000   0         0
POWER_BI_STANDARD            5             1000000 0         0
ENTERPRISEPREMIUM_NOPSTNCONF 5             5       0         0
TEAMS_EXPLORATORY            0             100     0         0
SMB_APPS                     2             3       0         0
RIGHTSMANAGEMENT_ADHOC       3             50000   0         0
The online documentation tells us that the name of the Office 365 E3 SKU is ENTERPRISEPACK. It is license number three in our list, so we can look at this object to find out what’s included. As expected, the Service Plan Identifier for FORMS_PLAN_E3 is 2789c901-c14e-48ab-a76a-be334d9d793a.
$Licenses[2].ServicePlans | Format-Table ServicePlanName, ServicePlanId

ServicePlanName              ServicePlanId
---------------              -------------
POWER_VIRTUAL_AGENTS_O365_P2 041fe683-03e4-45b6-b1af-c0cdc516daee
CDS_O365_P2                  95b76021-6a53-4741-ab8b-1d1f3d66a95a
PROJECT_O365_P2              31b4e2fc-4cd6-4e7d-9c1b-41407303bd66
DYN365_CDS_O365_P2           4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14
MICROSOFTBOOKINGS            199a5c09-e0ca-4e37-8f7c-b05d533e1ea2
KAIZALA_O365_P3              aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1
MICROSOFT_SEARCH             94065c59-bc8e-4e8b-89e5-5138d471eaff
WHITEBOARD_PLAN2             94a54592-cd8b-425e-87c6-97868b000b91
MIP_S_CLP1                   5136a095-5cf0-4aff-bec3-e84448b38ea5
MYANALYTICS_P2               33c4f319-9bdd-48d6-9c4d-410b750a4a5a
BPOS_S_TODO_2                c87f142c-d1e9-4363-8630-aaea9c4d9ae5
FORMS_PLAN_E3                2789c901-c14e-48ab-a76a-be334d9d793a
STREAM_O365_E3               9e700747-8b1d-45e5-ab8d-ef187ceec156
Deskless                     8c7d2df8-86f0-4902-b2ed-a0458298f3b3
FLOW_O365_P2                 76846ad7-7776-4c40-a281-a386362dd1b9
POWERAPPS_O365_P2            c68f8d98-5534-41c8-bf36-22fa496fa792
TEAMS1                       57ff2da0-773e-42df-b2af-ffb7a2317929
PROJECTWORKMANAGEMENT        b737dad2-2f6c-4c65-90e3-ca563267e8b9
SWAY                         a23b959c-7ce8-4e57-9140-b90eb88a9e97
INTUNE_O365                  882e1d05-acd1-4ccb-8708-6ee03664b117
YAMMER_ENTERPRISE            7547a3fe-08ee-4ccb-b430-5077c5041653
RMS_S_ENTERPRISE             bea4c11e-220a-4e6d-8eb8-8ea15d019f90
OFFICESUBSCRIPTION           43de0ff5-c92c-492b-9116-175376d08c38
MCOSTANDARD                  0feaeb32-d00e-4d66-bd5a-43b5b83db82c
SHAREPOINTWAC                e95bec33-7c88-4a70-8e19-b10bd9d0c014
SHAREPOINTENTERPRISE         5dbe027f-2339-4123-9542-606e4d348a72
EXCHANGE_S_ENTERPRISE        efb87545-963c-4e0d-99df-69c6916d9eb0
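Indexing into the array ($Licenses[2]) only works while ENTERPRISEPACK happens to be the third SKU returned. A more robust sketch selects the SKU by its part number instead:

```powershell
# Find the Office 365 E3 SKU by name rather than by array position
$E3 = Get-AzureADSubscribedSku | Where-Object {$_.SkuPartNumber -eq "ENTERPRISEPACK"}
$E3.ServicePlans | Where-Object {$_.ServicePlanName -eq "FORMS_PLAN_E3"}
```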
Now that we know how service plan identifiers work and how to find their values, we can use this knowledge to build a script to interrogate Azure AD user accounts to find license data for an application.
Not everyone likes inputting GUIDs, so we’ll make it easier by allowing an application name to be used for the query. The code creates a hash table of service plan identifiers and names (feel free to add more if you want) and then retrieves details of Azure AD user accounts. We ask the user to enter an application to check and validate the response against the hash table. Finally, we loop through the set of Azure AD accounts to check if the license is in their assigned set and report the details. Here’s the code (you can download it from GitHub):
$Plans = @{}
$Plans.Add("199a5c09-e0ca-4e37-8f7c-b05d533e1ea2", "Bookings")
$Plans.Add("efb87545-963c-4e0d-99df-69c6916d9eb0", "Exchange Online")
$Plans.Add("5dbe027f-2339-4123-9542-606e4d348a72", "SharePoint Online")
$Plans.Add("7547a3fe-08ee-4ccb-b430-5077c5041653", "Yammer")
$Plans.Add("882e1d05-acd1-4ccb-8708-6ee03664b117", "Intune")
$Plans.Add("57ff2da0-773e-42df-b2af-ffb7a2317929", "Teams")
$Plans.Add("2789c901-c14e-48ab-a76a-be334d9d793a", "Forms")
$Plans.Add("9e700747-8b1d-45e5-ab8d-ef187ceec156", "Stream")
$Plans.Add("b737dad2-2f6c-4c65-90e3-ca563267e8b9", "Planner")
Write-Host "Finding Azure AD Account Information"
$Users = Get-AzureADUser -All $True -Filter "Usertype eq 'Member'"
CLS
$Product = Read-Host "Enter the Office 365 application for a license check"
If (!($Plans.ContainsValue($Product))) { # Not found
   Write-Host "Can't find" $Product "in our set of application SKUs"; break
}
ForEach ($Key in $Plans.Keys) { # Look up the hash table to find the product SKU
   If ($Plans[$Key] -eq $Product) { $PlanId = $Key }
}
$PlanUsers = [System.Collections.Generic.List[Object]]::new()
ForEach ($User in $Users) {
   If ($PlanId -in $User.AssignedPlans.ServicePlanId) {
      $Status = ($User.AssignedPlans | ? {$_.ServicePlanId -eq $PlanId} | Select -ExpandProperty CapabilityStatus)
      $ReportLine = [PSCustomObject] @{
         User       = $User.DisplayName
         UPN        = $User.UserPrincipalName
         Department = $User.Department
         Country    = $User.Country
         SKU        = $PlanId
         Product    = $Product
         Status     = $Status
      }
      $PlanUsers.Add($ReportLine)
   }
}
Write-Host "Total Accounts scanned:" $PlanUsers.Count
$DisabledCount = $PlanUsers | ? {$_.Status -eq "Deleted"}
$EnabledCount  = $PlanUsers | ? {$_.Status -eq "Enabled"}
Write-Host ("{0} is enabled for {1} accounts and disabled for {2} accounts" -f $Product, $EnabledCount.Count, $DisabledCount.Count)
$PlanUsers | Sort User | Out-GridView
You can also use the Users Graph API to fetch license information for Azure AD accounts by running a call like:
https://graph.microsoft.com/v1.0/users?$filter=userType eq 'Member'&$select=id, displayName, licenseassignmentstates, assignedplans
The code to check the AssignedPlans data for a product identifier is the same. Although the Graph is usually faster than PowerShell cmdlets, in this instance only one call is needed, and the speed difference is marginal.
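If you want to run the query from PowerShell rather than pasting the URL into a browser, one way is via the Microsoft Graph PowerShell SDK’s Invoke-MgGraphRequest cmdlet. This is a sketch that assumes a Connect-MgGraph session with the User.Read.All permission; the paging loop handles tenants with more users than fit in a single page of results:

```powershell
# Sketch: fetch member accounts with their assigned plans via the Graph.
# Assumes Connect-MgGraph has already established a session with User.Read.All.
$Uri = "https://graph.microsoft.com/v1.0/users?`$filter=userType eq 'Member'&`$select=id,displayName,assignedPlans"
$Data = Invoke-MgGraphRequest -Method GET -Uri $Uri
[array]$Users = $Data.Value
While ($Data.'@odata.nextLink') {         # follow paging links until all users are fetched
   $Data = Invoke-MgGraphRequest -Method GET -Uri $Data.'@odata.nextLink'
   $Users += $Data.Value
}
```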
As ever, if you plan to use the Graph to fetch data, testing call syntax and returns using the Graph Explorer tool is a good thing to do. Figure 3 shows the result of querying the Graph to return user license data.
Because the script looks for a specific service plan identifier, it finds every instance of a licensed application. In other words, if you search for an application like Exchange Online, which is included as EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0) in both Office 365 E3 and E5, the report will list accounts enabled for Exchange in both plans. If you want to differentiate between the two plans, you need to check the AssignedLicenses property of each account for the identifier of the plan. Microsoft’s reference list publishes the SKU identifiers for the Office 365 E3 and E5 plans.
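A sketch of that check follows. The SKU identifiers are the commonly published ones for ENTERPRISEPACK (E3) and ENTERPRISEPREMIUM (E5); verify them against Microsoft’s reference list before relying on them:

```powershell
# Sketch: determine which plan delivered the Exchange service plan to an account.
$E3SkuId = "6fd2c87f-b296-42f0-b197-1e91e994b900"   # ENTERPRISEPACK (Office 365 E3)
$E5SkuId = "c7df2760-2c81-4ef7-b578-5b5392b571df"   # ENTERPRISEPREMIUM (Office 365 E5)
$User = Get-AzureADUser -ObjectId Jessica.Chen@office365itpros.com
If ($E3SkuId -in $User.AssignedLicenses.SkuId) { Write-Host "Account holds Office 365 E3" }
If ($E5SkuId -in $User.AssignedLicenses.SkuId) { Write-Host "Account holds Office 365 E5" }
```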
The script available from GitHub includes code to output the names of license SKUs.
The information in the report can be saved to a CSV file or viewed online. Figure 4 shows the result of the script as viewed through the Out-GridView cmdlet. We can see that the user whose Forms license we removed (Figure 2) is reported accurately.
You might not need to interrogate Azure AD for details of individual licenses very often, but if you do (as when preparing to enable an application for a bunch of users), it’s much faster to get the information with PowerShell than using the admin center GUI.
For more great information about how licensing works, subscribe to the Office 365 for IT Pros eBook.
]]>Updated 18 June 2023
A reader asked how to create a report of the membership of Microsoft 365 role groups. Although this sounds like a straightforward question, the answer is complex. Here’s why.
Originally, compliance functionality was workload-based. Exchange Online had its own features as did SharePoint Online. In 2016, Microsoft introduced the Office 365 Security and Compliance Center (SCC) to bring together functionality which applied across all workloads. Permissions for the SCC follow the Exchange Online Role-Based Access Control (RBAC) model. Users receive permissions to perform actions through membership of role groups. If your account is a member of the right role group, you can perform a compliance action, like running a content search or managing an eDiscovery case. If it’s not, you won’t see the options to perform those actions displayed in the SCC.
Here’s where the situation becomes complicated. We are in the middle of a transition from the SCC to the Microsoft 365 compliance center, which Microsoft launched in 2018. Three years and a lot of confusion later, an April 15 blog post warns that Microsoft will soon start to redirect users automatically from the SCC to the Microsoft 365 compliance center. Message center notification MC256030 posted on May 12 confirms that a new permissions management page in the Microsoft 365 compliance center will make role management easier (Microsoft 365 roadmap item 72239).
Update: The Microsoft 365 compliance center is now the Microsoft Purview compliance portal. I’ve also updated the PowerShell code in this post to use the Microsoft Graph PowerShell SDK.
The new permissions management page (Figure 1) allows management for both Azure AD roles and compliance center roles (more correctly, role groups). The differences between the two are:
The permission management page shows Azure AD roles used to perform compliance tasks. Currently, the page lists nine Azure AD roles like compliance administrator and compliance data administrator. Other Azure AD roles like Teams Administrator don’t appear because they are not associated with compliance management.
Returning to the original question of how to generate a report about the holders of different compliance roles, the answer depends on whether you want to report the membership of compliance role groups or Azure AD roles. Given that more functionality is governed by the latter type at present, the following code is a solution.
The steps to create the report are to connect to the compliance endpoint, fetch the set of role groups with the Get-RoleGroup cmdlet, resolve the members of each role group to display names, and output the result.
Here’s the code:
Connect-IPPSSession
[array]$RoleGroups = Get-RoleGroup
$Report = [System.Collections.Generic.List[Object]]::new()
ForEach ($RoleGroup in $RoleGroups) {
   $Members = $RoleGroup.Members
   $MemberNames = [System.Collections.Generic.List[Object]]::new()
   ForEach ($Member in $Members) {
      $MemberName = (Get-ExoMailbox -Identity $Member.SubString(($Member.IndexOf("onmicrosoft.com/")+16),36) -ErrorAction SilentlyContinue).DisplayName
      $MemberNames.Add($MemberName)
   }
   If ($RoleGroup.WhenChanged -eq "Wednesday 1 January 2020 00:00:00") {
      $RoleGroupChanged = "Never"
   } Else {
      $RoleGroupChanged = Get-Date($RoleGroup.WhenChanged) -format g
   }
   $MemberNames = $MemberNames -join ", "
   $ReportLine = [PSCustomObject][Ordered]@{
      "Role Group"   = $RoleGroup.DisplayName
      "Members"      = $MemberNames
      "Last Updated" = $RoleGroupChanged
   }
   $Report.Add($ReportLine)
} # End ForEach $RoleGroup
$Report | Sort-Object "Role Group" | Out-GridView
Figure 2 shows what the report looks like. A simple Export-CSV command will write the details out to a CSV file if you want to manipulate the data in Excel.
The same approach works to create a report for the Azure AD roles. In this case, you use the Get-MgDirectoryRole cmdlet to find the set of roles and Get-MgDirectoryRoleMember cmdlet to process each role (here’s an example of using these cmdlets to report on Microsoft 365 admin accounts which aren’t protected by multi-factor authentication).
Connect-MgGraph -Scopes Directory.Read.All
[array]$RoleGroups = Get-MgDirectoryRole | Sort-Object DisplayName
$Report = [System.Collections.Generic.List[Object]]::new()
ForEach ($RoleGroup in $RoleGroups) {
   [array]$Members = Get-MgDirectoryRoleMember -DirectoryRoleId $RoleGroup.Id
   $MemberNames = $Members.AdditionalProperties.displayName -join ", "
   $ReportLine = [PSCustomObject][Ordered]@{
      "Role Group"  = $RoleGroup.DisplayName
      "Members"     = $MemberNames
      "Description" = $RoleGroup.Description
   }
   $Report.Add($ReportLine)
} # End ForEach $RoleGroup
$Report | Out-GridView
Simple questions often have complex answers. In this case, it’s a matter of deciding what kind of role holders you want to report. Once you know that, the PowerShell to generate the report is relatively straightforward.
Learn lots more about how different parts of Office 365 work by subscribing to the Office 365 for IT Pros eBook. We go where other writing teams don’t, and we keep our book refreshed with monthly updates.
]]>Updated 12 June 2023
In an earlier post, I cover the basics of updating the Azure AD B2B collaboration settings for a Microsoft 365 tenant. Azure AD B2B collaboration external settings allow the tenant to define a deny list of domains they do not want guest accounts to come from or an allow list to define a restrictive set of domains they’re willing to accept guests from. In my experience, most tenants use a deny list. Once implemented, any attempt to add a new guest account from one of the blocked domains will fail. This happens for applications like Teams and Outlook, and administrative interfaces like the Azure AD admin center (Figure 1).
The Azure AD B2B collaboration settings work well: no new guest users can be added from domains on the deny list. However, the settings do nothing to stop existing guests from those domains continuing to be members of groups and teams within your tenant. Microsoft doesn’t have a facility to detect and remove problem guest users, but it’s relatively easy to do with PowerShell.
We’ve posted a new script called FindBadGuestsFromBlockedDomains.PS1 in the Office 365 for IT Pros GitHub repository. The script scans the membership of the groups and teams in the tenant and checks the home domain of each guest member against the list of blocked domains to find accounts that predate the block.
When the script finishes processing the set of groups, it generates some basic statistics (Figure 2) and a CSV file.
The CSV file (Figure 3) contains the Azure AD object identifier for each guest account found from a banned domain. This is important because you can use this to drive a removal process if necessary.
Before removing a guest account, remember what it will do: the guest loses access to every group, team, and shared resource in the tenant, and the account moves into a soft-deleted state from which it can be restored for 30 days.
Before deleting anything, you should review the contents of the CSV file carefully to check that each account really should be deleted. Any guest account that you want to keep should be removed from the file. The updated file can then act as the input for a removal process. For instance, this PowerShell code reads the CSV file and removes the accounts included in the file.
$BadAccounts = Import-Csv c:\temp\BadGuestAccounts.CSV
ForEach ($Account in $BadAccounts) {
   Write-Host "Removing" $Account."Guest Email"
   Remove-AzureADUser -ObjectId $Account.ObjectId
}
After removing problem accounts, the remaining guest accounts in the tenant comply with the Azure AD B2B collaboration settings. If you decide to remove guest accounts, it’s probably a good idea to email the group/team owners to let them know what you plan to do, just in case a guest account is required.
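If a removal turns out to be a mistake, the soft-delete window gives you a way back. A sketch, assuming the AzureAD module’s Restore-AzureADMSDeletedDirectoryObject cmdlet and the same $Account object used in the removal loop:

```powershell
# Restore a guest account removed in error. A deleted account stays in the
# soft-deleted state for 30 days, after which the deletion becomes permanent.
Restore-AzureADMSDeletedDirectoryObject -Id $Account.ObjectId
```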
Like any of our scripts, the code is written to explain a principle and demonstrate how to construct a solution to a problem. I’m sure the code can be improved, notably by adding better error handling. But it does work (at least in our tenant).
The Office 365 for IT Pros eBook has lots of intensely practical advice to help administrators run tenants. Subscribe to make sure that you benefit from our knowledge.
]]>Without any warning, Microsoft seems to have introduced a restriction to the Set-User cmdlet in the Exchange Online management PowerShell module. The change happens when you connect a new PowerShell session to Exchange Online and the cmdlets are downloaded into a session.
Any attempt to update a user’s business or mobile phone numbers with Set-User now generates an error saying that for security reasons these properties cannot be updated through Exchange Online. Instead, administrators are forced to update the properties through the Azure AD admin center or by using the Azure AD PowerShell module.
Set-User -id $User -Phone "+1 454 146 1412"

Phone and Mobile Phone for users with Recipient Type Details "UserMailbox" cannot be updated in Exchange Online for security reason. Please do it in Azure Active Directory.
    + CategoryInfo          : NotSpecified: (Jessica.Chen@office365itpros.com:UserIdParameter) [Set-User], ShouldNotUpdate...eInExoException
    + FullyQualifiedErrorId : [Server=AM4PR0401MB2289,RequestId=39d0dbcb-05d1-42e6-b8ea-4bc78fc58816,TimeStamp=10/05/2021 22:22:12] [FailureCategory=Cmdlet-ShouldNotUpdatePhoneMobilePhoneInExoException] 1D58FC00,Microsoft.Exchange.Management.RecipientTasks.SetUser
    + PSComputerName        : outlook.office365.com

Set-User -id $User -MobilePhone "+1 464 147 4433"

Phone and Mobile Phone for users with Recipient Type Details "UserMailbox" cannot be updated in Exchange Online for security reason. Please do it in Azure Active Directory.
    + CategoryInfo          : NotSpecified: (Jessica.Chen@office365itpros.com:UserIdParameter) [Set-User], ShouldNotUpdate...eInExoException
    + FullyQualifiedErrorId : [Server=AM4PR0401MB2289,RequestId=4cffcdc2-5bc0-4d17-83bf-a4d7bb67d2a5,TimeStamp=10/05/2021 22:22:21] [FailureCategory=Cmdlet-ShouldNotUpdatePhoneMobilePhoneInExoException] 1D58FC00,Microsoft.Exchange.Management.RecipientTasks.SetUser
    + PSComputerName        : outlook.office365.com
Oddly, Set-User is still able to update other phone numbers such as a user’s phone or alternative number. You can also update someone’s pager number (if they still use one) and their fax number. The fax number also appears as a property for an Azure AD account, so if Microsoft decided to block the business and mobile numbers, it’s strange that they left the fax number alone.
Set-User has been part of Exchange PowerShell since Exchange 2007. The cmdlet is a method to update account properties stored in Active Directory and Azure Active Directory which are important to Exchange (because they appear in address lists). Changing behavior without warning is disruptive to organizations because it might impact scripts used for production purposes, such as taking a feed from a HR system and updating user accounts.
Microsoft’s error message implies that the change happened for security reasons, but as they haven’t explained any detail about why it is a security problem to allow Set-User to update phone numbers, it’s hard to assess what’s going on here. If it’s a problem for Set-User to update telephone numbers, why is it OK for Set-AzureADUser to do the same?
Set-AzureADUser -ObjectId Jessica.Chen@office365itpros.com -TelephoneNumber "+1 550 771 1314" -Mobile "+1 466 146 1453"
Accounts need different permissions to run the two cmdlets. Accounts holding the Exchange administrator role can update mailbox properties (some of which synchronize with Azure AD) but can’t do so using Azure AD interfaces like the Azure AD PowerShell module. It could be that Microsoft wants to tighten the ability of users with workload-specific roles to update Azure AD.
Although I can’t prove this, I suspect that the tightening is at the heart of problems reported with the Set-CsUser cmdlet since the release of V2.3.0 of the Microsoft Teams PowerShell module last month (see this Microsoft Technical Community post for some details of user issues).
Microsoft is much better at communicating change within Office 365 today than they used to be, notably through the Microsoft 365 roadmap and the notifications posted to the Microsoft 365 admin center. This change came out of the blue and landed without warning. People don’t like surprises and always react better if the logic behind an update is clearly explained. Everyone will get behind a change which helps to improve security – unless they find out when their scripts stop working.
]]>Updated: 5 September 2023
The ability for applications to use Entra ID B2B collaboration to add guest users is governed by external collaboration settings, aka the Entra ID B2B collaboration policy (previously the Azure AD B2B Collaboration policy). The settings are available through the External identities section of the Entra ID admin center, where they are found under Collaboration restrictions (Figure 1).
Three options are available:
The total size of the policy must be less than 25 KB (25,000 characters). Each domain in an allow or deny list counts against the limit as do other policy settings. Allowing 1,000 bytes for all other settings, an average of 15 characters per domain means that an allow or deny list can accommodate up to 1,600 domains. You can only choose to have a policy with an allow or a deny list and cannot have some domains in a deny list and others in an allow list.
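The capacity estimate above is simple arithmetic and can be checked directly in PowerShell. The 25,000-character cap comes from the policy limit; the 1,000-character overhead and 15-character average per domain are assumptions for illustration:

```powershell
# Back-of-envelope capacity check for the collaboration policy domain list.
# Assumptions: 25,000-character policy cap, ~1,000 characters reserved for
# other settings, and an average of 15 characters per domain entry.
$PolicyCap        = 25000
$SettingsOverhead = 1000
$AvgDomainChars   = 15
$MaxDomains = [math]::Floor(($PolicyCap - $SettingsOverhead) / $AvgDomainChars)
$MaxDomains   # 1600
```

Shorter domains stretch the limit further; longer ones (think third-level education domains) shrink it.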
In my case, I use the middle approach to block guest accounts from certain domains. For instance, these might be domains belonging to direct competitors or domains used for consumer rather than business purposes. In Figure 1, you can see that I’ve decided to block access to guests with Google.com and Yahoo.com email addresses.
Entra ID applies the block rather than the applications. For example, in Figure 2, I’ve tried to add a new guest account to Teams, which doesn’t object when I enter tredmondxxxx@yahoo.com to identify the guest. The block only takes effect when Teams tries to create the new guest account in Entra ID. “Something went wrong” is a vague error, but it should be enough for an administrator to understand what happened once they learn where the guest comes from. OWA doesn’t object to the email address for a new guest either and is no more definite in its error (Figure 3). Again, this is because the application fails to create a new guest account in Entra ID.
Before going ahead to update your external collaboration settings, it’s a good idea to understand where current guest accounts come from. This code scans down through guest accounts found in Entra ID to capture details of each user’s home domain. It then populates a hash table with the domain information to create a count for each, followed by sorting in descending order to discover the most popular domains:
$Domains = [System.Collections.Generic.List[Object]]::new()
Connect-MgGraph -NoWelcome -Scopes Directory.Read.All
[array]$Guests = (Get-MgUser -All -Filter "UserType eq 'Guest'" | Select-Object DisplayName, UserPrincipalName, Mail, Id | Sort-Object DisplayName)
ForEach ($Guest in $Guests) {
   $Domain = ($Guest.UserPrincipalName.Split("#EXT#")[0]).Split("_")[1]
   $Domains.Add($Domain)
}
$DomainsCount = @{}
$Domains = $Domains | Sort-Object
$Domains | ForEach-Object { $DomainsCount[$_]++ }
$DomainsCount = $DomainsCount.GetEnumerator() | Sort-Object -Property Value -Descending
$DomainsCount

Name          Value
----          -----
microsoft.com 59
outlook.com   11
quest.com     6
hotmail.com   5
gmail.com     4
emea.teams.ms 4
Now you know what domains are actively in use, you can decide which you might like to ban. Remember that putting a domain on the deny list stops only the creation of new guest accounts. Existing guest accounts remain in the membership of groups and teams. If you want to purge accounts from unwanted domains, you need to find the groups (teams) with guest members and examine each guest to decide if they can stay or be removed. It’s easy enough to find guests from banned domains with PowerShell, or so the saying goes…
The Office 365 for IT Pros eBook is packed full of practical information like this. Learn from the pros by subscribing to Office 365 for IT Pros and receive monthly updates during your subscription period.
It’s easy for tenant administrators to add photos for guest accounts using the Azure AD portal. They can also run the Set-AzureADUserThumbnailPhoto cmdlet to do the same job. The difference is that apps can display Azure AD guest photos where otherwise they’d show the default initials (Figure 1).
What isn’t easy is for people who have guest accounts in other Microsoft 365 tenants to update their photo without administrator intervention. Microsoft blocks guest users from the features built into apps like Teams to allow users to update their photos, probably because guest accounts are not subject to the OWA mailbox policies which control this feature for tenant accounts.
Then MVP Yannick Reekmans published a blog to explain how he used the Azure AD portal to update a guest account in another tenant. The article explains how to find the GUID of the guest account in the target tenant and how to use the GUID to update the account. The method certainly works, but it’s a tad overcomplicated for my taste.
PowerShell makes the task very easy. Here’s how to do the job in three steps.
The key to this method is to use cmdlets in the Azure AD or Azure AD Preview modules. Make sure to download and install one of these modules on your workstation. Then, run the Connect-AzureAD cmdlet to connect to the service domain of the target tenant.
The service domain is the sub-domain in onmicrosoft.com used by the tenant. For example, to connect to the Office365ITPros.com tenant, we’d use the command:
Connect-AzureAD -Tenant Office365ITPros.onmicrosoft.com
Azure AD prompts you to authenticate. Sign in with your normal account and its password (plus MFA, if required by the tenant). Your home account maps to the guest account, so after authentication you access the target tenant as the guest. If you don’t know the service domain for the target tenant, use the What’s My Tenant ID site to find the tenant GUID and use that to sign in. For example:
Connect-AzureAD -Tenant 72f988bf-86f1-41af-91ab-2d7cdxab647
Then run the Get-AzureADTenantDetail cmdlet and examine the VerifiedDomains property to find the service domain.
(Get-AzureADTenantDetail | Select-Object -ExpandProperty VerifiedDomains | Where-Object {$_.name -match "onmicrosoft"}).Name
You can reference Azure AD accounts with the GUID (object identifier) or user principal name (UPN). The UPN is usually easier to figure out because it follows a set format. For instance, the guest account for the account with UPN Tony.Redmond@office365itpros.com is:
tony.redmond_office365itpros.com#EXT#@xxxxx.onmicrosoft.com
Where “xxxxx” is the name of the target tenant.
To make things easier, we put the UPN into a variable:
$UPN = "tony.redmond_office365itpros.com#EXT#@xxxxx.onmicrosoft.com"
Azure AD needs a suitable photo file (JPEG or PNG) to update a user’s image. Unlike Exchange Online, which stores a higher resolution form of photo data for use by Microsoft 365 apps, Azure AD stores only small thumbnail images. These images are acceptable for the small photos seen in Teams conversations or in browser menu bars, but not for attendee cards used in Teams meetings, so they do not appear everywhere within Microsoft 365.
The maximum size of the input file is 100 KB. I’ve had good results with square photos measuring 500 x 500 pixels. You might have to play with a photo editor to create a good photo of the right size, but once you have one, you can write it into Azure AD using the Set-AzureADUserThumbnailPhoto cmdlet:
Set-AzureADUserThumbnailPhoto -ObjectId $UPN -FilePath c:\temp\MyPhoto.jpg
If you don’t see an error, you know Azure AD is happy with the photo. To check, you can run the Get-AzureADUserThumbnailPhoto cmdlet. Any response is good:
Get-AzureADUserThumbnailPhoto -ObjectId $UPN

Tag                  :
PhysicalDimension    : {Width=500, Height=500}
Size                 : {Width=500, Height=500}
Width                : 500
Height               : 500
HorizontalResolution : 95.9866
VerticalResolution   : 95.9866
Flags                : 77842
RawFormat            : [ImageFormat: b96b3caf-0728-11d3-9d7b-0000f81ef32e]
PixelFormat          : Format32bppArgb
Palette              : System.Drawing.Imaging.ColorPalette
FrameDimensionsList  : {7462dc86-6180-4c7e-8e3f-ee7333a7a483}
PropertyIdList       : {769, 305, 20752, 20753...}
PropertyItems        : {769, 305, 20752, 20753...}
Like any operation involving photo manipulation for Azure AD accounts, it takes some time for applications to refresh their caches and pick up new photos. You should expect that this will happen within a day. And once it does, you’ll see your bright smiling face in places where only your initials were before (Figure 2).
And then all you need to do is to rinse and repeat the process for every tenant where you have a guest account (possibly some of which you have forgotten). For whatever reason, some tenants always seem to be slower than others to respect photo updates. I have no idea why this happens. Stay patient and the photos should turn up eventually.
It’s good when guest accounts have photos. People like to know with whom they collaborate, and a photo is a much better reminder of a person than their initials can ever be. Tenant administrators might be concerned that guest users can sign into their tenant to update their photos. It’s true that guests could exploit this technique to display an inappropriate image. If they do, I’m sure that action will follow quickly, just like it would if a tenant user selected a distasteful photo. Another concern might be that guests might be able to update other account properties, like the display name. Much as I would like to do this, I haven’t been able to in any of the tenants where I tried. Azure AD allows me to update my photo but stops me doing anything else to my guest account. Which is how it should be.
Need to know more about managing guest accounts in an Office 365 tenant? The Office 365 for IT Pros eBook is packed full of advice and guidance on this and many other topics.
I find writing a PowerShell script to be a peaceful activity. It focuses the mind on a gainful task. Until something goes wrong, that is. Take the example of a script to create a report of managers and their direct reports in a tenant. Simple, I thought. It shouldn’t take longer than an hour, even to make it pretty. Then I lost myself in examining why the reported number of direct reports for a manager showed up as a blank even when I knew that the manager had a direct report. It’s amazing how deep down the rabbit hole the mind will go if allowed.
I solved the problem by typing the variable used to accept the data returned by the cmdlet to find a manager’s direct report. The problem then is to understand why the issue exists and if it is related to a single cmdlet or all cmdlets. As it turns out, the answer is more complicated than I first thought.
Take this example. We have details of a manager’s Azure AD account stored in a variable.
$Manager | Ft DisplayName, ExternalDirectoryObjectId

DisplayName    ExternalDirectoryObjectId
-----------    -------------------------
Oisin Johnston c6133be4-71d4-47c4-b109-e37c0c93f8d3

$Dn = $Manager.DistinguishedName
We now call the Get-User cmdlet to find the manager’s direct reports. The output goes to an untyped variable called $R.
$R = Get-User -Filter "Manager -eq '$Dn'"
$R.Count
At first glance, the call doesn’t return any data and the $R variable doesn’t report a count property. However, I think some data should be there. If I type the variable to make it an array and try again, PowerShell returns the Count property as expected.
[array]$R = Get-User -Filter "Manager -eq '$Dn'"
$R = Get-User -Filter "Manager -eq '$Dn'"
$R.count

1

$R | ft DisplayName

DisplayName
-----------
Brian Weakliam (Operations)
Apparently, when PowerShell returns a single item, it unwraps the item and returns an object instead of an array. Using the GetType() method to examine the object, you’ll see that it’s a PSObject based on System.Object rather than a System.Array.
$R.GetType()

IsPublic IsSerial Name     BaseType
-------- -------- ----     --------
True     True     PSObject System.Object
Forcing the variable to be an array means that PowerShell doesn’t unpack the single item to become an object. Because the $R variable continues to be an array, the count property is available. According to Microsoft documentation, arrays with one or zero items have count properties from PowerShell 3.0 on.
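The unwrapping behavior is easy to reproduce locally without any Exchange cmdlets. In this demo (my own example; Get-OneItem is a made-up stand-in for any cmdlet that returns exactly one object), only the type-constrained variable ends up holding an array:

```powershell
# Get-OneItem stands in for any cmdlet that happens to return one object
function Get-OneItem { [PSCustomObject]@{ DisplayName = 'Brian Weakliam' } }

$Untyped = Get-OneItem          # PowerShell hands back the bare object
[array]$Typed = Get-OneItem     # the constraint forces a one-element array

$Untyped -is [array]            # False
$Typed -is [array]              # True
$Typed.Count                    # 1
```

The type constraint persists for the life of the variable, so later assignments to $Typed are also coerced to arrays.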
The same result happens with other Exchange Online cmdlets like Get-ExoMailbox and Get-UnifiedGroup. Here’s an example with Get-ExoMailbox:
$X = Get-ExoMailbox -Identity John.Smith
$X.Count
$X.GetType()

IsPublic IsSerial Name     BaseType
-------- -------- ----     --------
True     True     PSObject System.Object
Great. We think we understand what’s happening and it seems like an excellent idea to type variables used to receive returns from a cmdlet. However, if I repeat the operation to return the same data as found with Get-User above using the Get-AzureADUserDirectReport cmdlet to an untyped variable, PowerShell reports the correct count!
$Q = Get-AzureADUserDirectReport -ObjectId c6133be4-71d4-47c4-b109-e37c0c93f8d3
$Q.count

1

$Q | Ft DisplayName

DisplayName
-----------
Brian Weakliam (Operations)
Examining the variable to see its type, we discover that the Azure AD cmdlet returns a directory object.
$Q.GetType()

IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True     False    User Microsoft.Open.AzureAD.Model.DirectoryObject
The same is true for other Azure AD cmdlets like Get-AzureADUser and Get-AzureADGroup.
Apart from generating an array instead of a directory object, typing the output variable as an array has no effect on the Azure AD cmdlets. They work as expected.
It would be nice if all the PowerShell modules used across Microsoft 365 apps were consistent, but as proven by the team template management cmdlets included in the Microsoft Teams V2.0 module, that’s not always the case. Testing should reveal issues like odd numbers reported for returned items, but who tests everything, especially these days? This old dog learned a new trick and now I type all variables used for returned data. Just in case.
Last November, I wrote about why it’s important to have an accurate tenant directory (Azure AD). That article includes a script to check accounts for missing properties, one of which is the manager. If that property isn’t populated, an account is not listed as a direct report of another account (the manager), which makes it difficult for features like the Teams organizational view or the Microsoft 365 profile card to work properly.
Unless Azure AD receives updates from an HR system, it’s unsurprising when the directory loses track of manager-direct report relationships over time. And if no one checks, the directory will gradually decay to a point where its view of the organization’s structure is worthless. All of which means that we should check what’s in the directory periodically.
I have no idea about how individual companies record management structures, so I can’t help by telling you how to link systems with Azure AD. What I can do is show how easy it is to generate a report of what’s in Azure AD which can be used to identify potential issues to fix.
The idea is to find all the accounts in the tenant which have direct reports. It’s easy enough to do this for a single user. The Get-AzureADUserDirectReport cmdlet exists for this purpose. Once you know the object identifier for an Azure AD account, you can run:
Get-AzureADUserDirectReport -ObjectId eff4cd58-1bb8-4899-94de-795f656b4a18 | ft displayname

DisplayName
-----------
Kim Akers
Jeff Guillet
Ben Owens (Business Director)
Ståle Hansen (Office 365 for IT Pros)
James Ryan
Vasil Michev (Technical Guru)
The Get-AzureADUserManager cmdlet finds the manager of an account.
Get-AzureADUserManager -ObjectId cad05ccf-a359-4ac7-89e0-1e33bf37579e | ft DisplayName

DisplayName
-----------
Tony Redmond
It’s therefore possible to loop down through each user to find out who is their manager with code like this:
$Users = Get-AzureADUser -All:$true
# Now get rid of all the accounts created for room and resource mailboxes, service accounts etc.
$Users = $Users | ? {$_.UserType -eq "Member" -and $_.AssignedLicenses -ne $Null}
ForEach ($User in $Users) {
   $Manager = Get-AzureADUserManager -ObjectId $User.ObjectId
   If ($Manager) {
      Write-Host $User.DisplayName "manager is" $Manager.DisplayName }
   Else {
      Write-Host "No manager found for" $User.DisplayName }
}
Although this approach works, it means that we must find all users and then figure out who their manager is before reporting. A more logical approach is to find the managers in the organization and then work out their direct reports. There’s no straightforward way to do that with the Azure AD cmdlets without using intermediate arrays to store and then analyze who reports to whom, but an easier way exists using the Exchange Get-User cmdlet.
The Get-User cmdlet (part of the Exchange Online Management module) retrieves information about an account. The trick is that it can run a server-side filter to retrieve the direct reports of a manager. Better again, the cmdlet can also tell us which accounts have direct reports (the managers), including those who have no current direct reports.
To find the managers, we can run:
[array]$Managers = Get-User -Filter {DirectReports -ne $null} | Select DisplayName, UserPrincipalName, ExternalDirectoryObjectId, DistinguishedName
Once we know the managers, it’s simple to loop through each account to find their direct reports. The trick here is to filter using the distinguished name attribute for a manager’s account. It’s the same technique as used when Get-Recipient finds the set of groups a user belongs to. The technique is exploited in this article about generating a report of membership of Teams and Microsoft 365 Groups. Here’s how to find the set of direct reports for a manager:
[array]$Reports = Get-User -Filter "Manager -eq '$Dn'"
Once we know how to find managers and their direct reports, it’s easy to turn the data into a report, which is what I’ve done in a PowerShell script available from GitHub. Figure 1 shows the HTML report created by the script. Feel free to customize the report to your heart’s content.
PowerShell tip: We declare variables used to receive query results as arrays to make sure that we get an accurate count if only one item is returned. If you leave PowerShell to decide, when a single item is found by a query, the result is that item. You won’t be able to find the count for the item because it’s not an array or list. But if you explicitly declare the variable as an array, PowerShell respects your choice and you’ll be able to get a count even if only one item is in the array.
The Office 365 for IT Pros eBook is packed full of useful information, tips, and suggestions. That’s why the book is over 1,250 pages long… So much knowledge, so little time to read it all!
In my article about how to decrypt SharePoint Online documents with PowerShell, I explained how to use the Unlock-SPOSensitivityLabelEncryptedFile cmdlet to decrypt protected SharePoint files by removing the sensitivity labels protecting the files. The example script uses cmdlets from the SharePoint PnP module to return a set of files from a folder in a document library for processing, and the unlock cmdlet then removes protection from any file with a sensitivity label.
The script works, but it’s not as flexible as I would like. For instance, because PnP can’t distinguish files with labels, every document in the folder is processed whether it is labelled or not. This does no harm, but it’s not something that you might want to do in the case of something like a tenant-to-tenant migration where thousands of protected documents might need to be processed.
Update May 10, 2021: The latest version of the SharePoint Online PowerShell module contains the Get-FileSensitivityLabelInfo cmdlet. This can be run to return the label status of a file, including if the label assigned to the file encrypts the file. The existence of this cmdlet removes some of the need to use the Graph to find and remove labels from protected files, but the Graph is still the fastest way to get the job done.
Which brings me to an updated version of the script (available from GitHub), which uses the Sites API from the Microsoft Graph to navigate through SharePoint Online and find labelled documents to process. Apart from being able to search for documents with sensitivity labels, a Graph API is usually the fastest way to deal with large numbers of objects.
Because we’re making Graph calls from PowerShell, we need to create a registered app in Azure AD to use as the entry point to the Graph (the same steps as outlined in this post are used). The app needs to be able to read site data, so I assigned it Sites.Read.All and Sites.ReadWrite.All permissions (Figure 1).
The script accepts two parameters: the name of the site to search (not the URL) and an optional folder. If multiple matching sites are found, the user is asked to choose which one to search (Figure 2).
Once a target site is confirmed, the script figures out if a folder is specified and if that folder exists in the chosen site. In Graph terms, we’re now dealing with drive objects. The default drive is the root folder of a document library and each folder is a different drive. To find folders, we need to find the child objects in the root, identify the right folder, find its drive identifier, and use that to find the files in the folder. All good, clean Graph fun.
The Drive API returns a maximum of 200 items at a time, so some Nextlink processing is needed to fetch the complete set of files in a folder. Each file is examined to figure out if it has a sensitivity label with protection, and if so, the display name of the label. After processing all the files, we tell the user what we’ve found and ask permission to go ahead and decrypt the files (Figure 3). If the user chooses not to proceed, the script writes details of the protected files out to a CSV file.
Files are decrypted by calling the Unlock-SPOSensitivityLabelEncryptedFile cmdlet. There’s no native Graph API call to decrypt SharePoint documents. In any case, we’re running a PowerShell script so it’s easy to call the cmdlet.
The script is an example of what’s possible with a combination of PowerShell and Graph API calls. I’m sure that the code and the functionality can be improved (feel free to suggest changes and improvements via GitHub). I’m just happy to demonstrate how things work and how including the Graph enables some extra flexibility.
Read the Office 365 for IT Pros eBook to find much more information about how sensitivity labels work – and many PowerShell examples too!
Without any fanfare, Microsoft released Version 2.0 of the Teams PowerShell module on March 4. You can download and install the new module from the PowerShell Gallery. Here are the commands I used:
Uninstall-Module -Name MicrosoftTeams -AllVersions
Install-Module -Name MicrosoftTeams -Force -Scope AllUsers
The previous production version for the Teams PowerShell module was 1.1.6. The major enhancements in this release are:
Good as these enhancements seem at first reading, issues lurk, and some might ask why Microsoft would issue a new module with such evident challenges.
You can use a script as described in this article to keep your PowerShell modules updated. I usually run the script once monthly to make sure that I pick up any updates I haven’t gone looking for.
Moving to MSAL has some downsides. For instance, connecting to Teams using the AccountId parameter to pass a user principal name for an MFA-enabled account generates an error.
Connect-MicrosoftTeams -AccountId $O365Cred.UserName

Connect-MicrosoftTeams : One or more errors occurred.
At line:1 char:1
+ Connect-MicrosoftTeams -AccountId $O365Cred.UserName
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : AuthenticationError: (:) [Connect-MicrosoftTeams], AggregateException
    + FullyQualifiedErrorId : Connect-MicrosoftTeams,Microsoft.TeamsCmdlets.Powershell.Connect.ConnectMicrosoftTeams
Connect-MicrosoftTeams : Integrated Windows Auth is not supported for managed users. See https://aka.ms/msal-net-iwa for details.
Basic authentication works as expected but the only reliable way I have found to sign into Teams with an MFA-enabled account (and all administrative accounts should be MFA-enabled) is to run Connect-MicrosoftTeams without a parameter and choose the account name from the login dialog. This isn’t suitable for batch processing, so if you need to run batch jobs to access Teams with PowerShell, use the older 1.1.6 release, scripts with other cmdlets if possible (for example, Get-UnifiedGroup), or the Graph API.
Update: Microsoft has released a preview version (2.1.0) of the Teams PowerShell module which works properly with modern authentication. It’s likely that this version will be pushed through to general availability quite quickly.
Microsoft is aware of the problems with authentication. We’ll update this post when further news appears.
Microsoft added template policy management to the Teams admin center in late February 2021. The cmdlets to manage teams templates are in V2.0. However, their syntax is very much like Graph API commands rather than following normal PowerShell conventions. In addition, the cmdlets to create and update templates don’t accept PowerShell objects as input. Output is odd too, with JSON the favorite output format. The way the new cmdlets behave creates a learning barrier to surmount, which isn’t helped by some odd text in the documentation. For instance:
Within the universe of templates the admin’s tenant has access to, returns a template definition object (displayed as a JSON by default) for every custom and every Microsoft en-US template which names include ‘test’.
To begin, the Get-CsTeamTemplateList cmdlet returns the set of teams templates in the tenant:
Get-CsTeamTemplateList

OdataId
-------
/api/teamtemplates/v1.0/d84de63f-26bb-4c8e-a438-fa5b99ac5c5a/Tenant/en-US
/api/teamtemplates/v1.0/9f90e23d-a361-4caf-ba79-9886adc93c68/Tenant/en-US
/api/teamtemplates/v1.0/com.microsoft.teams.template.ManageAProject/Public/en-US
/api/teamtemplates/v1.0/com.microsoft.teams.template.ManageAnEvent/Public/en-US
/api/teamtemplates/v1.0/com.microsoft.teams.template.OnboardEmployees/Public/en-US
/api/teamtemplates/v1.0/com.microsoft.teams.template.AdoptOffice365/Public/en-US
/api/teamtemplates/v1.0/com.microsoft.teams.template.OrganizeHelpDesk/Public/en-US
/api/teamtemplates/v1.0/com.microsoft.teams.template.CoordinateIncidentResponse/Public/en-US
/api/teamtemplates/v1.0/com.microsoft.teams.template.CollaborateOnAGlobalCrisisOrEvent/Public/en-US
/api/teamtemplates/v1.0/retailStore/Public/en-US
/api/teamtemplates/v1.0/com.microsoft.teams.template.CollaborateWithinABankBranch/Public/en-US
/api/teamtemplates/v1.0/healthcareWard/Public/en-US
/api/teamtemplates/v1.0/healthcareHospital/Public/en-US
/api/teamtemplates/v1.0/com.microsoft.teams.template.QualitySafety/Public/en-US
/api/teamtemplates/v1.0/retailManagerCollaboration/Public/en-US
The first two templates listed are custom tenant templates. The others are the default set maintained by Microsoft. To make things easier to deal with, I extract the data for each template and put it into a PowerShell list:
$Templates = Get-CsTeamTemplateList
$Report = [System.Collections.Generic.List[Object]]::new()
ForEach ($Template in $Templates) {
   $ModifiedBy = $Null
   If ([string]::IsNullOrWhiteSpace($Template.ModifiedBy)) {
      $ModifiedBy = "Microsoft" }
   Else {
      $ModifiedBy = (Get-AzureADUser -ObjectId $Template.ModifiedBy).DisplayName }
   $ReportLine = [PSCustomObject]@{
      Name           = $Template.Name
      Apps           = $Template.AppCount
      Channels       = $Template.ChannelCount
      Description    = $Template.ShortDescription
      Modified       = $ModifiedBy
      "Last Updated" = Get-Date($Template.ModifiedOn) -format g
      Id             = $Template.ODataId
      Scope          = $Template.Scope
      Visibility     = $Template.Visibility }
   $Report.Add($ReportLine)
}
The extracted properties for a template object now look like:
Name         : R&A Events
Apps         : 5
Channels     : 6
Description  : Improve your event management and collaboration.
Modified     : Tony Redmond
Last Updated : 26/02/2021 14:33
Id           : /api/teamtemplates/v1.0/d84de63f-26bb-4c8e-a438-fa5b99ac5c5a/Tenant/en-US
Scope        : Tenant
Visibility   : Private
To retrieve full information about a template, use the Get-CsTeamTemplate cmdlet. This takes the Odata.Id as its identity. Taking the id stored in our list we can do this:
$TemplateData = Get-CsTeamTemplate -ODataId $Report[0].id
The result is a bunch of JSON formatted information about the apps, channels, and settings configured for the template. Here’s a snippet:
{ "templateId": "d84de63f-26bb-4c8e-a438-fa5b99ac5c5a", "displayName": "R\u0026A Events", "description": "Manage tasks, documents and collaborate on everything you need to deliver a compelling event. Invite guests users to have secure collaboration inside and outside of your company.", "visibility": "Private", "channels": [ { "id": "General", "displayName": "General", "description": "", "isFavoriteByDefault": true, "tabs": [ { "id": "General.tab0", "teamsAppId": "0d820ecd-def2-4297-adad-78056cde7c78", "name": "Team Information", "key": "General.tab0" }, { "id": "General.tab1", "teamsAppId": "com.microsoft.teamspace.tab.planner", "name": "Event Plan", "key": "General.tab1" }, { "id": "General.tab2", "teamsAppId": "0d820ecd-def2-4297-adad-78056cde7c78", "name": "Meeting Notes", "key": "General.tab2" } ] },
Unfortunately, if you try to convert the JSON, it fails due to an invalid primitive.
$TemplateData = Get-CsTeamTemplate -OdataId $Report[0].id | ConvertFrom-Json

ConvertFrom-Json : Invalid JSON primitive: Microsoft.Teams.ConfigAPI.Cmdlets.Generated.Models.TeamTemplate.
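The failure is easy to reproduce locally: ConvertFrom-Json chokes as soon as the pipeline contains anything that isn’t valid JSON, such as a .NET type name emitted by the cmdlet. The strings below are my own illustration, not output from the Teams module:

```powershell
# Valid JSON converts happily (\u0026 is the JSON escape for &)
$Template = '{"displayName": "R\u0026A Events", "channelCount": 6}' | ConvertFrom-Json
$Template.channelCount    # 6

# Feeding a bare .NET type name into the stream mimics the failure above
try {
    'Microsoft.Teams.ConfigAPI.Cmdlets.Generated.Models.TeamTemplate' |
        ConvertFrom-Json -ErrorAction Stop
    $Failed = $false
}
catch {
    $Failed = $true    # Windows PowerShell reports: Invalid JSON primitive
}
$Failed
```

The practical workaround is to work with the raw JSON text (for instance, redirecting the cmdlet output to a file as shown later) rather than piping the cmdlet straight into ConvertFrom-Json.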
The other cmdlets are:
If you’re used to PowerShell, the syntax of these cmdlets is not the norm. I think this is due to the designers’ familiarity with the Graph API. It makes the cmdlets harder to use, but it’s unlikely that you will use them often, as it is easier to maintain template policies through the Teams admin center.
As an example of where you might want to use these cmdlets, consider the situation of a multinational tenant who wants to create the same policy in different languages. To do this, you:
Extract the properties of the source template policy to a text file.
Get-CsTeamTemplate -OdataId "/api/teamtemplates/v1.0/d84de63f-26bb-4c8e-a438-fa5b99ac5c5a/Tenant/en-US" > Template.json
Update the text strings in the file with translated strings and save the file. Then use it as input to the New-CsTeamTemplate cmdlet. The output after the command is the response from Teams:
New-CsTeamTemplate -Locale fr-FR -Body (Get-Content 'template-fr.json' | Out-String)

{
  "id": "b8e908a8-af13-42e6-86b9-eb33b8874fa5",
  "name": "Événements d\u0027entreprise",
  "scope": "Tenant",
  "description": "Gérez les tâches, les documents et collaborez sur tout ce dont vous avez besoin pour organiser un événement convaincant.\r\nInvitez les utilisateurs invités à avoir une collaboration sécurisée à l\u0027intérieur et à l\u0027extérieur de votre entreprise.",
  "shortDescription": "Améliorez la gestion et la collaboration de vos événements.",
  "iconUri": "https://statics.teams.cdn.office.net/evergreen-assets/teamtemplates/icons/default_tenant.svg",
  "channelCount": 6,
  "appCount": 5,
  "modifiedOn": "2021-03-10T08:56:55.1668488Z",
  "modifiedBy": "53f08764-07d4-418c-8403-a737a8fac7b3",
  "locale": "fr-FR",
  "@odata.id": "/api/teamtemplates/v1.0/b8e908a8-af13-42e6-86b9-eb33b8874fa5/Tenant/fr-FR"
}
Voila! We now have a French version of the template policy. This won’t be something you do often, but when you need to do it, the cmdlets might offer an alternative to creating the templates in the Teams admin center.
Normally we like including news of updated PowerShell modules in the Office 365 for IT Pros eBook. I’m not so sure about this update, but even so we will adjust our text.
Updated: 19 January 2023 – See this article for a new version of the script based on the Microsoft Graph PowerShell SDK.
Hot on the heels of the discussion about how to create a printable report listing the membership of a Microsoft 365 group (or team), the question is: “How can I create a report listing the members of all groups in my tenant?” Given the widespread use of Teams, the request is often to report teams membership.
It’s a good question, and it’s one that is answered elsewhere, such as Steve Goodman’s take on the topic. However, all the approaches I have seen to date have attacked the problem as follows:
Apart from its slowness, there’s nothing wrong with this approach. The Get-UnifiedGroup cmdlet is a “fat” cmdlet. It fetches a lot of information to deliver the set of properties for each group. And the Get-UnifiedGroupLinks cmdlet is also pretty heavy. Put the two together, and things will be slow. This is fine if you have only a couple of hundred groups to process. It’s not so good when you have thousands.
I decided to take a different tack. Instead of processing one group at a time, the script should process users. Basically:
The script can be downloaded from GitHub. In testing, it took around a half-second per account (Figure 2), which isn’t too bad considering the amount of processing done.
Groups with no members are ignored by the script. These groups might have owners, but the lack of members means that they are not picked up when checking group membership on a per-user basis.
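One way to implement the per-user approach is to fetch the mailboxes once and then use a server-side filter to find the groups each user belongs to. This is a minimal sketch of the idea, not the exact code in the GitHub script; the filter syntax shown is one possible implementation:

```powershell
# Sketch: report group membership by iterating users rather than groups
# Assumes a connected Exchange Online Management session
[array]$Users = Get-User -RecipientTypeDetails UserMailbox -ResultSize Unlimited
$Report = [System.Collections.Generic.List[Object]]::new()
ForEach ($User in $Users) {
    # Server-side filter: find groups whose membership includes this user
    [array]$Groups = Get-Recipient -RecipientTypeDetails GroupMailbox -ResultSize Unlimited `
        -Filter "Members -eq '$($User.DistinguishedName)'"
    ForEach ($Group in $Groups) {
        $Report.Add([PSCustomObject]@{User = $User.DisplayName; Group = $Group.DisplayName}) }
}
$Report | Export-Csv -NoTypeInformation c:\temp\GroupMembership.csv
```

The win here is that Get-Recipient is a much lighter cmdlet than Get-UnifiedGroup, and the filtering happens on the server rather than in the script.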
A script powered by the Graph API will deliver faster results in places like fetching a list of team-enabled groups (using the list groups API). You can also use the Get-MgGroup cmdlet from the Microsoft Graph PowerShell SDK to return the list of team-enabled groups. The set of groups a user belongs to is found using the list user transitive member of API. For example, a call like https://graph.microsoft.com/v1.0/users/{GUID}/transitiveMemberOf returns the set of groups that the account with the object identifier (GUID) is a member of. Using the Microsoft Graph PowerShell SDK, the code to return the set of groups a user belongs to would be something like this:
$User = Get-MgUser -UserId James.Ryan@office365itpros.com
$Uri = "https://graph.microsoft.com/v1.0/users/" + $User.Id + "/transitiveMemberOf"
[array]$UserGroups = Invoke-MgGraphRequest -Uri $Uri -Method Get
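One point worth knowing: transitiveMemberOf returns directory roles as well as groups, so the response usually needs filtering on the type of each returned object. A sketch (the displayName access assumes the default property set returned by the API):

```powershell
# Filter the transitiveMemberOf response down to groups only
$Uri = "https://graph.microsoft.com/v1.0/users/" + $User.Id + "/transitiveMemberOf"
$Response = Invoke-MgGraphRequest -Uri $Uri -Method Get
[array]$Groups = $Response.Value | Where-Object {$_.'@odata.type' -eq "#microsoft.graph.group"}
$Groups | ForEach-Object { $_.displayName }
```

Remember that Graph responses are paged, so an account belonging to more than a page of groups needs the usual @odata.nextLink handling.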
I also wrote a Graph version of the script, which you can find on GitHub. Remember that you must register an app in Azure AD, assign the app the necessary permissions, and create an app secret (or other credentials) before you can use this version. On the upside, the Graph version is faster and scales better for large tenants.
Learn much more about interacting with Microsoft 365 Groups and Teams through PowerShell by subscribing to the Office 365 for IT Pros eBook. Monthly updates keep you up to date with what’s happening across the Microsoft 365 ecosystem.
If you enable support for sensitivity labels in SharePoint Online and OneDrive for Business (and you should), most of the previous frustrations that organizations have experienced in dealing with protected content go away. Protected (encrypted) content can be indexed and found by eDiscovery, co-authoring is supported (with Office Online), and so on. Very importantly, Office 365 captures audit events when people apply, remove, or change sensitivity labels in Office documents.
Originally, only sensitivity label actions performed by the Office Online apps were captured. This is fine, but most user interactions with Office documents occur through the desktop apps. The gap in coverage is closing and the latest versions of the Microsoft 365 apps for enterprise (aka Office click to run) now create audit records when they apply or remove labels from documents. I’m using version 2012 – current channel preview (build 13350.20316) as the basis for this article, but I can see that audit records have been generated since mid-December.
Although the latter part of December is a period of low work activity, the number of events captured since then, compared against previous months, supports the view that the desktop apps are used more heavily to generate documents, spreadsheets, and presentations. At least, in my tenant.
Nice as it is to have the additional insight into the use of sensitivity labels, it’s regrettable that Microsoft did not use the same operation names when generating audit records for the desktop apps as they do for the online apps. The operation is the name of an auditable action.
It’s possible that the logic here is that the actions originate in two different sources and the different operations mean that administrators can conduct precise audit searches to find records for either the desktop or online apps – or both.
The new operations are:
As in many cases with Office 365 audit log records, the new events need to be parsed before they’re useful. This is reasonably easy to do with PowerShell, albeit with the need to examine and interpret the payload content of each type of event.
Seeing is believing and it’s always easier to understand how things work when you have a practical example. I’ve written a script to grab all the events for sensitivity labels for the last three months and create a report. Each of the event types is unpacked and interpreted to make it clear what the event means. The output is a CSV file which can be analyzed in whatever way you wish. Or you can examine the output on-screen through the Out-GridView cmdlet (Figure 1).
The script is available in GitHub. You’ll need to connect to the Exchange Online management module and the security and compliance endpoint to run the cmdlets in the script. The compliance endpoint is used to fetch the list of sensitivity labels defined in the organization and create a hash table of GUIDs/identifiers (the keys) and label names (values). Some audit events contain label names but it’s more typical to only find a label identifier recorded, so lookups against the hash table translate identifiers into label names.
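The hash table of label identifiers and names can be built along these lines. This is a sketch: it assumes a Connect-IPPSSession connection, and the ImmutableId/DisplayName property names and the SensitivityLabelId audit field are as I see them in my tenant; check the payloads in your own records.

```powershell
# Build a hash table mapping sensitivity label identifiers to display names
[array]$Labels = Get-Label
$LabelHash = @{}
ForEach ($Label in $Labels) {
    $LabelHash.Add([string]$Label.ImmutableId, $Label.DisplayName) }

# Later, when parsing an audit record, translate an identifier into a name:
# $LabelName = $LabelHash[$AuditData.SensitivityLabelId]
```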
As you can see from the output, in my tenant most audit records are recorded when an Office desktop app opens a protected file:
Job complete. 370 Sensitivity Label audit records found for the last 90 days

Labels applied to SharePoint sites: 51
Labels applied to new documents:    45
Labels updated on documents:        5
Labeled files renamed:              29
Labeled files opened (desktop):     200
Labels removed from documents:      40
Mismatches detected:                0
----------------------
Report file written to C:\temp\SensitivityLabelsAuditRecords.csv
In this case, no mismatches are noted between the label applied to a site (container management) and those assigned to documents stored in the site. My users might just be learning how to label documents properly!
We write tons of PowerShell scripts to check out how Office 365 really works and understand where any fault lines might be. Our GitHub repository is available to all. Even better, we explain how to use our scripts and other PowerShell commands to manage Office 365 in the Office 365 for IT Pros eBook.
Office 365 Notification MC234048 published on January 12 announces that Microsoft is improving the security of the “Microsoft Teams connector apps webhook.” That mouthful refers to the incoming webhook connector which can be attached to Microsoft 365 Groups or Teams channels to allow the posting of cards generated from information in network sources.
The webhook is a unique URL which applications use to address the channel where they wish to publish information. The change being made adds the tenant name to the webhook URL to make it more apparent than before. According to Microsoft, security is improved through the presence of the tenant name because organizations can then use the information to filter traffic logs. Quite how this improves security is beyond me because the tenant identifier (a GUID) has always been present in the webhook URL.
The changeover to the new format started on January 11, 2021. Old format webhooks will continue working for three months. The webhooks must be updated to the new format by April 11, 2021 to ensure that information continues to flow.
The new format webhook looks like this:
https://office365itpros.webhook.office.com/webhookb2/7aa49aa6-7840-443d-806c-08ebe8f59966@c662313f-14fc-43a2-9a7a-d2e27f4f3478/IncomingWebhook/8592f62b50cf41b9b93ba0c0a00a0b88/eff4cd58-1bb8-4899-94de-795f656b4a18
The component parts of the webhook are:
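A few lines of PowerShell are enough to break a webhook URL into its parts. This sketch uses the example URL above; reading the first path segment as group identifier @ tenant identifier is my interpretation of the format, not a documented fact:

```powershell
# Break a new-format incoming webhook URL into its component parts
$Webhook = "https://office365itpros.webhook.office.com/webhookb2/7aa49aa6-7840-443d-806c-08ebe8f59966@c662313f-14fc-43a2-9a7a-d2e27f4f3478/IncomingWebhook/8592f62b50cf41b9b93ba0c0a00a0b88/eff4cd58-1bb8-4899-94de-795f656b4a18"
$Uri = [System.Uri]$Webhook
$TenantName = $Uri.Host.Split(".")[0]               # tenant name from the host
$Segments = $Uri.AbsolutePath.Trim("/").Split("/")  # webhookb2 / ids / IncomingWebhook / ...
$GroupId  = $Segments[1].Split("@")[0]              # identifier before the @
$TenantId = $Segments[1].Split("@")[1]              # tenant identifier (GUID) after the @
Write-Host "Tenant: $TenantName ($TenantId)"
```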
The same connector is used to bring data in from external sources to both Teams (data is posted in a channel) and Microsoft 365 Groups (data is posted as topics in email conversations).
Updating a connector to use the new format webhook isn’t hard. The trick is to know where the connectors are currently configured. When you know that, a group owner can access the set of connectors and check the incoming webhook connector. If “Attention required” appears for the connector, the webhook must be updated (Figure 1). If not, the connector is using the new format.
Clicking Manage brings you to the connector settings (Figure 2). Click Update URL to generate a new format webhook URL and then Save to update the connector. Before you save, make sure that you copy the webhook URL as you’ll need this to update the applications which send data to Microsoft 365 Groups or Teams through the connector.
The applications might be as simple as a PowerShell script (here’s an example of posting Microsoft 365 roadmap items to Teams and another to post notifications about inactive mailboxes). In other cases, the webhook URL might be used to post information coming from an application.
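Posting through the webhook from PowerShell is a one-liner once you have the URL. A minimal sketch (the webhook shown is the example URL from this post, and the message text is invented; a simple text payload produces a basic card):

```powershell
# Post a simple card to a channel through the incoming webhook connector
$Webhook = "https://office365itpros.webhook.office.com/webhookb2/7aa49aa6-7840-443d-806c-08ebe8f59966@c662313f-14fc-43a2-9a7a-d2e27f4f3478/IncomingWebhook/8592f62b50cf41b9b93ba0c0a00a0b88/eff4cd58-1bb8-4899-94de-795f656b4a18"
$Payload = @{ text = "Example notification posted from PowerShell" } | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri $Webhook -Body $Payload -ContentType 'application/json'
```

Any script which posts this way needs its $Webhook value updated to the new format before the old URLs stop working.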
If you have only a few teams configured with the incoming webhook it won’t be hard to find and update the URL. Things are a little more complex in large organizations where many webhooks might be in use for both Teams and Microsoft 365 Groups. In these circumstances, some help might be needed to find all the connectors.
Some time ago, I wrote a PowerShell script to show how to report the channels and tabs connected to Teams. The point of that post was to demonstrate the retrieval of large amounts of data from Teams. As each team has standard apps (like Calls), can have up to 200 channels, and each channel can have multiple tabs (including standard tabs), the script collects a bunch of information. In this instance, we can use it to find which teams are configured with the incoming webhook connector.
After running the script, we can query the output report to find instances of teams with the connector:
$Report | ?{$_.App -eq "incoming webhook"} | Format-Table Team

Team
----
Technology News and Views
Engineering
Colleagues
Human Resources Group
PL Test
Industry News
The report doesn’t tell us for which channel the connector is configured, but that should be easily found.
In passing, if you want to get an insight into the number of standard apps installed by Teams, run this command to group the results in the report:
$Report | Group App | Sort Count -Descending | Format-Table Name, Count
As they say, the output is “interesting”!
Interpreting the real meaning of a Microsoft announcement takes experience and background knowledge. Learn from the best by subscribing to the Office 365 for IT Pros eBook. No bumpf, just knowledge.
Microsoft is previewing the ability to create an Azure AD Access Review to cover guest access to every group (and team) in a tenant. The idea is that group owners are asked to approve or deny the access granted to guest users to their groups. With the caveat that all previews come with rough edges, the review works well enough for organizations to assess if the feature is valuable for them.
A challenge facing every GUI is how to achieve the balance of usability for both large and small organizations. Getting an oversight of an access review for guests in 27 teams in a small tenant makes certain demands on the GUI to make the data comprehensible and enable administrators to figure out where the overall review is at. Doing the same for a large tenant where reviews might be ongoing for thousands of teams poses a different test. However, the Identity Governance section in the Azure AD admin center has just one interface to manage access reviews (Figure 1).
When an access review is viewed through the Azure AD admin center, you see 10 groups at a time (it’s a preview), but even if the admin center showed a hundred groups, paging through large numbers of groups to find what’s happening in an individual review can be painful.
Which brings us to the Graph API for Azure AD Access Reviews, the basis for DIY management of access reviews. To test how the API worked, I wrote a PowerShell script to find the review for all groups and create a report of the review decisions made to date.
The steps taken in the script are:
Figure 2 shows the output at the end of the script.
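As an illustration of the kind of call involved, fetching the set of review definitions looks something like this. This is a sketch: at the time this feature was in preview the API lived in the beta endpoint, and the displayName/status properties shown are assumptions about the returned payload.

```powershell
# Fetch access review definitions via the Graph (sketch; beta endpoint)
$Uri = "https://graph.microsoft.com/beta/identityGovernance/accessReviews/definitions"
[array]$Reviews = (Invoke-MgGraphRequest -Uri $Uri -Method Get).Value
$Reviews | ForEach-Object { $_.displayName + " (" + $_.status + ")" }
```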
The script generates a CSV file to allow the decision data to be analyzed in whatever way you wish. Piping the data to the Out-GridView cmdlet is a good way to get a quick overview of the current state of reviews across all groups (Figure 3).
The sample script can be downloaded from GitHub. It doesn’t exercise all the functionality available in the API. For example, to accelerate the process of completing the review, you could look for outstanding reviews of guests in groups and call the acceptRecommendations API to accept the automatic recommendations as made by Azure AD. However, as I explain here, accepting automatic recommendations is not always the wisest thing to do, especially when Azure AD makes decisions based on limited data.
You’ll find full details about Azure B2B collaboration (the basis of guest access to Teams and Groups) plus a ton of insight about how guest access works in the Office 365 for IT Pros eBook. And because we keep the book updated, new developments like the Azure AD Access Review for all guests in a tenant are mentioned there too.
Delegate access to a mailbox is a popular feature supported by Outlook desktop, OWA, and Outlook Mobile. In some cases, you only want to allow access to a specific folder rather than the complete mailbox. Calendar access is often granted to delegates to allow other people to deal with someone’s schedule. It’s easy for users to assign delegate access to their calendar. For instance, in OWA, go to the calendar, click the […] beside the calendar you want to share, select Sharing and permissions, and then add the new delegate. In Figure 1, we’ve elected to give the delegate the ability to view private calendar events too.
Once applied, the delegate will be able to open the delegator’s calendar and Exchange will send calendar invitations and responses to the delegate for their attention.
Delegate access usually works without a hitch, but when things go wrong administrators will probably need to resort to PowerShell to understand what’s happening. The first thing is to establish what kind of access someone has to a problematic calendar. The Get-MailboxFolderPermission cmdlet shows the permissions set on a folder. In this case, we pass the user principal name of the account we want to check and “:\Calendar” to indicate the folder name.
Get-MailboxFolderPermission -Identity Jane.Sixsmith@office365itpros.com:\Calendar

FolderName User       AccessRights       SharingPermissionFlags
---------- ----       ------------       ----------------------
Calendar   Default    {AvailabilityOnly}
Calendar   Anonymous  {None}
Calendar   Ken Bowers {Editor}           Delegate, CanViewPrivateItems
According to Microsoft, the most common error met with delegate access happens when a user cannot add a new delegate or remove an existing delegate from their mailbox. The root cause is usually a corrupted hidden item in the mailbox which stores the delegate information. Microsoft publishes a comprehensive support article outlining the steps to take to recreate the hidden item. The steps work, but assume that:
Because of the multi-step recipe to fix the problem and the need to use an unfamiliar program, some people never manage to get to the end and resolve the issue. This is a classic example of where software can help.
Microsoft has released a new switch parameter for the Remove-MailboxFolderPermission cmdlet called ResetDelegateUserCollection. When you run the cmdlet with the parameter, Exchange Online essentially does all the work outlined in the support article to replace the potentially corrupted mailbox items. For example:
Remove-MailboxFolderPermission -Identity Jane.Sixsmith@office365itpros.com:\Calendar -ResetDelegateUserCollection

Confirm
Are you sure you want to perform this action?
Using ResetDelegateUserCollection changes existing calendar Delegate permissions. You will need to re-assign the Delegate flag to these recipients using Set-MailboxFolderPermission -SharingPermissionFlags Delegate. It is suggested that this ResetDelegateUserCollection option is only used when you believe there is corruption that is preventing managing calendar permissions.
[Y] Yes  [A] Yes to All  [N] No  [L] No to All  [?] Help (default is "Y"): Y
WARNING: Resetting DelegateUserCollection...
WARNING: DelegateUserCollection is reset.
Note the warning. If we run Get-MailboxFolderPermission again, we’ll see that the sharing permission flags which make someone into a delegate are gone.
Get-MailboxFolderPermission -Identity Jane.Sixsmith@office365itpros.com:\Calendar

FolderName User       AccessRights       SharingPermissionFlags
---------- ----       ------------       ----------------------
Calendar   Default    {AvailabilityOnly}
Calendar   Anonymous  {None}
Calendar   Ken Bowers {Editor}
To complete the fix, we need to add delegate permissions again. You could ask the user to do this by updating the permissions assigned to their calendar, but it’s easier and more polite for the administrator who’s just reset the delegate information to do the job for the user by running the Set-MailboxFolderPermission cmdlet. If you don’t reset the permissions, delegates will have editor permission for the calendar folder, but they won’t be able to process calendar invitations on behalf of the mailbox owner. Here’s how to reset the permissions for Ken Bowers:
Set-MailboxFolderPermission -Identity Jane.Sixsmith@office365itpros.com:\Calendar -User Ken.Bowers@office365itpros.com -SharingPermissionFlags Delegate, CanViewPrivateItems -AccessRights Editor
After the cmdlet completes, you can run Get-MailboxFolderPermission again to verify that the delegate sharing permission flag is present once again (and optionally the flag allowing the delegate to view private items too).
Of course, it’s fine if you’d prefer to follow the MFCMAPI recipe to fix the delegate issue, but it’s a lot easier and faster to run a couple of lines of PowerShell!
The upgraded version of Remove-MailboxFolderPermission is rolling out now. If your RBAC configuration is higher than 15.20.3722, the cmdlet should be available in your tenant. To check, run the Get-OrganizationConfig cmdlet to check the value of RBACConfigurationVersion:
Get-OrganizationConfig | Select RBACConfigurationVersion

RBACConfigurationVersion
------------------------
0.1 (15.20.3763.11)
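To test this programmatically, extract the build number from the returned string and compare it as a version. A sketch using the value shown above; in practice you would read the live value with Get-OrganizationConfig:

```powershell
# Parse the build out of the RBACConfigurationVersion string, e.g. "0.1 (15.20.3763.11)",
# and compare it against the build where the updated cmdlet appeared
$VersionString = "0.1 (15.20.3763.11)"   # in practice: (Get-OrganizationConfig).RBACConfigurationVersion
$Build = [version]($VersionString -replace '^.*\((.+)\)$', '$1')
If ($Build -ge [version]"15.20.3722") {
    Write-Host "ResetDelegateUserCollection should be available" }
```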
This is just the kind of detailed how-to information we love reading about. It might only end up as a line or two in the Office 365 for IT Pros eBook, but that’s no reason not to share the knowledge with you.
Bing publishes a daily photo which it uses as the daily background for its home page. The daily photo varies from country to country and is usually a high-quality visually attractive picture. Some of the content presented by Bing is market-dependent too. For example, Figure 1 shows the Bing page for Ireland on December 8, 2020 with a scrolling set of local news at the bottom.
Figure 2 shows the same page as presented in the French market. The same background image is used along with some details from my Office 365 tenant presented above the local news. This happens when a tenant integrates Microsoft Search results in Bing.
Clicking through to the search page makes other information from the tenant accessible. Figure 3 shows how a search for the term “Microsoft Search” locates a bunch of Teams conversations. If the term were found in a Yammer community, it would turn up here too. Results from matching files and sites are also located in the same way. The only major source of data which is not searched is email.
Office 365 content is presented in the Bing home page for other markets like the U.S., Canada, U.K., Germany, and Australia, but not in Switzerland, Belgium, Netherlands, and New Zealand. You can test your own country by including a country code qualifier with the Bing URL. For example, https://www.bing.com/?cc=nz is for New Zealand while https://www.bing.com/?cc=ca is for Canada. Of course, the inconsistency might be due to a rolling update across Bing to treat all countries in the same way.
Coming back to Bing’s daily images, some PowerShell code is enough to download the images and make them available for use as backgrounds for Teams meetings. In this case, the code downloads the images for the Irish market (en-IE). Change this to the market appropriate to your needs (for example, fr-FR for France, en-GB for the UK, en-US for the U.S., and so on).
# Code to download background images files from Bing
$TeamsBackgroundFiles = $env:APPDATA + "\Microsoft\Teams\Backgrounds\Uploads\"
$Market = "en-IE"
# Check that the Teams background effects folder exists. If not, create it
If (-not (Test-Path -LiteralPath $TeamsBackgroundFiles)) {
   Try {
       New-Item -Path $TeamsBackgroundFiles -ItemType Directory -ErrorAction Stop | Out-Null }
   Catch {
       Write-Error -Message "Unable to create directory '$TeamsBackgroundFiles'. Error was: $_" -ErrorAction Stop }
   Write-Host "Folder for Teams background effect files created: '$TeamsBackgroundFiles'" }
Else {
   Write-Host "Teams background effects folder exists" }
# Download the last seven days of Bing images
CLS
For ($i=0; $i -le 7; $i++) {
   $BingUri = "https://www.bing.com/HPImageArchive.aspx?format=js&idx=$i&n=1&mkt=$Market"
   $BingResponse = Invoke-WebRequest -Method Get -Uri $BingUri
   $BingContent = ConvertFrom-Json -InputObject $BingResponse.Content
   # Unpack content
   $BingBackgroundFile = "https://www.bing.com/" + $BingContent.Images.Url
   $BingFileName = $BingContent.Images.UrlBase.Split(".")[1]
   $BingFileName = $BingFileName.Split("_")[0] + ".jpg"
   $TeamsBackgroundFile = $TeamsBackgroundFiles + $BingFileName
   If (([System.IO.File]::Exists($TeamsBackgroundFile) -eq $False)) { # File isn't there, so we can download
      Try {
          Invoke-WebRequest -Method Get -Uri $BingBackgroundFile -OutFile $TeamsBackgroundFile
          Write-Host "Downloaded" $TeamsBackgroundFile }
      Catch {
          Write-Host "Error occurred when downloading from Bing" }
   } #End If
} #End loop
A longer version of script can be downloaded from GitHub. This version cleans up background files downloaded from Bing after 30 days to keep the number of files to a reasonable number.
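The cleanup logic amounts to something like this sketch. Note that as written it removes any .jpg in the uploads folder older than 30 days, including custom backgrounds you added yourself; the GitHub version is the one to use in practice:

```powershell
# Sketch: remove downloaded background images older than 30 days
$TeamsBackgroundFiles = $env:APPDATA + "\Microsoft\Teams\Backgrounds\Uploads\"
Get-ChildItem -Path $TeamsBackgroundFiles -Filter *.jpg |
    Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-30)} |
    Remove-Item -ErrorAction SilentlyContinue
```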
Once the Bing images are downloaded, they can be used like any other of the Microsoft-provided or custom background images. Figure 4 shows the Bing image featured above being chosen as a custom background for a Teams meeting. As shown, the image is reversed, but it will appear as normal when projected to other meeting participants.
To ensure that users download images from Bing regularly, you can create a scheduled job to run the script when a PC starts. The exact details of how to do this will differ from organization to organization.
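On a Windows PC, one way to do this is with the ScheduledTasks module. A sketch, assuming the download script has been saved locally (the script path and task name are placeholders):

```powershell
# Sketch: register a scheduled task to run the download script at logon
$Action  = New-ScheduledTaskAction -Execute "powershell.exe" `
   -Argument '-NoProfile -File "C:\Scripts\DownloadBingBackgrounds.ps1"'
$Trigger = New-ScheduledTaskTrigger -AtLogOn
Register-ScheduledTask -TaskName "Download Bing Backgrounds" -Action $Action -Trigger $Trigger
```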
The information presented here is based on the content of this Codeproject.com post. An earlier version of this content and script was included in the post covering using background images with Teams meetings. We decided to move the text to a separate post to present more context and make some improvements in the GitHub script.
The Office 365 for IT Pros eBook has lots of information about things like Microsoft Search and Teams meetings. Details like those covered here are interesting, but barely get a mention in the book. We have lots of other things to discuss which are more important for tenant management.
As has been well advertised, Microsoft will retire Skype for Business Online on July 31, 2021. Organizations should now be well on the way to deploying and using Teams to replace Skype for Business Online. Or, if they’re really brave, moving to a different communications platform.
The Skype for Business Connector includes the New-CsOnlineSession cmdlet, used to establish a new remote connection to the Skype for Business Online endpoint. Once a connection is made, the cmdlets used for policy management can be imported into a session and used. Documented in Office 365 notification MC230065 on 15 December, Microsoft will retire the Skype for Business Online connector on 15 February 2021.
Teams shares a common policy framework with Skype for Business Online, which means that to interact with Teams management policies through PowerShell, many scripts use the Skype for Business Online Connector.
Before the Connector is retired, organizations need to check any PowerShell scripts used to manage Teams management policies, such as:
The Teams PowerShell module replaces the connector. Launched in September, version 1.1.6 and future versions contain New-CsOnlineSession and other cmdlets previously found in the connector:
Get-Command -Module MicrosoftTeams -Name *-Cs*

CommandType Name                                        Version Source
----------- ----                                        ------- ------
Function    Get-CsBatchPolicyAssignmentOperation        1.1.6   MicrosoftTeams
Function    Get-CsGroupPolicyAssignment                 1.1.6   MicrosoftTeams
Function    Get-CsUserPolicyAssignment                  1.1.6   MicrosoftTeams
Function    New-CsBatchPolicyAssignmentOperation        1.1.6   MicrosoftTeams
Function    New-CsGroupPolicyAssignment                 1.1.6   MicrosoftTeams
Function    Remove-CsGroupPolicyAssignment              1.1.6   MicrosoftTeams
Cmdlet      Get-CsOnlinePowerShellEndpoint              1.1.6   MicrosoftTeams
Cmdlet      Get-CsPolicyPackage                         1.1.6   MicrosoftTeams
Cmdlet      Get-CsUserPolicyPackage                     1.1.6   MicrosoftTeams
Cmdlet      Get-CsUserPolicyPackageRecommendation       1.1.6   MicrosoftTeams
Cmdlet      Grant-CsUserPolicyPackage                   1.1.6   MicrosoftTeams
Cmdlet      New-CsBatchPolicyPackageAssignmentOperation 1.1.6   MicrosoftTeams
Cmdlet      New-CsOnlineSession                         1.1.6   MicrosoftTeams
To use the Teams policy management cmdlets, you don’t need to connect to Teams before connecting to the Skype for Business Online endpoint; all that’s necessary is to install the latest version of the Teams PowerShell module and run these commands (Figure 1):
$TeamsPolicySession = New-CsOnlineSession -Credential $O365Cred
Import-PsSession $TeamsPolicySession -AllowClobber
Update March 6, 2021: Microsoft has updated the Teams PowerShell module to V2.0. In general, it’s best to use the latest version of a module but test it first! This version doesn’t require using New-CsOnlineSession to connect to the management end point.
Once a remote connection is made and the cmdlets are imported into a session, you can use policy management cmdlets like Get-CsTeamsMeetingPolicy and Get-CsTeamsMessagingPolicy. Updating scripts should be a matter of making sure that the cmdlets are loaded from the Teams module and removing any reference to the Skype for Business Online connector.
The Teams PowerShell module doesn’t currently include the Enable-CsOnlineSessionForReconnection cmdlet, which enables a session connected to the Skype for Business endpoint to reconnect if it times out. A timeout can happen at random intervals. In most cases this isn’t a problem because interaction with Teams policies and the other cmdlets accessed through this endpoint is often brief. If you work with the policy and management cmdlets for an extended period, consider using this script to mimic the functionality of the Enable-CsOnlineSessionForReconnection cmdlet. Run it in your session after importing the Skype for Business Online cmdlets.
It’s easy to miss out on small but important details like the retirement of a connector. Other books don’t cover stuff like this because they are written once and then published. The Office 365 for IT Pros eBook is refreshed with changes and republished monthly to our subscribers.
Every Microsoft 365 tenant uses Azure Active Directory to store information about the tenant configuration, accounts, and groups. Maintaining accurate Entra ID user account properties is important. Whether data comes from an external source like an HR feed or is maintained manually, people depend on directory information to find others, or even to understand how the organization works. If the data in your directory is inaccurate, some features won’t work properly or at all. For example:
It’s always been important to maintain an accurate directory. Perhaps it was less so in the on-premises world where fewer application features are built with an expectation that directory data is accurate, but it’s obvious that Microsoft 365 just works better with a solid directory.
You can invest in a product like Hyperfish to help analyze and maintain your Entra ID data, but before you rush into acquiring a sticking plaster to cure your directory woes, it’s a good idea to set down some threshold for directory quality. For example, you could say that your baseline measurement for a healthy directory is that all the properties displayed on the people card should be fully populated for every user account. Separate guidelines might be defined for guest accounts and groups.
Figure 2 shows a customized people card. Being able to customize the people card using Microsoft Graph commands allows tenants to expose the information they consider essential in the card, and it’s important to consider customization when setting your threshold.
Setting an aspirational goal is nice; achieving that goal is even better. We need to understand how healthy our directory is in terms of missing properties that show up in the people card. Fortunately, it’s easy to create a PowerShell script to:
I’ve written a quick and dirty script which you can download from GitHub. It uses the Get-User cmdlet from the Exchange Online Management module to fetch account information. The Get-MgUser cmdlet from the Microsoft Graph PowerShell SDK could also be used, but it’s easier to filter out mailbox-enabled accounts with Get-User, which exposes the Entra ID user properties we want to check. Remember that you’ll need to modify the script to suit the circumstances in your organization. For instance, if you place particular importance on a specific property, you might want to amend the script to include that property in the checks.
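The core of the check can be sketched in a few lines. The property set below is illustrative (City, Department, Office, Phone, and Title are standard Get-User properties) and should be amended to match whatever threshold you set for your own directory:

```powershell
# Sketch: check a set of people-card properties for empty values
[array]$Users = Get-User -RecipientTypeDetails UserMailbox -ResultSize Unlimited
$CheckProperties = "City", "Department", "Office", "Phone", "Title"
$Problems = [System.Collections.Generic.List[Object]]::new()
ForEach ($User in $Users) {
   ForEach ($Property in $CheckProperties) {
      If ([string]::IsNullOrWhiteSpace($User.$Property)) {
         $Problems.Add([PSCustomObject]@{User = $User.DisplayName; Missing = $Property}) }
   }
}
$Problems | Export-Csv -NoTypeInformation c:\temp\MissingProperties.csv
```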
Figure 3 shows how the script reports the problems it finds with missing properties in user accounts. The results shown here are from a small test tenant so it’s unsurprising to discover that so many accounts have missing properties. It’s reasonable to expect better results in a production tenant.
To make it easy for administrators to track down and fix missing properties, a CSV file is also generated with details of the accounts which need adjustment (Figure 4).
Maintaining the accuracy of Entra ID user data can be a boring task. It’s much more interesting to read the Office 365 for IT Pros eBook and learn about changes in Office 365 through the updates we release every month.
Last Updated: January 9, 2023
Released on January 9, 2023, this version removes the dependency for basic authentication through the WinRM component. V3.1 completes the transition for Exchange Online Management cmdlets to use the REST API instead of Remote PowerShell and lays the foundation for the removal of remote PowerShell connections to Exchange Online (now due in September 2023).
On September 20, 2022, Microsoft released V3.0 of the Exchange Online Management module. The updated module is available in the PowerShell gallery. The highlights of the release include:
See this page for more information.
On May 11, Microsoft released V2.0.5 of the Exchange Online Management PowerShell module to general availability. This is an update of what’s sometimes called Exchange Online PowerShell V2 (introduced at Ignite 2019). It is recommended that you update to the latest version at your earliest convenience.
The case for using the Exchange Online Management module instead of the older remote PowerShell cmdlets has been made many times. By now it should be a no-brainer, especially with Microsoft’s avowed intention to remove basic authentication for PowerShell as soon as possible and the consequent need to upgrade interactive PowerShell sessions and background scripts to use modern authentication. Here are the highlights of recent releases.
V2.0.5 contains the cmdlets needed to manage the Ownerless Group policy (Get/Set-OwnerlessGroupPolicy) and features in the Viva Insights app for Teams (Get/Set-VivaInsightsSettings).
As announced at Ignite 2020, this is the first version of the Exchange Online Management module to support Linux and MacOS. For Linux, you need to run Ubuntu version 18.04 or above. For MacOS, it’s Mojave (10.14), Catalina (10.15), and Big Sur (11) and above.
In PowerShell 7, the 2.0.4 module supports browser-based single sign on. See this page for more information.
Real-time policy evaluation (Continuous Access Evaluation or CAE) is supported.
The cmdlets used to update user preferences for MyAnalytics have been renamed to make their use more obvious.
Get-UserAnalyticsConfig is now Get-MyAnalyticsFeatureConfig.
Set-UserAnalyticsConfig is now Set-MyAnalyticsFeatureConfig.
The Get-ExoMailboxStatistics cmdlet supports two new properties: LastUserActionTime and LastInteractionTime.
The Exchange Online Management module comes with full support for modern authentication, multi-factor authentication, and now (in this version), certificate-based authentication (CBA) to allow scripts to run unattended as background jobs. Certificates can be stored in the certificate store of the local machine or current user. You can also use the CertificateFilePath parameter for the Connect-ExchangeOnline cmdlet to specify the file path to a .pfx file for a certificate. For more information, see this page.
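As a sketch of what a background job connection with a certificate file might look like (the app id, file path, and tenant name are placeholder values, not ones from a real tenant):

```powershell
# Connect to Exchange Online using a certificate stored in a .pfx file
# (the app id, file path, and tenant name below are placeholders)
$CertPassword = Read-Host "Enter the .pfx file password" -AsSecureString
Connect-ExchangeOnline -CertificateFilePath "C:\Certs\ExoBackground.pfx" `
  -CertificatePassword $CertPassword `
  -AppId "00000000-0000-0000-0000-000000000000" `
  -Organization "tenant.onmicrosoft.com" -ShowBanner:$false
```

For unattended jobs, storing the certificate in the certificate store and passing its thumbprint avoids the need to handle a password at all.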
Following previous releases of the module, I complained bitterly that running the Connect-IPPSSession cmdlet to connect to the Security and Compliance endpoint removed the session connected to Exchange Online. In other words, you couldn’t do something like run Get-ExoMailbox to fetch a list of mailboxes, then run Connect-IPPSSession, do some work, and then run Get-ExoMailbox again. I may have used some bad words to fully express my opinion on the inanity of this approach.
The developers listened and V2.0.3 includes support for simultaneous connections to Exchange Online and the Security and Compliance endpoints.
One of the original characteristics of using the REST-based cmdlets like Get-ExoMailbox or Get-ExoMailboxStatistics was a need to “warm up” the connection. In other words, it took a while for the first connection to be established and ready for use. Microsoft says that V2.0.3 is much faster at making the initial connection and in practice it seems like the improvement is marked. Results will vary depending on the cmdlet and number of objects in the tenant, but the connections are certainly snappier than before.
Only 17 cmdlets are in the Exchange Online Management module, but when you connect to Exchange Online, over 700 cmdlets are imported into the session, all of which demand some memory. If you want to restrict memory usage to a minimum, you can specify the list of cmdlets needed by a session or script when you run the Connect-ExchangeOnline cmdlet. For example, this command will create a session with the 17 cmdlets from the module plus two imported from Exchange Online:
Connect-ExchangeOnline -CommandName Set-Mailbox, Set-CASMailbox
After the session starts, you will only be able to run Set-Mailbox and Set-CASMailbox from the set available for Exchange Online. Other cmdlets like Get-PublicFolder, New-TransportRule, or Get-UnifiedGroup are unavailable.
When you do update the Exchange Online Management module, make sure that you include the Scope parameter to force the install of the module files onto the local disk. Otherwise you might end up like me and have some modules in OneDrive for Business and others local, with all the confusion that entails. After removing all traces of previous versions to give myself a clean start, I ran:
Install-Module ExchangeOnlineManagement -Scope AllUsers -Force
To check that the module is in the right place, run the command below and make sure that the module isn’t located in OneDrive for Business:
Get-Module ExchangeOnlineManagement | Select Path

Path
----
C:\Program Files\WindowsPowerShell\Modules\ExchangeOnlineManagement\2.0.3\ExchangeOnlineManag...
For more information and lots of examples of using PowerShell to manage Exchange Online, subscribe to the Office 365 for IT Pros eBook.
]]>In July 2020, the Teams development group started the process of removing the dependency on the Skype for Business Online PowerShell connector to manage Teams policies through PowerShell. At the time, Teams introduced a preview version of the MicrosoftTeams module (1.1.3-preview) which included the New-CsOnlineSession cmdlet needed to create a connection to the Skype for Business Online endpoint and download the other Skype for Business Online cmdlets.
Update March 6, 2021: Microsoft has updated the Teams PowerShell module to V2.0. In general, it’s best to use the latest version of a module but test it first!
On September 14, Microsoft shipped version 1.1.6 of the MicrosoftTeams module. This is a full-blown production-quality release that includes New-CsOnlineSession. It’s recommended that you download and use this module for PowerShell activity against Teams and Skype for Business Online.
To upgrade a workstation from a previous version of the MicrosoftTeams module, run the Update-Module cmdlet. For example:
Update-Module MicrosoftTeams -Force -Scope AllUsers
Once the new module is installed, you can connect to the Teams and Skype for Business Online endpoints as normal:
Connect-MicrosoftTeams -Credential $O365Cred
$SfbSession = New-CsOnlineSession -Credential $O365Cred
Import-PSSession $SfbSession
In this example, the $O365Cred variable contains credentials prepopulated with a call to the Get-Credential cmdlet. After the session is established, you will be able to execute the cmdlets which used to be in the Skype for Business Online connector to manage Teams policies. For instance, you can call Get-CsTeamsMeetingPolicy to work with Teams meeting policies.
A small problem exists in that 1.1.6 does not include the Enable-CsOnlineSessionForReconnection cmdlet, which is used to maintain a connection to the Skype for Business Online endpoint. This is not an issue for short sessions where you connect, do some stuff, and terminate. It is if you want to leave a session open for hours. I am sure that Microsoft will update the module quickly to reintroduce the cmdlet, but in the interim you can use the workaround described here to get the cmdlet working as a script.
Alternatively, if you don’t remove the Skype for Business Online connector from your workstation, the Enable-CsOnlineSessionForReconnection cmdlet should be available after you connect to Skype for Business Online. I only noticed that the cmdlet was missing after removing the connector using Control Panel.
This is one of the small but important changes which happen all the time within Office 365. Stay up to date by subscribing to the Office 365 for IT Pros eBook. We’ll keep an eye on the important stuff for you!
]]>For the last few months, I have been dabbling with a PowerShell script to extract and report usage data for multiple Office 365 workloads from the Microsoft Graph. The idea is that an Office 365 user activity report generated by fetching activity data from all the workloads reported in the Graph helps administrators to figure out if accounts are in use and if so, what they are used for. If an account isn’t in use, then you might remove it and save some licenses.
One of the joys of PowerShell is how quickly you can put a solution together. The corollary is sometimes that the solution isn’t as efficient as it could be, which often happens when you’re not a professional programmer. When I write a script, the most important thing is often to illustrate a principle and show how something works. When PowerShell scripts are deployed into production, they’re usually upgraded and improved by programmers to meet organizational standards and fit in with other scripts used to manage the infrastructure. For this reason, I don’t bother too much with tweaking for performance.
This script is different. It’s been picked up by several tenants who reported that the script works but it’s slow when asked to process data for thousands of accounts. This deserved some investigation which produced some improvements, such as using PowerShell’s Where method to filter data.
But PowerShell is not a database and storing data about account usage in PowerShell list objects only scales so far. There are many web articles covering PowerShell performance with large amounts of data, many of which point to using hash tables because they are very efficient for finding and retrieving data (see this article about how to use hash tables).
A hash table is a collection of key/value pairs. The keys are unique, and the values are often some information associated with the key. For instance, because Office 365 objects like groups and sites store sensitivity labels as GUIDs, I often create a hash table composed of the GUID (key) and label display name (value) which I can then use to interpret the GUIDs stored in objects. Here’s what the code looks like:
$Labels = Get-Label        # Get set of current labels
$HashLabels = @{}          # Create hash table
# Populate the hash table with the GUID and display name of each label
$Labels.ForEach( { $HashLabels.Add([String]$_.ImmutableId, $_.DisplayName) } )
Anytime I need to find the display name of a label, I can do something like this:
$GUID = (Get-UnifiedGroup -Identity "Office 365 for IT Pros").SensitivityLabel.GUID
Write-Host "Display name of label is" $HashLabels[$GUID]

Display name of label is Limited Access
Apart from their usefulness in situations like described above, hash tables are very fast when you use keyed access. Speed being of the essence when thousands of records are to be processed, I decided to investigate if hash tables could replace the list objects used by the script.
Finding a key is no problem because the user principal name is unique for each account. Figuring out how to store all the data in the hash table value was another matter. That is, until I noticed that: ”the keys and values in a hash table can have any .NET object type…” In other words, you’re not limited to storing simple values in a hash table.
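For instance, a custom object can be stored as the value and fetched back by key (a minimal sketch; the property names are made up for illustration rather than taken from the script):

```powershell
# Store a custom object as the value for a user principal name key
# (the property names here are illustrative, not the script's actual schema)
$DataTable = @{}
$UserData = [PSCustomObject]@{
   TeamsLastActive = "05-Sep-2020"
   ExoLastActive   = "20-Aug-2020" }
$DataTable.Add("Kim.Akers@Office365itpros.com", $UserData)
# Keyed access returns the whole object, so its properties are available directly
$DataTable["Kim.Akers@Office365itpros.com"].TeamsLastActive
```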
When the script extracts usage data for a workload (like Teams or Exchange) from the Graph, it processes each record to create a list of accounts and their usage data for that workload. After some experimentation, I was able to populate the hash table with the user principal name of each account as the key and an array combining the usage data from all workloads as the value.
This might be inelegant, but it works. After all workloads are processed, the result is a hash table keyed on the user principal name with a value composed of an array containing the usage data for all workloads for that user. Access to the data is via the user principal name. For example:
$datatable["Kim.Akers@Office365itpros.com"]

TeamsUPN             : Kim.Akers@office365itpros.com
TeamsLastActive      : 05-Sep-2020
TeamsDaysSinceActive : 5
TeamsReportDate      : 07-Sep-2020
TeamsLicense         : POWER BI (FREE)+ENTERPRISE MOBILITY + SECURITY E5+OFFICE 365 E5 WITHOUT AUDIO CONFERENCING
TeamsChannelChats    : 7
TeamsPrivateChats    : 10
TeamsCalls           : 0
TeamsMeetings        : 5
TeamsRecordType      : Teams
ExoUPN               : Kim.Akers@office365itpros.com
ExoDisplayName       : Kim Akers
ExoLastActive        : 20-Aug-2020
ExoDaysSinceActive   : 21
ExoReportDate        : 08-Sep-2020
ExoSendCount         : 8
ExoReadCount         : 19
ExoReceiveCount      : 392
ExoIsDeleted         : False
ExoRecordType        : Exchange Activity
The display is truncated here to show the usage data for two of the six workloads extracted for an account.
Creating the report is then a matter of processing each account to extract the information and format the data. To do string comparisons and other calculations, I found it necessary to use the Out-String cmdlet to turn the properties taken from the array into trimmed strings. It might be something to do with the way that the hash table values are stitched together from multiple arrays.
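A sketch of that conversion (the variable and property names are illustrative, following the earlier example):

```powershell
# $DataTable and $UPN come from earlier processing; TeamsLastActive is one of
# the properties stored in the array that forms the hash table value
$TeamsLastActive = ($DataTable[$UPN].TeamsLastActive | Out-String).Trim()
If (-not [string]::IsNullOrEmpty($TeamsLastActive)) {
   Write-Host "Teams last active date:" $TeamsLastActive }
```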
After changing to hash tables, I observed a 70% improvement in script execution time in my (small) tenant. I expect an even bigger gain in larger tenants where the advantages of keyed hash table access become more pronounced. That expectation was borne out in a test against 20K accounts, which proved that the script is now capable of processing circa 1,000 accounts per minute (Figure 1).
Update September 18: I received a note saying that the script processed 26,808 accounts at the rate of 3184.71 per minute!
The time required to fetch data from the Graph is the same as previous versions as is the time to prepare data for processing. All the improvement is in the report generation, which is where the hash tables excel. The tenant who processed the script against 20,000 accounts used the Office 365 user activity report (example shown in Figure 2) to identify 70 accounts assigned Office 365 E5 licenses that can now be reallocated or released (a potential saving of $29,400 annually).
The Office 365 user activity report script is available from GitHub. If you have a suggestion for improving the performance further, please leave a comment on GitHub.
OK, we should be writing text for the Office 365 for IT Pros eBook instead of trying to work out how to speed up PowerShell scripts. But you learn a lot about an infrastructure when you program against it, so we’ll keep on scripting…
]]>Using the Where-Object cmdlet to select items from a set is a common operation for any Office 365 administrator who uses PowerShell to manage applications like Exchange Online, SharePoint Online, and Teams. From a performance perspective, it’s better to use a filter with the original call to find objects (like finding the Microsoft 365 Groups enabled for Teams or Yammer) because that avoids the need to extract a subset and a server-side filter might be available, but sometimes that’s not possible.
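For example, when team-enabled Microsoft 365 Groups are the target, a server-side filter does the selection before any objects reach the client (a sketch using the standard Exchange Online cmdlet):

```powershell
# Server-side filter: only team-enabled groups are returned by the server,
# so there's no need to fetch every group and filter afterwards
$TeamGroups = Get-UnifiedGroup -Filter {ResourceProvisioningOptions -eq "Team"} -ResultSize Unlimited
```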
Those of you who use our Graph-based script to report usage across multiple Office 365 workloads know that some recent work has been done to improve performance. This provoked a comment that replacing Where-Object with the Where method would deliver further improvement in instances where the script filters objects because it avoids the need to pipe objects to Where-Object. The Where method only works with arrays. For example, let’s assume that you load a bunch of mailboxes with a call like:
$Mbx = Get-ExoMailbox -ResultSize Unlimited
To filter the mailboxes with Where-Object and find any mailbox with the string “Tony” in the display name, you might do something like:
$Mbx | ? {$_.DisplayName -Like "*Tony*"}
The same can be done with the Where method by changing the code slightly:
$Mbx.Where({$_.DisplayName -Like "*Tony*"})
(In these examples, the ? shortcut is used instead of spelling out the full Where-Object cmdlet name). Note that if no objects are found by the filter, Where-Object returns $Null while Where returns an empty array. Read this article for more information about the ups and downs of the Where method.
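The difference in return values matters when testing the result of a filter, as this quick illustration shows (assuming $Mbx holds the mailboxes loaded earlier and nothing matches):

```powershell
# Where-Object returns $Null when nothing matches; the Where method returns
# an empty collection, so test the Count property instead
$NoHits1 = $Mbx | ? {$_.DisplayName -Like "*NoSuchUser*"}
$NoHits2 = $Mbx.Where({$_.DisplayName -Like "*NoSuchUser*"})
If ($NoHits1 -eq $Null)   { Write-Host "Where-Object found nothing" }
If ($NoHits2.Count -eq 0) { Write-Host "Where method found nothing" }
```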
Some initial tests proved that substituting the Where method is easy and delivered an improvement of between 12% and 15% in performance. We therefore went ahead and made the change in V1.3 of the GetGraphUserStatisticsReport.PS1 script, which is available from GitHub.
My original tests were done with a few thousand records. I was curious if the performance improvement was maintained with larger data sets. I therefore downloaded 75,000 Office 365 audit records and stored the items in an array, a data table, and a PowerShell list. In all cases, the Where method is faster.
Here’s an example of using the Where-Object cmdlet to extract a subset of records from a list. Six records are returned from 75,000 in 98 seconds.
Measure-Command {$LabelRecs = $Report | ? {$_.Operation -eq "Get-Label"}}

Days              : 0
Hours             : 0
Minutes           : 1
Seconds           : 37
Milliseconds      : 959
Ticks             : 979599545
TotalDays         : 0.00113379576967593
TotalHours        : 0.0272110984722222
Changing the code to use the Where method is very simple because the same script block is used, passed in parentheses. The code returns the same six records in 78.5 seconds.
Measure-Command {$LabelRecs = $Report.Where({$_.Operation -eq "Get-Label"})}

Days              : 0
Hours             : 0
Minutes           : 1
Seconds           : 18
Milliseconds      : 540
Ticks             : 785406751
TotalDays         : 0.000909035591435185
TotalHours        : 0.0218168541944444
TotalMinutes      : 1.30901125166667
TotalSeconds      : 78.5406751
TotalMilliseconds : 78540.6751
It’s hard to see why you should not consider upgrading Where-Object calls to use the Where method anywhere the output is not piped to another cmdlet. The switchover is easy and the performance gains are obvious. The next time you’re editing a script, consider using this approach. It won’t make slow cmdlets like Get-UnifiedGroup any faster, but it might quicken the surrounding processing.
Worrying about getting an extra 15% performance for PowerShell scripts might seem a strange thing for the Office 365 for IT Pros eBook team to worry about, but we write so much code to illustrate principles and give examples that we care about this kind of thing!
]]>Last June, I wrote about a PowerShell script to interrogate the Microsoft Graph to retrieve usage data from workloads like Exchange Online, SharePoint Online, and Teams. Some of this data is available via PowerShell cmdlets like Get-ExoMailboxStatistics and Get-SPOSite, but using the Graph is usually faster.
The idea behind the script is that if you retrieve usage information from a bunch of workloads, you have the basis to figure out what accounts are in active use. Potentially, you might even find some accounts that haven’t been used for a while that you can archive or remove to save some Office 365 licenses. Figure 1 shows the script output as viewed through the Out-GridView cmdlet.
The original script worked, but it had some performance issues once people tried to run it in tenants with thousands of accounts. Long-running PowerShell scripts are not unusual, but the issues people ran into pointed to unreasonable delays.
One of the advantages of storing scripts on GitHub is that others can work with the code to detect where problems might lurk. In this case, suspicion fell on how the script processed data in the array storing information fetched from the Graph. It was suggested that better results might be achieved by changing to a PowerShell DataTable object. I did some work to test the assertion but didn’t see a huge gain, probably because I don’t have a big enough tenant to test.
In any case, a simpler fix became obvious. Instead of scanning the big list containing all the records for the records (usually six) for each user, it’s much better to extract the records for the user into an array and use that array. The change is made in version 1.2 of the script, which is now available to download from GitHub. (update: The current version of the script is 2.0).
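In outline, the change looks something like this (the variable and property names are illustrative rather than the script’s exact code):

```powershell
# Instead of rescanning the big list for every lookup, pull the (usually six)
# workload records for one user into a small array and work with that
# ($Records holds all the records fetched from the Graph; UPN is an
# illustrative property name)
ForEach ($User in $Users) {
   $UserRecords = $Records | Where-Object {$_.UPN -eq $User.UserPrincipalName}
   # ... process $UserRecords to assess the user's activity ...
}
```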
It would be helpful if those who have admin permissions in tenants with more than a few thousand accounts could test the script (after carefully checking that the script is OK to run in your tenant, creating the necessary registered app in Azure AD, and so on). If you find problems, record them in GitHub and the community will have a chance to work the issues.
We’ve just updated the script to V1.3 to incorporate another speed tweak. The Where method is more appropriate and faster when you only want to extract data from a set and don’t need to feed that data through the pipeline. The script extracts records for each user to create an assessment of the user’s activity and using the Where method is faster… A good lesson learned!
Learn more about Office 365 tenant administration by subscribing to the only eBook that’s updated monthly: Office 365 for IT Pros.
]]>Updated 1-Nov-2023
The writing team for the Office 365 for IT Pros eBook very much appreciate the work done by our redoubtable technical editor, Vasil Michev, when he probes and questions text in the book chapters. It’s nice to get some of our own back when Vasil commits himself to print.
Take his article on certificate-based authentication for Exchange Online PowerShell. There’s a lot of good stuff here, but Vasil’s mind runs at such a fast rate that he sometimes omits details when he explains something, and those details can stop people from mastering a topic. This is when people with slower CPUs, like me, step in to ask irritating questions.
Batch jobs that need to interact with Exchange Online PowerShell cannot use basic authentication any longer. The solution is to use certificate-based authentication instead. The mechanism might seem complex when you first read Microsoft’s instructions, but it can be boiled down to three points: create an Entra ID app and assign it the necessary API permission and administrative role; create a self-signed certificate and upload it to the app; and connect by passing the app identifier, certificate thumbprint, and tenant name to Connect-ExchangeOnline.
The same process to enable certificate-based authentication can be used with other Microsoft 365 modules that support certificate-based authentication such as the Microsoft Teams module.
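For instance, the Connect-MicrosoftTeams cmdlet accepts the same kind of certificate details (the thumbprint, app id, and tenant name here are the placeholder values used later in this post):

```powershell
# Certificate-based authentication works the same way for the Teams module
# (the thumbprint, app id, and tenant name are placeholder values)
Connect-MicrosoftTeams -CertificateThumbprint "40EED7993F65D8CF13D5ABAC87F3AAD307012D22" `
  -ApplicationId "b83c46c6-044e-40e5-929c-634f80045a11" -TenantId "tenant.onmicrosoft.com"
```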
Creating the app and assigning the necessary Exchange.ManageAsApp permission and administrative role is quickly done in the Entra ID admin center. These are one-time operations that don’t need to be automated in code. However, it’s worth noting that role assignments can be made with PowerShell. It’s useful to know how to do this because you might use the technique in future to assign a role to a user.
This code fetches the object identifier for the app (it’s called “Exo Background process”) and assigns the Exchange Administrator directory role to the app.
$ExoAppSp = (Get-MgServicePrincipal -Filter "DisplayName eq 'Exo Background Process'").Id
$ExoRoleId = (Get-MgDirectoryRole | Where-Object {$_.DisplayName -eq "Exchange Administrator"}).Id
$NewAssignee = @{ "@odata.id" = ("https://graph.microsoft.com/v1.0/directoryObjects/{0}" -f $ExoAppSp) }
New-MgDirectoryRoleMemberByRef -DirectoryRoleId $ExoRoleId -BodyParameter $NewAssignee
The same approach is used to assign a role to a user. The difference is that the object identifier for the user is fetched with the Get-MgUser cmdlet. This example shows how to assign the Global Reader role to a user.
$UserId = (Get-MgUser -UserId Otto.Flick@Office365itpros.com).Id
$RoleId = (Get-MgDirectoryRole | Where-Object {$_.DisplayName -eq "Global Reader"}).Id
$NewAssignee = @{ "@odata.id" = ("https://graph.microsoft.com/v1.0/directoryObjects/{0}" -f $UserId) }
New-MgDirectoryRoleMemberByRef -DirectoryRoleId $RoleId -BodyParameter $NewAssignee
With that diversion taken care of, we can proceed to obtaining a self-signed X.509 certificate, which is where people sometimes become stuck (I did).
You need to upload a suitable X.509 self-signed certificate to the Azure AD app to create an association between the two. Certificates are not the easiest of objects to work with, but in this case it’s straightforward.
If you don’t have a suitable certificate to hand, you must generate one. Vasil was short on detail on this point (until I asked him how he had generated a certificate). Microsoft recommends using either the Create-SelfSignedCertificate.ps1 script or the MakeCert command-line utility. These are certainly viable options, but the easiest way is to run the New-SelfSignedCertificate cmdlet using a command like this:
New-SelfSignedCertificate -Subject "Exo Background Process" -CertStoreLocation "cert:\CurrentUser\My" -KeySpec KeyExchange -FriendlyName "For EXO V2 Background Jobs"
This command creates a certificate valid for a year in the personal store of the user, which is fine for testing (Figure 1). Obviously, in a production environment, you’d create the certificate in the personal store of the account that will be used to run batch jobs.
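If a one-year lifetime is too short, the NotAfter parameter of New-SelfSignedCertificate controls the validity period. For example:

```powershell
# Create a certificate valid for two years instead of the default one year
New-SelfSignedCertificate -Subject "Exo Background Process" -CertStoreLocation "cert:\CurrentUser\My" `
  -KeySpec KeyExchange -FriendlyName "For EXO V2 Background Jobs" -NotAfter (Get-Date).AddYears(2)
```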
To associate the certificate with the app, export the certificate as a DER-encoded binary X.509 file (Figure 2). You can call the file anything you like, but you should give it a .CER extension.
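The export can also be done with PowerShell instead of the certificate MMC snap-in (the thumbprint here is a placeholder for the certificate created earlier):

```powershell
# Export the public part of the certificate as a DER-encoded .CER file
# (substitute the thumbprint of the certificate created earlier)
$Cert = Get-ChildItem -Path cert:\CurrentUser\My\40EED7993F65D8CF13D5ABAC87F3AAD307012D22
Export-Certificate -Cert $Cert -FilePath C:\Temp\ExoBackgroundProcess.cer -Type CERT
```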
Finally, upload the exported file to the app in the Azure AD portal. This will generate a thumbprint that you need to note (Figure 3).
After setting up the app, the three vital pieces of information we need to connect are the app identifier (client id), the certificate thumbprint, and the tenant (organization) name.
With these values, you can connect to Exchange Online using certificate-based authentication with a command like:
Connect-ExchangeOnline -CertificateThumbprint "40EED7993F65D8CF13D5ABAC87F3AAD307012D22" -AppId "b83c46c6-044e-40e5-929c-634f80045a11" -ShowBanner:$false -Organization tenant.onmicrosoft.com
There are obviously more complications that await the unwary along the way, but this is enough to connect and play with Exchange Online PowerShell in batch jobs. Vasil’s post contains a lot of detail and there will be more articles and guidance published as we approach the deadline for basic authentication to disappear next year, which is a good reason to subscribe to a reliable source of information like the Office 365 for IT Pros eBook.
]]>In a previous post, we covered the basics of reviewing email quarantined by Exchange Online Protection using the Security and Compliance Center. As discussed there, it’s important to review quarantined email to understand if any messages which shouldn’t be blocked are trapped there waiting for release. No one wants to have an important message expire in the quarantine (after 15 days by default) and fail to reach its intended recipient.
The problem is the time needed to review quarantined messages for a busy tenant. Scrolling up and down a large list to decide whether to release messages can consume hours, especially if you don’t allow users to release quarantined email.
Exchange Online includes several cmdlets to work with quarantined messages. It might be easier to run a daily job to grab details of what’s waiting in the quarantine, do some basic analysis, and create a CSV file of the messages that can be reviewed. Any messages that shouldn’t be released can be removed from the file, and the remainder released for delivery.
I created a script (downloadable from GitHub) to illustrate the principle. The script fetches details of messages in quarantine using the Get-QuarantineMessage cmdlet and populates a PowerShell list with details of each message. You could use the output of Get-QuarantineMessage directly, but this approach allows for some additional processing of each message, such as extracting its source domain and calculating how much longer it will remain in quarantine.
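The per-message processing looks something like this (a simplified sketch; the calculated values rely on the ReceivedTime, SenderAddress, Type, and Expires properties returned by Get-QuarantineMessage):

```powershell
# Fetch quarantined messages and build a list with some calculated values
$Report = [System.Collections.Generic.List[Object]]::new()
[array]$QMessages = Get-QuarantineMessage
ForEach ($Q in $QMessages) {
   $ReportLine = [PSCustomObject]@{
      Received     = $Q.ReceivedTime
      Subject      = $Q.Subject
      Sender       = $Q.SenderAddress
      SourceDomain = $Q.SenderAddress.Split("@")[1]       # Extract the source domain
      Type         = $Q.Type                              # Spam, Phish, Malware, etc.
      DaysLeft     = ($Q.Expires - (Get-Date)).Days }     # Days before the message expires
   $Report.Add($ReportLine) }
```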
We then use the list to do some basic analysis to find out why messages are being quarantined, who’s receiving these messages, and where the messages come from:
$Report | Group Type | Sort Count -Descending | Format-Table Name, Count

Type of Quarantined messages

Count Name
----- ----
   10 Spam
    5 Phish
    3 HighConfPhish
    1 Malware

Messages quarantined per recipient address
Finally, we export the messages to a CSV file. The intention here is that someone can review the list of messages and decide which to release for onward delivery. All other lines in the CSV file are removed. To release the messages, we can then import message details from the CSV and use the Release-QuarantineMessage cmdlet to release them:
Import-CSV c:\temp\QuarantinedMessages.csv | Release-QuarantineMessage -ReleaseToAll
It’s all very straightforward PowerShell so you can customize it to add whatever idea you think is valuable. For instance, you could email the CSV file to reviewers.
Simple ideas can be the best. And applying PowerShell to solve problems is a simple idea that works well in lots of places within Office 365. Which is why the Office 365 for IT Pros eBook includes so many examples of PowerShell in action.
]]>Chapter 13 of Office 365 for IT Pros is where we tackle the subject of how to use PowerShell to manage Microsoft 365 Groups and Teams. For the last two versions, we’ve included a script to demonstrate how to archive a group at the end of its useful lifetime, such as when a project finishes.
The script works by removing all the current owners and members and replacing them with a single owner/member, which we’ll call the compliance manager. The script also removes the group from address lists and hides it from Exchange clients to make sure that users can’t browse to find it and updates the SMTP address and display name to show that the group is archived.
The idea is to keep the group and its resources online in a state where eDiscovery can still find information easily and the group can be resuscitated quickly if necessary. Teams has a menu option to archive a team by making it read-only. This is essentially the same, but the approach works for all groups.
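The steps described above can be sketched for a single group like this (a simplified outline; the group alias and compliance manager address are hypothetical, and the full script on GitHub also handles removing the existing owners and members, updating the SMTP address, and logging):

```powershell
# Simplified outline of archiving one group (the full script does more)
$GroupId = "ProjectX"                                          # Alias of the group to archive
$ComplianceManager = "Compliance.Manager@office365itpros.com"  # Hypothetical account
# Add the compliance manager as a member and owner before removing everyone else
Add-UnifiedGroupLinks -Identity $GroupId -LinkType Members -Links $ComplianceManager
Add-UnifiedGroupLinks -Identity $GroupId -LinkType Owners -Links $ComplianceManager
# Hide the group from address lists and Exchange clients and flag it as archived
Set-UnifiedGroup -Identity $GroupId -HiddenFromAddressListsEnabled $True -HiddenFromExchangeClientsEnabled
Set-UnifiedGroup -Identity $GroupId -DisplayName ("(Archived) " + (Get-UnifiedGroup -Identity $GroupId).DisplayName)
```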
We received a request through the Microsoft Technical Community for a script that could archive 600 teams at the end of an academic year. When you think about it, archiving the teams (groups) used for classes is something that schools and universities probably must do each year.
Everyone loves the chance to revisit old code (don’t they?). In response, we’ve updated the script (you can download a copy from GitHub) and will include it in the August update for the Office 365 for IT Pros (2021 edition) eBook.
The updated script:
Everything that the script does is logged and output to a CSV file (Figure 1 shows the output as piped to the Out-GridView cmdlet).
The script uses the Exchange Online PowerShell module and you must connect with a tenant admin account to be able to update group settings.
An approach which might be easier to take when processing hundreds of groups (as in at the end of a school term or academic year) is to export details of all the groups in the tenant to a CSV file, review the file to remove the groups that shouldn’t be archived, and then feed the edited file back into the script.
For example, this code finds groups and creates a CSV file for review:
# Find details of all groups in the tenant and export to CSV
$Groups = Get-UnifiedGroup -ResultSize Unlimited | Select DisplayName, Notes, DistinguishedName, Alias, PrimarySmtpAddress, SensitivityLabel
$Groups | Sort DisplayName | Export-CSV -NoTypeInformation c:\temp\GroupsForReview.CSV
After editing the CSV file (Figure 2) to remove the groups you don’t want to archive and saving the file, we can then replace the call to Get-UnifiedGroup in the script with:
$ArchiveGroups = Import-CSV c:\temp\GroupsForReview.CSV
The rest of the code in the script is unaltered.
If you’re not in an education environment, you could use the same approach to archiving Groups and Teams. In this case, you could run the Groups and Teams Activity Report to find underused or inactive groups to archive.
As with everything in PowerShell, you can update the code as you like to fit the circumstances of your tenant. Have fun!
]]>Updated: 15 August 2023
It’s important to keep PowerShell modules updated because Microsoft introduces new cmdlets and cmdlet parameters to support new functionality, like SharePoint Online site URL rename. Or when Microsoft updates the Microsoft Graph PowerShell SDK or MicrosoftTeams modules, both of which receive updates on a monthly (or even more frequent) basis.
Most of the important PowerShell modules used with Microsoft 365 are now available in the PowerShell Gallery and can be installed and updated from there. It seemed appropriate to write a script to:
Here are the basic processing steps for the original version of the script (I update the code on an ongoing basis to cope with changes, such as the split of the Microsoft Graph PowerShell SDK into V1.0 and beta modules from V2 onward). The script must be run after starting PowerShell as an administrator (otherwise you won’t be able to install the updates). Make sure that all other PowerShell sessions are ended to avoid the possibility of trying to update a module that’s in use.
# Define the set of modules installed and updated from the PowerShell Gallery that we want to maintain
$O365Modules = @("MicrosoftTeams", "Microsoft.Graph", "ExchangeOnlineManagement", "Microsoft.Online.Sharepoint.PowerShell", "ORCA")
# Check and update all modules to make sure that we're at the latest version
ForEach ($Module in $O365Modules) {
   Write-Host "Checking and updating module" $Module
   Update-Module $Module -Force -Scope AllUsers }
# Check and remove older versions of the modules from the PC
ForEach ($Module in $O365Modules) {
   Write-Host "Checking for older versions of" $Module
   $AllVersions = Get-InstalledModule -Name $Module -AllVersions
   $AllVersions = $AllVersions | Sort-Object PublishedDate -Descending
   $MostRecentVersion = $AllVersions[0].Version
   Write-Host "Most recent version of" $Module "is" $MostRecentVersion "published on" (Get-Date($AllVersions[0].PublishedDate) -format g)
   If ($AllVersions.Count -gt 1) { # More than a single version installed
      ForEach ($Version in $AllVersions) { # Check each version and remove old versions
         If ($Version.Version -ne $MostRecentVersion) { # Old version - remove
            Write-Host "Uninstalling version" $Version.Version "of Module" $Module -foregroundcolor Red
            Uninstall-Module -Name $Module -RequiredVersion $Version.Version -Force
         } #End if
      } #End ForEach
   } #End If
} #End ForEach
You can download the latest version of the script from GitHub and amend it to update the modules you use. Because I use the script to keep my PCs updated with the latest PowerShell modules, the current code is more developed and comprehensive than the snippet shown above. The script isn’t perfect, but it gets the job done for me.
Figure 1 shows some typical output as the script processes the set of Microsoft 365 PowerShell modules and updates those when a new version is available in the PowerShell gallery.
The script installs and updates modules in the $env:ProgramFiles\PowerShell\Modules location to make sure that the files are on the local workstation and available to all users. From PowerShell 6 onward, the default is to install modules in $HOME\Documents\PowerShell\Modules, which might mean that they end up in OneDrive.
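To see where the files for an installed module actually live, and to make sure new installations land in the machine-wide location rather than a OneDrive-redirected Documents folder, commands like these help (MicrosoftTeams is just an example module name):

```powershell
# See every installed copy of a module and where its files live
Get-Module -ListAvailable -Name MicrosoftTeams | Format-Table Name, Version, Path

# Install to the machine-wide location (requires an elevated session)
Install-Module -Name MicrosoftTeams -Scope AllUsers -Force
```

If the Path column points at a folder under OneDrive, that copy of the module was installed with the default CurrentUser scope.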
Given the cadence of Microsoft updates for PowerShell modules, it’s a good idea to update PowerShell modules every four to six weeks. Perhaps reflecting the ongoing growth in Graph APIs, the Microsoft Graph PowerShell SDK receives even more frequent updates. If you use the Graph SDK with Azure Automation, make sure that you update PowerShell modules for the Graph SDK in your Azure Automation accounts too.
Checking, updating, and removing PowerShell modules is not a fast process, but it does help to keep your scripting in good health.
The Office 365 for IT Pros eBook includes extensive coverage of managing Office 365 applications with PowerShell, so we pay attention to this kind of stuff – just like you should.
]]>Let’s assume that you’ve decided to replace the text-only classifications defined in the Azure Active Directory policy for Groups with Office 365 Sensitivity Labels. All is well, and you might even have used the PowerShell code explained in this article to do the job and all your teams, groups, and sites are now labelled properly.
You then consider an issue that will be dealt with differently from tenant to tenant: Assigning a label to a container that limits guest members does not affect access for existing guests. In other words, assigning a label to block guest access to a team, group, or site does precisely zero to remove any existing guests. They remain in the group membership and their access to group resources continues unimpeded.
If this is a concern and you want to be sure that containers marked with a high degree of sensitivity do not have guest members, you should check the membership of these groups and remove any guests. This is simple to do with PowerShell. In this example, we find the groups stamped with a specific sensitivity label that have guest members and report who those guests are.
The code is straightforward.
CLS; Write-Host "Finding confidential Microsoft 365 Groups..."
$Groups = Get-UnifiedGroup | ? {$_.SensitivityLabel -eq "1b070e6f-4b3c-4534-95c4-08335a5ca610" -and $_.GroupExternalMemberCount -gt 0}
If (!$Groups.Count) { Write-Host "No Microsoft 365 Groups found with that label" }
Else {
   $Report = [System.Collections.Generic.List[Object]]::new(); $NumberGuests = 0
   Write-Host "Now examining the membership of" $Groups.Count "groups to find guests..."
   ForEach ($Group in $Groups) {
      Write-Host "Processing" $Group.DisplayName
      $Users = Get-UnifiedGroupLinks -Identity $Group.Alias -LinkType Members
      ForEach ($U in $Users) {
         If ($U.Name -Match "#EXT#" -and $U.Name -NotLike "*teams.ms*") {
            ## Remember to edit the string to make sure it’s your tenant name…
            $CheckName = $U.Name + "@EditMeTenantName.onmicrosoft.com"
            $User = (Get-AzureADUser -ObjectId $CheckName).DisplayName
            $ReportLine = [PSCustomObject]@{
               Email = $U.Name
               User  = $User
               Group = $Group.DisplayName
               Site  = $Group.SharePointSiteURL }
            $Report.Add($ReportLine)
            $NumberGuests++ }
      }
   }
}
Write-Host "All done." $NumberGuests "guests found in" $Groups.Count "groups"
$Report | Sort Email | Out-GridView
The output is in a PowerShell list that we can review through the Out-GridView cmdlet (Figure 1) or by writing to a CSV file. After finding guests in groups where they are now prohibited, you can make the decision to leave them in place or remove them from the membership.
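If the decision is to remove a guest, the Remove-UnifiedGroupLinks cmdlet does the job. In this sketch, the group name and the guest’s external member name are placeholders to replace with values from the report:

```powershell
# Remove a guest account from the membership of a group
Remove-UnifiedGroupLinks -Identity "Confidential Project" -LinkType Members `
   -Links "Guest.User_outlook.com#EXT#@EditMeTenantName.onmicrosoft.com" -Confirm:$false
```

Removing the membership link also removes the guest’s access to the group’s SharePoint site and Teams channels.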
A more developed version of the script would first figure out which labels block guest access and then loop through all groups with these labels to create a report for all such labels. We explain how in the Office 365 for IT Pros eBook.
It’s worth noting that, if necessary, a global administrator can add a guest to a group even when blocked by policy.
Thinking about problems like this is what drives the Office 365 for IT Pros writing team to continually improve and refine our text about different aspects of Office 365. It’s why we issue a completely new book to our subscribers every month. Join us by taking out a subscription.
]]>One of my posts describes a PowerShell script which uses the Send-MailMessage cmdlet to send messages from an Exchange Online mailbox, in this case to generate a welcome message to new tenant users. Send-MailMessage uses an SMTP AUTH connection, one of the methods the Exchange Online development group wants to upgrade to use modern authentication in their drive to eliminate basic authentication.
Microsoft hasn’t said exactly when SMTP AUTH connections will be able to use modern authentication, but it will happen, and it’s good to be prepared.
Update: OAuth support is now available for SMTP AUTH.
If you use the Send-MailMessage cmdlet with Exchange Online, you’ll have to upgrade your scripts to make sure that they continue working when the block on basic authentication descends. It’s a good idea to take an inventory of scripts to know where work needs to be done.
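A quick way to take that inventory is to scan your script folders for references to the cmdlet (the path here is an example to adjust for your environment):

```powershell
# Find scripts that call Send-MailMessage and report where the calls are
Get-ChildItem -Path C:\Scripts -Filter *.ps1 -Recurse |
   Select-String -Pattern 'Send-MailMessage' |
   Select-Object Path, LineNumber, Line
```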
But you shouldn’t only focus on Send-MailMessage. Older scripts might use a combination of the .NET SmtpClient and MailMessage classes to send email from PowerShell. These classes are more general-purpose because they’re designed to be used to connect with a range of SMTP servers from programs built with different languages. However, the same SMTP AUTH restrictions apply when scripts use these classes to send email with Exchange Online.
Here’s an example of using the .NET classes to send a message. You can do everything shown here with Send-MailMessage except add an x-header.
# Example of how to send a message using the .NET SmtpClient and MailMessage classes
If (-not $O365Cred) { # Make sure we have credentials
   $O365Cred = (Get-Credential) }
$MsgFrom = "Info@office365itpros.com"
$MsgTo = "Mybestfriend@outlook.com"
$SmtpServer = "smtp.office365.com" ; $SmtpPort = "587"
# Build message properties
$Message = New-Object System.Net.Mail.MailMessage $MsgFrom, $MsgTo
$Message.Subject = "Example Message"
$Message.Attachments.Add("C:\Temp\Office365TenantUsage.csv")
$Message.Headers.Add("X-O365ITPros-Header","Important Email")
$Message.Body = Get-Content c:\temp\textforemail.html
$Message.IsBodyHTML = $True
# Build the SMTP client object (before setting its properties) and send the message off
$Smtp = New-Object Net.Mail.SmtpClient($SmtpServer, $SmtpPort)
$Smtp.EnableSsl = $True
$Smtp.Credentials = $O365Cred
$Smtp.Send($Message)
Apart from the need to update scripts using the .NET classes when Microsoft deprecates basic authentication for SMTP AUTH, you should be aware that the SmtpClient class is already deprecated and could therefore become unavailable to PowerShell in the future. For that reason, while you’re scanning scripts to prepare to upgrade them for modern authentication, make a mental note to look out for these classes so that the code is also upgraded when you update with a more modern method to send email.
Stay updated with change inside Exchange Online by subscribing to the Office 365 for IT Pros eBook. We stay ahead of the game so you don’t have to worry.
]]>Generating a report about some aspect of Office 365 is all very well, but it doesn’t lead to much unless there’s some action that can be easily taken due to the reported data. Take the report on Exchange Online SendAs and other permissions. It’s nice to know which accounts hold permissions over different mailboxes, but what will you do with that information?
In a small tenant, it might be easy to review the data and identify problems, like someone keeping Send As permission for a shared mailbox long past the time when their job mandates this access. In a medium to large tenant, you can slice and dice the report data to highlight issues, but it’s a lot harder to pinpoint definite problems.
Microsoft is adding a lot of machine learning and artificial intelligence to Office 365 at present. Taking that as a hint, we can use technology to help filter the report data and identify the accounts to focus on. And best of all, this is easy to do in PowerShell.
The output from the script is a CSV file listing all the Send As, Full Access, and Send On Behalf Of permissions assigned to accounts. The first step is to read in the data from the CSV file, filtering items so that only SendAs records are loaded into an array.
The next step is to find a way to check each SendAs assignment against usage. The easiest way I know is to search the Office 365 audit log for SendAs events. Data is kept for 90 days for Office 365 E3 accounts (but only if their mailboxes are enabled). Data is kept for 365 days if an account has an Office 365 E5 license. Either way, I think 90 days is enough on the basis that if someone hasn’t used their right to impersonate another user or shared mailbox to send email in the last three months, maybe they don’t need that permission and it can be removed.
We can collect the SendAs events for the last 90 days by running the Search-UnifiedAuditLog cmdlet, unpacking the AuditData content in each audit event, and storing the data. Fortunately, we already have a script to do the job, which stores its output in another CSV file.
A few lines of code later, we have the SendAs audit events loaded and we’re ready to start checking. The basic idea is to go through the assignments and check each against the audit data to see if the permission has been used. If it has, we store some usage details (last time and number of uses), and if it hasn’t, we note that fact too.
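The matching step described above can be sketched as follows. The file names and column names are assumptions for illustration; adjust them to match the CSV files produced by the permissions report and the audit log search:

```powershell
# Load the two inputs (hypothetical file and column names)
[array]$SendAsPermissions = Import-Csv -Path C:\Temp\MailboxPermissions.csv |
   Where-Object {$_.Permission -eq 'SendAs'}
$AuditEvents = Import-Csv -Path C:\Temp\SendAsAuditRecords.csv

# Check each assignment against the audit events to see if it has been used
$Results = ForEach ($Permission in $SendAsPermissions) {
   $Found = $AuditEvents | Where-Object {
      $_.SendAsUser -eq $Permission.AssignedTo -and $_.Mailbox -eq $Permission.Mailbox }
   [PSCustomObject]@{
      Mailbox    = $Permission.Mailbox
      AssignedTo = $Permission.AssignedTo
      Used       = [bool]$Found
      TimesUsed  = @($Found).Count
      LastUsed   = ($Found | Sort-Object TimeStamp -Descending | Select-Object -First 1).TimeStamp }
}
$Results | Sort-Object Used | Out-GridView
```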
Hey Presto! After running through the permissions, we have a filtered list of accounts who haven’t used their assigned Send As permissions in the last three months and another set who have. You can review the assignments by piping the analyzed data to Out-GridView (Figure 1) or use the output CSV file for further processing.
You can pick up a copy of the script to analyze and filter SendAs records from GitHub. Remember that you need the other scripts to fetch SendAs records from the Office 365 audit log and report mailbox permissions to provide the two inputs.
At this point, the script finishes and it’s up to the tenant administrators to decide what to do about the defunct permissions. Perhaps you want to send a polite email to users to tell them that you plan to remove the permission in a week’s time, or maybe you just go ahead and remove the permission on the basis that if anyone misses it, they’ll scream.
This is a great example of how to put together PowerShell scripts as building blocks for a solution. The code isn’t all that complex. It’s simply a matter of knowing where to find the data and how to use it. Isn’t that always the case?
]]>A question popped up in an online group: How can I create a report for each user detailing the interaction with documents stored in SharePoint Online libraries? The answer seems straightforward: search the Office 365 audit log for SharePoint document operations and create a report from the events found, outputting it in CSV or HTML format. Chapter 21 of the Office 365 for IT Pros eBook includes many examples of how to extract information from the audit log that could be used as the basis for a solution. The post covering how to answer the question of who updated a document is also helpful.
Often the reports generated from the audit log cover actions taken by multiple users. In this case, the request is to generate a report on a per-user basis. Possibly the desire is to email the report to the user, or maybe the feeling is that it is easier to review access to sites and documents on a personal level.
The first thing to resolve is what’s intended by “user”? We need to know this to generate the reports. A user could mean:
From a PowerShell perspective, you can generate a list of mailbox owners with Get-ExoMailbox (people with mailboxes are likely to have SharePoint Online licenses).
$Users = Get-ExoMailbox -RecipientTypeDetails UserMailbox -ResultSize Unlimited | Select UserPrincipalName,DisplayName
Alternatively, if you want to include guest accounts, you can create a list with Get-AzureADUser and include accounts of type Member (tenant account) and Guest.
$Users = Get-AzureADUser -All $True -Filter ("UserType eq 'Guest' or UserType eq 'Member'") | Select UserPrincipalName, DisplayName
You could filter the list further by removing tenant accounts who aren’t licensed for SharePoint Online. This is easy to do, but it’s probably not necessary because the report is generated from audit events that won’t exist unless an account is licensed.
We’re going to search the Office 365 audit log for events generated by all users. The other search parameters needed are:
The events to look for: Depending on the applications used in a tenant, the audit log could include up to 1,500 different events. In this case, we want to know about events which manipulate documents stored in SharePoint or OneDrive for Business. Five events should suffice:
Although you can input the events directly into the search command, it’s easier to declare the set of events in an array:
$Operations = @('FileAccessed', 'FileDownloaded', 'FileModified', 'FileDeleted', 'FileUploaded')
The start and end date for the search. SharePoint Online is a verbose application when it comes to the generation of audit log records. To make processing easier, restrict the date range as much as possible. You can go back 90 days for Office 365 E3 accounts and 365 days for Office 365 E5 accounts.
In large tenants, consider splitting the processing up over several batches as otherwise the script will likely take a long time to complete. The easiest way to do this is to amend the script to create a filtered set of users and use the filtered list as input to the audit log search. This example uses the Get-ExoMailbox cmdlet with a filter applied to the CustomAttribute1 property to find a set of users:
$Users = Get-ExoMailbox -Filter {CustomAttribute1 -eq "Sales"} | Select -ExpandProperty UserPrincipalName
Search-UnifiedAuditLog -Operations $Operations -UserIds $Users -StartDate $StartDate -EndDate $EndDate -ResultSize 5000
The Search-UnifiedAuditLog cmdlet is restricted to returning a maximum of 5,000 audit records at one time. More records might exist, and in this case, you must run the cmdlet repeatedly until all available data is retrieved. Search-UnifiedAuditLog supports the retrieval of large amounts of data (up to 50,000 records) by allowing you to declare a session identifier (a value to link calls together) together with the ReturnLargeSet session command. The data is unsorted when fetched, so it must be sorted for reporting purposes. If more than 50,000 audit records are available, you’ll have to divide processing up across multiple runs.
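A paging loop along these lines fetches the full result set in batches (a sketch; the session identifier is an arbitrary string of your choosing):

```powershell
# Fetch audit records in batches of 5,000 using a session identifier
[array]$Records = @()
Do {
   $Batch = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate `
      -Operations $Operations -SessionId "DocActivitySearch" `
      -SessionCommand ReturnLargeSet -ResultSize 5000
   $Records += $Batch
} While ($Batch.Count -eq 5000)

# The data comes back unsorted, so sort by creation date before processing
$Records = $Records | Sort-Object {$_.CreationDate -as [datetime]} -Descending
```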
It’s possible to take the raw data from audit records and output the records to a CSV file. However, I like to process Office 365 audit records to make more sense of what they contain. In this case, the script does the following:
The processed audit records go into a PowerShell list object. This is much more efficient than adding records to an array. And we can do some rudimentary processing to generate some insight into what’s happening. For example, what kind of file operations are performed:
$Report | Group Operation | Format-Table Name, Count

Name           Count
----           -----
FileAccessed    3594
FileUploaded     341
FileModified    3186
FileDownloaded    86
FileDeleted      327
Or the people who are creating documents:
$Report | Group UPN | Sort Count -Descending | Format-Table Name, Count

Name                                                                          Count
----                                                                          -----
tony.redmond@office365itpros.com                                               5669
michael.van.horenbeeck_thecollective.eu#ext#@office365itpros.onmicrosoft.com    690
jcgonzalez_itechcs.onmicrosoft.com#ext#@office365itpros.onmicrosoft.com         374
To create a report for each active user, we can loop through the set of users we created beforehand and extract the records for the selected user and write them out to a CSV file:
$UserRecords = $Report | ? {$_.UPN -eq $U.UserPrincipalName}
If ($UserRecords) {
   $UserReports++
   Write-Host "Writing out data for" $U.DisplayName
   $FileName = "c:\Temp\AuditHistory" + $U.UserPrincipalName + ".csv"
   $UserRecords | Export-CSV -NoTypeInformation $FileName }
Figure 1 shows what the contents of a CSV file looks like:
The per-user CSV files are created in the c:\temp\ directory (Figure 2), so it would be easy to find them and email them to the users… But that’s another day’s work.
In the meantime, the complete script containing everything described above is available for download from GitHub. Happy PowerShell!
]]>Some important changes are available in recent refreshes for the Exchange Online and SharePoint Online PowerShell modules. In general, it’s good practice to download and use the latest available module to take advantage of bug fixes and new functionality. The problem is knowing when these updates are available as few of us have the time to check.
The latest versions of these modules are:
Exchange Online PowerShell V2 is the module containing the new REST-based cmdlets (like Get-ExoMailbox). The module also includes access to the older Remote PowerShell cmdlets (like Get-Mailbox). You should be using this module whenever possible, especially when needing to deal with large sets of mailboxes or mailbox-associated objects.
Over time, as Microsoft removes the ability to connect to PowerShell with basic authentication (along with ActiveSync, IMAP4, POP3, and SMTP), the V2 module will become the only way to access Exchange Online with PowerShell.
Notable updates in the current Exchange Online PowerShell V2 module are:
Version 0.4368.1 introduced the Connect-IPPSSession cmdlet as a way to connect to the Compliance Center endpoint. There’s no logic behind the name, which some speculate means Information Protection PowerShell (IPPS). The cmdlet has been around for a while and now joins the Exchange Online management module.
For the moment, I don’t recommend that you use the Connect-IPPSSession cmdlet. Although it does load the Compliance Center cmdlets into a session, it does so by removing any previous session connected to Exchange Online, which means that you end up in a situation where you can’t use the two sets of cmdlets in the same session. This problem has been around since 2017 and Microsoft didn’t fix it when the cmdlet transitioned to the Exchange Online management module.
The older approach supports the use of both sets of cmdlets, even when cmdlets with the same names exist in the two sets, by using the AllowClobber parameter when importing the cmdlets with Import-PSSession. A great example of how this is done is in Michel de Rooij’s mega-script to connect to Office 365 services with PowerShell. You can also use a prefix to identify the cmdlets from the different sets.
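A prefixed import might look like this. It’s a sketch of the older remote PowerShell approach: the connection URI is the standard compliance endpoint of the time, and the CC prefix is an arbitrary choice:

```powershell
# Connect to the compliance endpoint and import the cmdlets with a prefix
$Session = New-PSSession -ConfigurationName Microsoft.Exchange `
   -ConnectionUri "https://ps.compliance.protection.outlook.com/powershell-liveid/" `
   -Credential (Get-Credential) -Authentication Basic -AllowRedirection
Import-PSSession $Session -Prefix CC -AllowClobber

# Compliance cmdlets now carry the prefix, e.g. Get-CCRetentionCompliancePolicy,
# so they can coexist with an Exchange Online session in the same window
```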
Version 16.0.19927.0 of the SharePoint Online PowerShell module supports some new functionality with Conditional Access policies. Normally, updating this module is a matter of downloading the latest version from Microsoft’s site and installing it on a workstation.
In this case, my PC had version 16.0.19418.12000 installed and after the update, I was puzzled that PowerShell continued to load that version. I blamed a bad download, so I downloaded and installed the new module again. Version 19418.12000 persisted. And persisted.
Even a cycle of removing the module, rebooting the PC, and installing the new module refused to dislodge 19418.12000. Eventually, I discovered that the files for this version were in C:\Program Files\WindowsPowerShell\Modules\Microsoft.Online.SharePoint.PowerShell while those for 16.0.19927.0 were installed into C:\Program Files\SharePoint Online Management Shell. After I deleted the older files, PowerShell picked up the new version.
This is obviously not the way that things should work. Microsoft is investigating… In the meantime, I’m chalking these problems down to yet another event along my rich voyage among PowerShell modules, just like the issue I had with OneDrive’s known folders and the Active Directory module.
]]>A recent question asked how to use the SharePoint Online PnP PowerShell module to extract the version history of a document. The PnP (Patterns and Practices) module contains cmdlets to handle complex SharePoint provisioning and management scenarios. If you get to know PnP, you probably like it because it can handle actions from update a SharePoint document to create a new folder. However, the nature of PnP is that its interaction with objects is more complicated than other PowerShell modules.
The usual reason why people want to look at the version history for a document is to know who made a change to its content. Given how autosave captures document updates, the number of versions available for a document stored in SharePoint Online or OneDrive for Business can be large (Figure 1).
If you’re not used to PnP, you might find it easier to extract information about events to update a SharePoint document from the Office 365 audit log. Every time a document is uploaded or updated in a SharePoint Online or OneDrive for Business document library, SharePoint creates an audit event that is later ingested into the Office 365 audit log (the event should be available about 15 minutes after the update). If we know the name of a document, it’s easy to search the audit log with the Search-UnifiedAuditLog cmdlet and find its audit records.
The PowerShell script below uses the $FileName variable to hold the name of the document to search for. If events occurred for this document over the last 90 days, the search should find events to record the initial upload of the document to the library (FileUploaded) and subsequent updates (FileModified) and views (FileAccessed). If the AutoSave feature is enabled for the document, multiple update records can accumulate over a short period. As is normal with audit records, a lot of interesting information is found in the AuditData property.
$FileName = (Read-Host "Enter file name to search")
$Records = (Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-90) -EndDate (Get-Date).AddDays(+1) -Operations FileModified, FileAccessed, FileUploaded -ObjectIds $FileName -ResultSize 1000)
If ($Records.Count -eq 0) {
   Write-Host "No audit records found for file names beginning with" $FileName }
Else {
   Write-Host "Processing" $Records.Count "audit records..."
   $Report = [System.Collections.Generic.List[Object]]::new()
   ForEach ($Rec in $Records) {
      $AuditData = ConvertFrom-Json $Rec.Auditdata
      $ReportLine = [PSCustomObject]@{
         TimeStamp = $Rec.CreationDate
         User      = $AuditData.UserId
         Action    = $AuditData.Operation
         SiteUrl   = $AuditData.SiteUrl
         Site      = $AuditData.SourceRelativeUrl
         File      = $AuditData.SourceFileName
         IpAddress = $AuditData.ClientIP
         App       = $AuditData.UserAgent }
      $Report.Add($ReportLine) }
}
After analyzing the audit records, we can list the set of actions found for the document:
$Report | Select Timestamp, User, Action

TimeStamp           User                             Action
---------           ----                             ------
22 Apr 2020 14:40:41 Jane.Maloney@office365itpros.com FileModified
21 Apr 2020 15:19:03 Jane.Maloney@office365itpros.com FileModified
21 Apr 2020 15:02:34 Kim.Akers@office365itpros.com    FileModified
21 Apr 2020 15:01:39 Jane.Maloney@office365itpros.com FileUploaded
To distribute the report, you could simply print it or create a CSV file. Other distribution methods include:
If your accounts have Office 365 E5 or Microsoft 365 E5 compliance licenses, audit records are available for 365 days. However, 90 days is usually enough to find out who made a change to an important document. Unless the change was overlooked and has only just been noticed!
Practical information about using PowerShell to solve common Office 365 administrative problems is a hallmark of the Office 365 for IT Pros eBook. Subscribe today and learn from our experience!
]]>Following an article I wrote about using PowerShell to report SharePoint Online site storage usage, a reader asked if it is possible to create such a report without needing to sign in with a SharePoint Online administrator account. They’d like non-admins to be able to report data but don’t want them to run PowerShell scripts or access the admin center.
The Global Reader role allows non-privileged access to reporting data exposed in some interfaces like the Microsoft 365 Admin Center. The reports section of the admin center (Figure 1) includes reports on SharePoint user and site activity.
The Admin Center includes an export option to download data as a CSV file. If you select the site usage report, this seems promising, until you realize that the report includes redirect sites and even the SharePoint Online Tenant Fundamental Site created when the tenant is initialized. Some of the output fields could be better formatted, too.
The usage reports available in the Microsoft 365 Admin Center use the Graph reports API to generate their data. It’s very easy to write a quick and dirty PowerShell script to call the SharePoint Site Storage Usage API to return the same data as you see in the usage reports. Once the data is downloaded, you can manipulate it to meet your needs.
To gain access to the Graph, PowerShell needs to use an app to authenticate with the Reports.Read.All and Sites.Read.All permissions. The basic idea is that you register an app for your tenant to use with PowerShell and generate an app secret to use for OAuth authentication. You then assign the necessary Graph permissions to the app. The request to add the permissions to the app must be approved by an administrator before the permissions can be used. You can then use the app identifier and secret to generate an access token for the Graph which contains the permission. Read this post for detailed steps of creating such an app for PowerShell to use.
Update: The current version of the script has been updated to use the Microsoft Graph PowerShell SDK. This removes the requirement to use an app. The information described below shows how to use an app to authenticate and make Graph API requests. The same results are obtainable using the Graph SDK.
After the app is authorized with the necessary permissions, we can use it with PowerShell. This snippet:
$AppId = "e716b32c-0edb-48be-9385-30a9cfd96155"
$TenantId = "c662313f-14fc-43a2-9a7a-d2e27f4f3478"
$AppSecret = 's_rkvIn1oZ1cNceUBvJ2or1lrrIsb*:='
# Build the request to get the OAuth 2.0 access token
$uri = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
$body = @{
   client_id     = $AppId
   scope         = "https://graph.microsoft.com/.default"
   client_secret = $AppSecret
   grant_type    = "client_credentials"}
# Request token
$tokenRequest = Invoke-WebRequest -Method Post -Uri $uri -ContentType "application/x-www-form-urlencoded" -Body $body -UseBasicParsing
# Unpack Access Token
$token = ($tokenRequest.Content | ConvertFrom-Json).access_token
$headers = @{Authorization = "Bearer $token"}
$ctype = "application/json"
With a token, we can issue the Graph request to fetch the SharePoint Online storage usage data:
# Get SharePoint files usage data
$SPOFilesReportsURI = "https://graph.microsoft.com/v1.0/reports/getSharePointSiteUsageDetail(period='D7')"
$Sites = (Invoke-RestMethod -Uri $SPOFilesReportsURI -Headers $Headers -Method Get -ContentType "application/json") -Replace "", "" | ConvertFrom-Csv
All that remains to be done is to parse the returned data and generate a report (a CSV file). You can download the script from GitHub. As always, the code is bare-bones and doesn’t include much in terms of error checking.
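As a sketch of the parsing step, something like this tidies a few fields before writing the CSV file. The column names follow the headers used in the Graph usage report output; treat them as assumptions to verify against the data your tenant returns:

```powershell
# Build a cleaner report from the raw usage data returned by the Graph
$UsageReport = $Sites | Where-Object {$_.'Is Deleted' -eq 'False'} | ForEach-Object {
   [PSCustomObject]@{
      URL          = $_.'Site URL'
      FileCount    = $_.'File Count'
      StorageGB    = [math]::Round($_.'Storage Used (Byte)'/1GB, 2)
      LastActivity = $_.'Last Activity Date' }
}
$UsageReport | Export-Csv -NoTypeInformation -Path C:\Temp\SPOSiteUsage.csv
```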
I also output the report data to Out-GridView (Figure 2) as it’s the easiest way to browse the information.
The big advantage of this approach is that no dependency exists on cmdlets in the PowerShell module for SharePoint Online or an administrator account. All the code is basic PowerShell that can be run by any user.
Because this approach uses data fetched from the Graph, the code is fast too – much faster than the version based on the SharePoint Online cmdlets, and the speed advantage becomes larger as the number of sites grows. This is because the Graph generates the report data and has it ready for fetching while the other approach requires you to generate the data for each site with the Get-SPOSite cmdlet. On the other hand, the Graph data is at least two days old, something that might not be too much of a concern when reviewing storage usage.
The downside is that the Graph usage data includes a limited set of properties. Some useful properties it does return, like file counts, active file counts, and page views, aren’t available from the Get-SPOSite cmdlet, while Get-SPOSite returns information like the site title, group identifier (to get a list of site owners), and sensitivity label among others.
Combining data fetched from the Graph with that fetched by Get-SPOSite is the best of both worlds, even if you’ll need to use a SharePoint administrator account. The question is what data are needed. If you really need the extended information about a site, you’ll have to use the SharePoint Online module. But if all you need is simple storage data, the Graph can provide that information quickly, albeit if it’s slightly out-of-date.
]]>The OneDrive Known Folder Move feature has been around for a couple of years. Basically, this allows you to redirect common (well-known) folders from your PC to OneDrive so that anything created in Documents, Pictures, and the desktop is automatically saved in your OneDrive for Business account. Generally, everything works well, and I have been very happy.
Except until the time came to update the Azure Active Directory preview module from 2.0.2.77 to 2.0.2.89.
I followed my normal routine of upgrading the module from the PowerShell Gallery, but things didn’t work. And no combination of removing and reinstalling modules worked either, despite setting a required version for the Install-Module cmdlet. Each time I started PowerShell and connected to Azure Active Directory, version 2.0.2.77 was used.
Eventually I discovered that the 2.0.2.77 files were installed in OneDrive by examining the module properties:
> Get-Module -Name AzureADPreview | Format-List

Name : AzureADPreview
Path : C:\Redmond\OneDrive – Office365ITPros\Documents\WindowsPowerShell\Modules\AzureADPreview\2.0.2.77\Microsoft.Open.AzureAD16.Graph.PowerShell.dll
My speculation is that PowerShell installed the 2.0.2.77 files in OneDrive the last time I updated the module.
To clean up the mess, I uninstalled the module and then deleted all the files from OneDrive. A retention label stopped OneDrive deleting the files, so it was a matter of removing the retention label and then deleting the files and folders.
I then reinstalled the module, making sure to select the correct version and to install the module for everyone who uses the PC.
Install-Module AzureADPreview -RequiredVersion "2.0.2.89" -Scope AllUsers
After the installation, the module files are in:
Get-Module -Name AzureADPreview | fl

Name : AzureADPreview
Path : C:\Program Files\WindowsPowerShell\Modules\AzureADPreview\2.0.2.89\Microsoft.Open.AzureAD16.Graph.PowerShell.dll
The next time I started a PowerShell session and ran the Connect-AzureAD cmdlet, I got the right version.
All of which goes to prove that you should pay attention to how you install PowerShell modules, just in case the files end up in OneDrive. PowerShell works when modules are installed to OneDrive, but upgrades become a little more interesting.
]]>We live in fast-changing times. The results of the Covid-19 pandemic are being felt in many ways. Many people are working from home, conferences are being rescheduled until next year or going virtual, and Microsoft is being forced to reschedule planned developments in Office 365. Some things, like increasing the membership limit for Teams to 10,000 are being accelerated. Others, like the plan to remove basic authentication for five Exchange Online connection protocols, are being pushed out.
Basic authentication is bad for Exchange Online because it is a vulnerability often used as an attack vector. I strongly supported the original plan to remove basic authentication for ActiveSync, PowerShell, Exchange Web Services, and especially POP3 and IMAP4 in October 2020.
All the signs from Microsoft were that the Exchange product group wanted to make this happen and would hold the line. But pandemics have a funny habit of changing things, and so the product group has been forced to postpone removing basic authentication for the famous five protocols until some time in the second half of 2021.
The lack of a definite target date is because no one knows when the world will resume normal working. No doubt Microsoft wants to set a date that’s sooner rather than later, but for now July 1, 2021 is a good target date for planning.
In the meantime, new Office 365 tenants won’t get the chance to develop a bad habit because Microsoft is disabling basic authentication for the five protocols by default in those tenants. And in October 2020, they’ll get some satisfaction by disabling the protocols in tenants with no recorded use of basic authentication (in other words, Microsoft’s telemetry only records connections using modern authentication in the tenant).
Updated 30 April 2020:
Microsoft is rolling out OAuth 2.0 support for SMTP AUTH and IMAP4 to allow developers to upgrade clients that use these now ancient (but beloved in parts) protocols. Support for POP3 is also in the works.
OAuth support is especially important for SMTP AUTH connections (used by applications and appliances to send email via Exchange Online). Although I can see how programmers will update POP3 and IMAP4 email clients to keep them working with Exchange Online, I have a harder time imagining how device manufacturers will update all the multi-function devices that send email like job completion notifications, which is why Microsoft is holding to the line that they don’t plan to disable SMTP AUTH (for now).
Remote PowerShell will also be updated, and anyone using PowerShell to work with Exchange Online today is advised to start using the Exchange Online Management module, which supports MFA and OAuth. More work is needed to allow PowerShell scripts to run in unattended mode. That’s expected to appear in a future update for the module quite soon.
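For anyone making the switch, connecting with the Exchange Online Management module is straightforward. A minimal sketch (the administrator UPN is a placeholder):

```powershell
# Install the module once (from an elevated session), then connect.
# Connect-ExchangeOnline uses modern authentication and supports MFA.
Install-Module -Name ExchangeOnlineManagement -Scope AllUsers

# Placeholder account - use an administrator UPN from your own tenant
Connect-ExchangeOnline -UserPrincipalName admin@office365itpros.com
```

Once connected, both the older remoted cmdlets and the newer REST-based Get-Exo* cmdlets are available in the session.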
The funny thing is that I was sure in my own mind that this would not happen and said so quite passionately when Paul Robichaux and I recorded episode 18 of our Office 365 Exposed podcast last night. I’ll have to see if Paul can edit those words out as he tweaks the recording for release…
Tracking dates is hard, especially inside an environment like Office 365 that changes all the time. Subscribe to the Office 365 for IT Pros eBook and let us do the heavy lifting of date checking.
]]>Shortly after publishing Reporting Exchange Online Mailbox Permissions, I received two notes. One was from Vasil Michev to say that I should have used the REST-based cmdlets from the Exchange Online Management module. The other was a reply to my note in the Microsoft Technical Community to point me to some script snippets covering “the other” mailbox-level permissions that you might assign over mailboxes. These are the Send As and the Send on Behalf Of permissions.
The snippets were just that: bits of code. These bits are valuable because the nature of PowerShell and the way the community works is that you can always (try to) improve what’s gone before. As it happens, I found much the same code in examples in my Exchange Inside Out 2010 book (still available from Amazon). In any case, the point was that knowing all about FullAccess permissions assigned to users is all very well, but to get a full perspective of the permissions set on mailboxes, you should include details of the sending permissions as well.
It would be nice if Exchange returned all mailbox permissions with a single cmdlet, but three are needed: Get-ExoMailboxPermission (FullAccess permissions), Get-ExoRecipientPermission (Send As permissions), and Get-ExoMailbox (to read the GrantSendOnBehalfTo property for Send on Behalf Of permissions).
The script uses the REST-based cmdlets but it’s easy to convert the calls to use the older Remote PowerShell cmdlets if you prefer.
The reasons why three cmdlets are needed are hidden in the mists of time and go back to the first implementation of PowerShell in Exchange 2007. The situation is unlikely to change now.
The script is shown below. It’s a modified version of the previous script and you’ll need to connect to the Exchange Online Management module with an administrator account to run it. You can also download a copy from GitHub.
# ReportMailboxSendPermissionsMailboxes.PS1
# Quick and simple script to generate a report of non-standard permissions applied to Exchange Online user and shared mailboxes
# Needs to be connected to Exchange Online PowerShell with an administrative account to run
# V1.0 16-Mar-2020
# https://github.com/12Knocksinna/Office365itpros/blob/master/ReportMailboxSendPermissionsMailboxes.PS1
CLS
Write-Host "Fetching mailboxes"
$Mbx = Get-ExoMailbox -RecipientTypeDetails UserMailbox, SharedMailbox -ResultSize Unlimited -PropertySet Delivery -Properties RecipientTypeDetails, DisplayName | Select DisplayName, UserPrincipalName, RecipientTypeDetails, GrantSendOnBehalfTo
If ($Mbx.Count -eq 0) { Write-Error "No mailboxes found. Script exiting..." -ErrorAction Stop }
CLS
$Report = [System.Collections.Generic.List[Object]]::new() # Create output file
$ProgressDelta = 100/($Mbx.count); $PercentComplete = 0; $MbxNumber = 0
ForEach ($M in $Mbx) {
    $MbxNumber++
    $MbxStatus = $M.DisplayName + " ["+ $MbxNumber +"/" + $Mbx.Count + "]"
    Write-Progress -Activity "Checking permissions for mailbox" -Status $MbxStatus -PercentComplete $PercentComplete
    $PercentComplete += $ProgressDelta
    # Grab information about SendAs permission and output it into the report
    $Permissions = Get-ExoRecipientPermission -Identity $M.UserPrincipalName | ? {$_.Trustee -ne "NT AUTHORITY\SELF"}
    If ($Null -ne $Permissions) {
       ForEach ($Permission in $Permissions) {
          $ReportLine = [PSCustomObject] @{
             Mailbox     = $M.DisplayName
             UPN         = $M.UserPrincipalName
             Permission  = $Permission | Select -ExpandProperty AccessRights
             AssignedTo  = $Permission.Trustee
             MailboxType = $M.RecipientTypeDetails }
          $Report.Add($ReportLine) }}
    # Grab information about FullAccess permissions
    $Permissions = Get-ExoMailboxPermission -Identity $M.UserPrincipalName | ? {$_.User -Like "*@*" }
    If ($Null -ne $Permissions) {
       # Grab each permission and output it into the report
       ForEach ($Permission in $Permissions) {
          $ReportLine = [PSCustomObject] @{
             Mailbox     = $M.DisplayName
             UPN         = $M.UserPrincipalName
             Permission  = $Permission | Select -ExpandProperty AccessRights
             AssignedTo  = $Permission.User
             MailboxType = $M.RecipientTypeDetails }
          $Report.Add($ReportLine) }}
    # Check if this mailbox has granted Send on Behalf of permission to anyone
    If (![string]::IsNullOrEmpty($M.GrantSendOnBehalfTo)) {
       ForEach ($Permission in $M.GrantSendOnBehalfTo) {
          $ReportLine = [PSCustomObject] @{
             Mailbox     = $M.DisplayName
             UPN         = $M.UserPrincipalName
             Permission  = "Send on Behalf Of"
             AssignedTo  = (Get-ExoRecipient -Identity $Permission).PrimarySmtpAddress
             MailboxType = $M.RecipientTypeDetails }
          $Report.Add($ReportLine) }}
}
$Report | Sort -Property @{Expression = {$_.MailboxType}; Ascending= $False}, Mailbox | Export-CSV c:\temp\MailboxAccessPermissions.csv -NoTypeInformation
Write-Host "All done." $Mbx.Count "mailboxes scanned. Report of send permissions available in c:\temp\MailboxAccessPermissions.csv"
The output is a CSV file sorted by mailbox type (user mailboxes then shared mailboxes) and mailbox name. You can also pipe the output to Out-GridView (Figure 2) to quickly sort and review the results.
The call to Get-ExoMailbox is a good example of how you need to pay attention to upgrading scripts from the older Get-Mailbox cmdlet. Get-ExoMailbox speeds access to data fetched from Exchange Online by forcing coders to specify the properties that they need to process. In this case, we need the Delivery property set (to access the GrantSendOnBehalfTo property) as well as the DisplayName and RecipientTypeDetails properties, which are specified individually.
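The difference is easy to see by comparing calls with and without the property set (the mailbox identity below is a placeholder):

```powershell
# Kim.Akers is a placeholder mailbox identity.
# With the default (minimal) property set, GrantSendOnBehalfTo is not
# populated, so filtering or reporting on it silently returns nothing:
Get-ExoMailbox -Identity Kim.Akers | Select DisplayName, GrantSendOnBehalfTo

# Requesting the Delivery property set (or the individual property via
# -Properties) makes the value available:
Get-ExoMailbox -Identity Kim.Akers -PropertySet Delivery | Select DisplayName, GrantSendOnBehalfTo
```

Forgetting to request the right property set is probably the most common mistake when converting scripts from Get-Mailbox to Get-ExoMailbox.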
As always, feel free to customize the script code to your heart’s content. Happy scripting!
Exchange Online is a well-known product at this point. Even so, a new development can throw up something that you don’t know about, just like the property sets used by the EXO cmdlets. Stay current by subscribing to the Office 365 for IT Pros eBook and let us do the heavy lifting of staying updated.
]]>I love receiving suggestions from readers. After posting my note about Microsoft’s epic failure to communicate the real facts about mailbox auditing by default, Ofir Doron pointed out a good way to create a list of Office 365 E3 accounts that could then be used to enable mailbox auditing. You still need to take care of shared mailboxes, so I’ve updated the script to process both user and shared mailboxes.
The basis of the suggestion is to use the GUID of the Office 365 E3 license to identify accounts to check. The GUID is found by running the Get-AzureADSubscribedSku cmdlet from the Azure Active Directory module to return details of the SKUs (stock control units) available in the tenant. The EnterprisePack SKU in the list is Office 365, so the SKU identifier we need to use is 6fd2c87f-b296-42f0-b197-1e91e994b900. This value seems to be the same for all tenants.
Get-AzureADSubscribedSku | Select Sku*, ConsumedUnits

SkuId                                SkuPartNumber                ConsumedUnits
-----                                -------------                -------------
1f2f344a-700d-42c9-9427-5cea1d5d7ba6 STREAM                                   6
b05e124f-c7cc-45a0-a6aa-8cf78c946968 EMSPREMIUM                               5
6fd2c87f-b296-42f0-b197-1e91e994b900 ENTERPRISEPACK                          25
f30db892-07e9-47e9-837c-80727f46fd3d FLOW_FREE                                3
a403ebcc-fae0-4ca2-8c8c-7a907fd6c235 POWER_BI_STANDARD                        6
26d45bd9-adf1-46cd-a9e1-51e9a5524128 ENTERPRISEPREMIUM_NOPSTNCONF             5
90d8b3f8-712e-4f7b-aa1e-62e7ae6cbe96 SMB_APPS                                 3
8c4ce438-32a7-4ac5-91a6-e22ae08d9c8b RIGHTSMANAGEMENT_ADHOC                   4
The script in the post about mailbox auditing uses this command to fetch a set of mailboxes for Office 365 E3 accounts:
$Office365E3 = "6fd2c87f-b296-42f0-b197-1e91e994b900"
$Mbx = Get-AzureADUser -All $True | ? {$_.AssignedLicenses -Match $Office365E3}
We then process those mailboxes to make sure that they are enabled for mailbox auditing.
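That processing step can be sketched as follows (assuming $Mbx holds the accounts returned above and a connected Exchange Online session):

```powershell
# Sketch: ensure each licensed account's mailbox has auditing enabled.
# Assumes $Mbx holds the Azure AD accounts fetched with Get-AzureADUser
# and that an Exchange Online PowerShell session is connected.
ForEach ($M in $Mbx) {
   If ((Get-Mailbox -Identity $M.UserPrincipalName).AuditEnabled -eq $False) {
      Write-Host "Enabling mailbox auditing for" $M.UserPrincipalName
      Set-Mailbox -Identity $M.UserPrincipalName -AuditEnabled $True }
}
```

The same loop works for shared mailboxes once they are added to the set of mailboxes to check.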
Moving on from mailbox auditing, a delight of PowerShell is that you can repurpose some new knowledge to solve other problems. In this case, I want to create a report of license assignments in the tenant. In other words, which accounts have been assigned the licenses returned by Get-AzureADSubscribedSku.
This script loops through all the license SKUs to find the accounts assigned each SKU. We put things together in a PowerShell list object and output it via Out-GridView (Figure 1). You need to be connected to Azure Active Directory before running the script.
$Report = [System.Collections.Generic.List[Object]]::new() # Create output file
$Skus = Get-AzureADSubscribedSku | Select Sku*, ConsumedUnits
ForEach ($Sku in $Skus) {
   Write-Host "Processing license holders for" $Sku.SkuPartNumber
   $SkuUsers = Get-AzureADUser -All $True | ? {$_.AssignedLicenses -Match $Sku.SkuId}
   ForEach ($User in $SkuUsers) {
      $ReportLine = [PSCustomObject] @{
         User       = $User.DisplayName
         UPN        = $User.UserPrincipalName
         Department = $User.Department
         Country    = $User.Country
         SKU        = $Sku.SkuId
         SKUName    = $Sku.SkuPartNumber}
      $Report.Add($ReportLine) }}
$Report | Sort-Object User | Out-GridView
Simple, easy, and a great example of how a little PowerShell can go a long way. And because the code is PowerShell and available to all, you can take it and amend it to match your needs.
Important Update: Microsoft will deprecate the Azure AD and MSOL PowerShell modules in June 2023. After Microsoft 365 moves to a new license management platform, scripts that use the license assignment cmdlets from the Azure AD and MSOL modules will cease working after March 31, 2023. With this in mind, you should update any scripts that use these modules for automation tasks, including license management, to use cmdlets from the Microsoft Graph PowerShell SDK instead. Fortunately, I’ve written a Practical365.com article to explain how to create a licensing report with SDK cmdlets.
Need more information about how to manage Office 365 licenses with PowerShell? Look no further than the Office 365 for IT Pros eBook. It’s packed full of examples.
]]>One of the recommendations made in the Office 365 for IT Pros eBook is that tenant administrators should conduct periodic reviews of permissions assigned to mailboxes to ensure that the right people (other than the mailbox owners) have access, perhaps by creating an Exchange Online mailbox permissions report. A recent request in the Microsoft Technical Community prompted me to look at the situation again to make sure that our advice was still accurate (it is).
I responded to the original question with some quick and dirty PowerShell but decided that a better job could be done. If you use the Get-MailboxPermission cmdlet to examine permissions on an Exchange Online mailbox, several types exist:
For the purpose of this exercise we don’t care about these permissions because they exist on all mailboxes. What we’re looking for are delegated permissions used to grant non-owner accounts access to the mailbox. Vasil Michev, our esteemed technical editor, has a script in the TechNet Gallery to report non-standard permissions, but there’s always room for another PowerShell answer to a problem.
My script (the full version of the Exchange Online mailbox permissions report is available on GitHub) selects user and shared mailboxes (those most likely to have extra permissions). For each mailbox, we extract the permissions and look for those assigned to other Office 365 accounts. We store details of these permissions into a list that is written out to a CSV file after all mailboxes are processed. Here’s the basic idea:
# Quick and simple script to generate a report of non-standard permissions applied to Exchange Online user and shared mailboxes
# Needs to be connected to Exchange Online PowerShell with an administrative account to run
CLS
Write-Host "Fetching mailboxes"
[array]$Mbx = Get-Mailbox -RecipientTypeDetails UserMailbox, SharedMailbox -ResultSize Unlimited | Select DisplayName, UserPrincipalName, RecipientTypeDetails
If ($Mbx.Count -eq 0) { Write-Error "No mailboxes found. Script exiting..." -ErrorAction Stop }
# We have some mailboxes, so we can process them...
CLS
$Report = [System.Collections.Generic.List[Object]]::new() # Create output file
$ProgressDelta = 100/($Mbx.count); $PercentComplete = 0; $MbxNumber = 0
ForEach ($M in $Mbx) {
    $MbxNumber++
    $MbxStatus = $M.DisplayName + " ["+ $MbxNumber +"/" + $Mbx.Count + "]"
    Write-Progress -Activity "Processing mailbox" -Status $MbxStatus -PercentComplete $PercentComplete
    $PercentComplete += $ProgressDelta
    $Permissions = Get-MailboxPermission -Identity $M.UserPrincipalName | ? {$_.User -Like "*@*" }
    If ($Null -ne $Permissions) {
       # Grab each permission and output it into the report
       ForEach ($Permission in $Permissions) {
          $ReportLine = [PSCustomObject] @{
             Mailbox     = $M.DisplayName
             UPN         = $M.UserPrincipalName
             Permission  = $Permission | Select -ExpandProperty AccessRights
             AssignedTo  = $Permission.User
             MailboxType = $M.RecipientTypeDetails }
          $Report.Add($ReportLine) } }
}
$Report | Sort -Property @{Expression = {$_.MailboxType}; Ascending= $False}, Mailbox | Export-CSV c:\temp\MailboxPermissions.csv -NoTypeInformation
Write-Host "All done." $Mbx.Count "mailboxes scanned. Report of non-standard permissions available in c:\temp\MailboxPermissions.csv"
The CSV file is sorted by user mailbox and then shared mailbox (you must use a calculated expression to sort by multiple properties when the first property is sorted in descending order).
Note: The full Exchange Online mailbox permissions report script uses the REST-based cmdlets in the Exchange Online Management module.
As you can see from Figure 1, the Exchange Online mailbox permissions report details FullAccess and SendAs permissions assigned to mailboxes. The fact that these permissions exist isn’t an issue by itself as the permissions are usually well-justified. For instance, FullAccess permission is needed by delegates to have full control over a shared or user mailbox (as in the case of Outlook Mobile delegation). However, it’s important to review each assignment to understand if it is still valid and necessary. If not, the permission should be removed.
The Exchange Online mailbox permissions report doesn’t include folder-level permissions assigned by Outlook. These permissions can be reviewed with the Get-MailboxFolderPermission cmdlet. To find all such permissions for a mailbox, you would need to run Get-MailboxFolderStatistics to generate a list of mailbox folders and then check each folder to see if any permissions exist. I’ll cover how to do this in a future post.
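A rough sketch of that approach for a single mailbox is shown below (the mailbox identity is a placeholder; the folder path format, which joins the mailbox identity and the folder path with a colon and backslashes, is the one Get-MailboxFolderPermission expects):

```powershell
# Sketch: find non-default folder-level permissions for one mailbox.
# The mailbox identity is a placeholder - substitute a real one.
$MbxId = "Kim.Akers@office365itpros.com"
$Folders = Get-MailboxFolderStatistics -Identity $MbxId |
   Where-Object {$_.FolderType -in "User Created", "Inbox", "Calendar"}
ForEach ($F in $Folders) {
   # Build the folder identity, e.g. user@domain:\Inbox\Projects
   $FolderId = $MbxId + ":" + $F.FolderPath.Replace("/","\")
   Get-MailboxFolderPermission -Identity $FolderId -ErrorAction SilentlyContinue |
      Where-Object {$_.User.DisplayName -notin "Default", "Anonymous"} |
      Format-Table @{n="Folder"; e={$F.FolderPath}}, User, AccessRights
}
```

Because every folder needs a separate cmdlet call, running this across all mailboxes is slow, which is one reason the check is worth doing selectively rather than tenant-wide.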
For many more examples of using PowerShell to manage Exchange Online and other Office 365 components, subscribe to the Office 365 for IT Pros eBook and find some hidden jewels.
]]>In August 2019, Microsoft started to roll out the SharePoint Online site swap feature. The functionality swaps out the tenant’s root site and replaces it with another site, usually a communications site. When a root site swap happens, the old root site is archived and remains available for administrative access while user traffic is directed to the site that’s swapped in.
The root site is automatically provisioned for all SharePoint Online tenants and has a URL of https://tenantname.sharepoint.com/, so it’s something like https://office365itpros.sharepoint.com/. The root site is typically the starting point for a company’s intranet, so it’s important that the site works well.
Originally sites swaps were only possible by using the Invoke-SPOSiteSwap cmdlet. In November 2019, Microsoft updated the SharePoint Admin Center to include the Replace site option, which is only exposed when the current root site is selected (Figure 1).
With just a few details to worry about, like choosing the right type of site to become the new root (it can’t be connected to an Office 365 group), the technology worked well. However, Microsoft restricted site swaps to Office 365 tenants with fewer than 1,000 seats. The reason for the restriction is that Microsoft wanted to be sure that everything about site swaps worked perfectly. After all, if a tenant loses access to its root site because of a bug, it will affect a lot of functionality.
Gradually Microsoft eased back the restriction to make site swap available to more tenants until they reached the 10,000 seat level and then halted. As announced in Office 365 notification MC204488 on February 22, they’re now ready to let the largest tenants go ahead and swap root sites.
Large tenants often have the same kind of SharePoint activity as small tenants do; the difference is that the traffic generated by large tenants tends to expose any flaw in a process. For this reason, it’s important to do some up-front planning to make sure that the replacement root site is ready before it is swapped in.
Microsoft recommends that administrators use the Page Diagnostics tool for SharePoint Online to check replacement root sites before proceeding with a swap. This tool is an add-in for Chrome or Edge that analyzes page components to identify potential issues. For instance, if some graphics used by the page are large files, they might slow page loading. This is a bigger issue for larger sites because the higher traffic volume will accentuate the effect of the larger files.
The page diagnostics tool reports warnings and errors. It’s up to administrators if they want to heed the warnings before proceeding (an automated checker might miss something a human knows, or humans just know best), but they can’t go ahead with a site swap if errors exist. Those errors must be fixed before a site swap is possible.
When everything is ready, you can run the Invoke-SPOSiteSwap cmdlet (support in the SharePoint Admin Center for site swaps in large tenants is coming). You must update the SharePoint Online PowerShell module to version 16.0.19807.1200 or higher to be able to execute a site swap in a large tenant. The easiest way to do this is by updating the module from the PowerShell Gallery:
Update-Module Microsoft.Online.Sharepoint.PowerShell -Force
The upgraded version of the cmdlet includes an integrated page diagnostic check for errors and warnings plus a new Force parameter to allow administrators to override warnings (but never errors). To perform a site swap, the command format is:
Invoke-SPOSiteSwap `
  -SourceURL https://office365itpros.sharepoint.com/sites/NewMarketingComms `
  -TargetURL https://office365itpros.sharepoint.com `
  -ArchiveURL https://office365itpros.sharepoint.com/sites/OldMarketingComms -Force
As with anything in large organizations, it’s usual to plan operations like site swaps well ahead of time (so there’s no reason not to run page diagnostics) and to schedule the event for a period of low user activity, like a weekend. This avoids user issues like 404 errors while the page swap is in flight.
Happy swapping!
Good SharePoint Online management is essential to the overall health of an Office 365 tenant. The Office 365 for IT Pros eBook reflects this and includes a ton of interesting and useful advice about how to work with SharePoint Online.
]]>Microsoft has updated the REST-based Exchange Online Management PowerShell module. This is an important release that you should upgrade to as soon as possible. To upgrade, run the following command from a PowerShell session with administrative permissions:
# Upgrade to the latest version of the Exchange Online Management module
Update-Module ExchangeOnlineManagement -RequiredVersion 0.3582.0
After the upgrade completes, check that you’re using the right module with:
Get-Module | ? {$_.Name -eq "ExchangeOnlineManagement"} | Select Name, Version

Name                     Version
----                     -------
ExchangeOnlineManagement 0.3582.0
Version 0.3582.0 is the latest release of the Exchange Online Management module available at the time of writing. It might be newer when you upgrade. As you can see by the zero used for the major version number, this module is still a preview release.
The developers have included some release notes with the module:
# Access release notes for the Exchange Online Management module
(Get-Module ExchangeOnlineManagement).ReleaseNotes
Apart from several security and other bug fixes in the new module, the new module contains four important updates:
As always, it’s a good idea to check for module updates periodically to ensure that you don’t miss out on bug fixes. The Exchange Online Management module is still in preview, so more bugs are expected, and change will come faster.
Because these cmdlets deliver great performance and reliability benefits when dealing with very large sets of Exchange objects, some are looking for a way to run them in a secure manner in background processes. I know that this is high on the list of what the engineers want to do in the near future, so that’s another good reason to keep checking for updates.
The Office 365 for IT Pros eBook includes hundreds of practical examples of PowerShell in use to manage Exchange and other workloads. Upgrade your knowledge with the best Office 365 book on the market.
]]>The SharePoint Online Management Shell is a Windows PowerShell module designed for command-line operations and inclusion in PowerShell scripts. The module makes it possible to perform batch processing for tasks like reports and is the only way to achieve some management tasks in SharePoint and OneDrive.
Like with many other cloud components, Microsoft updates the SharePoint Online Management Shell almost every month to align with the release cadence of the SharePoint Client-Side Object Model (CSOM) API libraries. The updates include new cmdlets, new parameters for cmdlets, and other tweaks. If you use PowerShell to work with SharePoint Online, it’s important that you use the latest module.
You can download an MSI (installable package) for the latest SharePoint Online Management Shell module. Once downloaded, you run the executable to install the module, remembering to uninstall any previous version first. The MSI version is the traditional method to distribute updated modules, but since Microsoft released the Microsoft.Online.SharePoint.PowerShell module in the PowerShell Gallery, it’s more convenient to install it from there.
To install the SharePoint Online module, run PowerShell as an administrator and run this command:
Install-Module -Name Microsoft.Online.SharePoint.PowerShell
To update to the latest version, run:
Update-Module -Name Microsoft.Online.SharePoint.PowerShell
The Connect-SPOService cmdlet is used to connect to the SharePoint administration endpoint for a tenant (the same endpoint used by the SharePoint Admin Center). To build the endpoint, take the normal SharePoint root URL for your tenant (like https://office365itpros.sharepoint.com/) and insert “-admin” after the tenant name. For example:
# Connect to SharePoint Online administration endpoint Connect-SPOService -URL "https://office365itpros-admin.sharepoint.com"
The SharePoint Online module is designed for administrative tasks, so you should always connect with an account that has Global Administrator or SharePoint Administrator rights for the tenant.
As part of the connection to the administration endpoint, the SharePoint Online module is loaded into your PowerShell session, and you can check the version of the installed module:
Get-Module | ? {$_.Name -eq "Microsoft.Online.SharePoint.PowerShell"} | Format-Table Name, Version

Name                                   Version
----                                   -------
Microsoft.Online.Sharepoint.PowerShell 16.0.19724.12000
In your scripts, it’s a good idea to include a test to make sure that a connection is available to SharePoint Online before running any other code. Here’s a very simple test for a connection:
Try {
   $TestSPO = Get-SPOTenant }
Catch {
   Write-Host "Error accessing SharePoint Online - please connect to the service before retrying"; break }
To see the available SharePoint Online cmdlets, run:
Get-Command -Module "Microsoft.Online.SharePoint.PowerShell"
The current SharePoint Online Management Shell module includes 179 cmdlets. These cmdlets can be divided into several types, including:
It’s very common to want to retrieve information about the sites in a tenant. To do this, run the Get-SPOSite cmdlet. The Limit parameter specifies that all sites are to be returned.
Get-SPOSite -Limit All
Note that this command returns all types of sites found in the tenant, including redirect sites (created because of site URL renames), hub sites, the app catalog, and sites used by Teams private channels. In most cases, it is best to be more precise when using Get-SPOSite to find sites by specifying the template for the type of sites you want to process. For instance, this command only returns the sites used by Teams private channels.
Get-SPOSite -Limit All -Template "TEAMCHANNEL#0"
Sometimes you need to retrieve information about a SharePoint Online site for use elsewhere inside Office 365. For example, if you want to include a document library belonging to an Office 365 group, team, or team private channel on an eDiscovery case or content search, you need to specify the site’s URL as a search location.
If you use PowerShell to examine the properties of an Office 365 group, you will see three SharePoint Online URLs returned for the site, the document library, and the shared OneNote notebook. The value returned in SharePointSiteUrl property is the one needed when you wish to add a site to content searches, found using the Get-UnifiedGroup PowerShell cmdlet:
Get-UnifiedGroup –Identity "Office 365 for IT Pros" | Format-List Share*Url

SharePointSiteUrl      : https://Office365ITPros.sharepoint.com/sites/O365ITPros
SharePointDocumentsUrl : https://Office365ITPros.sharepoint.com/sites/O365ITPros/Shared Documents
SharePointNotebookUrl  : https://Office365ITPros.sharepoint.com/sites/O365ITPros/SiteAssets/Office 365 for IT Pros Notebook
The URL retrieved from the Office 365 Group can be used with Get-SPOSite to find further information about the site belonging to the group.
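For example (the group name and site URL shown are illustrative):

```powershell
# Fetch the site URL from the group, then pull site details with Get-SPOSite.
# The group name is illustrative - use one from your own tenant.
$SiteUrl = (Get-UnifiedGroup -Identity "Office 365 for IT Pros").SharePointSiteUrl
Get-SPOSite -Identity $SiteUrl | Format-List Title, Url, StorageQuota, Template
```

This needs connections to both Exchange Online (for Get-UnifiedGroup) and the SharePoint administration endpoint (for Get-SPOSite) in the same session.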
It’s also possible to discover what Office 365 Group a site belongs to by using the GroupId property stored for the site. For example:
Get-UnifiedGroup -Identity (Get-SPOSite -id https://office365itpros.sharepoint.com/sites/O365ExchPro).GroupId.Guid | Format-Table DisplayName, SharePointSiteURL

DisplayName            SharePointSiteUrl
-----------            -----------------
Office 365 for IT Pros https://office365itpros.sharepoint.com/sites/O365ITPros
The functionality available through the SharePoint Online PowerShell module is limited and restricted to basic administration tasks performed by a SharePoint Online administrator, such as managing sites and tenant settings. To get extra functionality, use the cmdlets available in the SharePoint PnP PowerShell cmdlets project in GitHub, part of the Patterns & Practices community initiative. To go further and be able to access all the aspects of SharePoint, you will need to use the CSOM API in your PowerShell scripts. To install the PnP PowerShell module, run this command:
Install-Module SharePointPnPPowerShellOnline -Force
Many good examples of using the SharePoint PnP cmdlets are available on the web.
This information is an example of the kind of text you’ll find in the Office 365 for IT Pros eBook. Don’t you think you should be a subscriber?
]]>With Microsoft’s intention to support cloud signatures for Outlook desktop (for Windows), I’ve been working through the challenges of generating and maintaining corporate email signatures for Office 365 users. Previously, I discussed what needs to be done to update the system registry settings for Outlook signatures and explained why the current situation works well for individual users but is a real pain for central management. Today, I want to turn my attention to OWA signatures.
It seems weird that after nine years of Office 365, OWA and Outlook desktop still use different signatures. It’s a pain for many reasons, including duplication of administrator effort to maintain signatures.
This situation might change (at least, I hope so) if Microsoft’s new cloud signatures for Outlook pick up some of the framework that exists to allow administrators update OWA signatures centrally. One thing that won’t go away is the absolute necessity of accurate directory information. If the directory doesn’t hold good data about users, it’s going to be much harder to generate good-looking (and useful) signatures.
OWA stores its signature information as mailbox settings. Two signatures can be defined (plain text and HTML), and mailbox settings determine which is used for new messages and for replies/forwards.
The Set-MailboxMessageConfiguration cmdlet is the core component in OWA signature management. Its important parameters are:
You can ignore the SignatureTextOnMobile, UseDefaultSignatureOnMobile, and AutoAddSignatureOnMobile parameters. They only apply to the old OWA for Devices client and aren’t used by the Outlook Mobile client.
With these parameters in mind, a simple command to manage signatures for a mailbox is:
Set-MailboxMessageConfiguration -Identity James.Ryan -AutoAddSignature $True `
  -AutoAddSignatureOnReply $False -SignatureText "From the desk of James Ryan" `
  -SignatureHTML "<h2>From the desk of James Ryan</h2>"
Users pick up the amended signature the next time they refresh OWA.
We can reuse some of the code in the script to update Outlook signature settings in the system registry to serve the same function for an OWA signature. To generate and apply an individualized corporate signature to multiple mailboxes, we need to fetch the target mailboxes, read each user’s directory properties, build the HTML signature from a template, and then apply it to each mailbox:
Set-MailboxMessageConfiguration -Identity $M.UserPrincipalName `
  -SignatureHTML $SignatureHTML -AutoAddSignature $True `
  -AutoAddSignatureOnReply $False
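Put together, the loop might look like this sketch (the HTML template and the directory properties it uses are illustrative, not the script’s exact layout):

```powershell
# Sketch: build and apply an individualized HTML signature per mailbox.
# The template and the directory properties chosen are illustrative.
$Mbx = Get-ExoMailbox -RecipientTypeDetails UserMailbox -ResultSize Unlimited
ForEach ($M in $Mbx) {
   # Read directory properties for the mailbox owner
   $User = Get-User -Identity $M.UserPrincipalName
   # Build the signature from a simple HTML template
   $SignatureHTML = "<p><b>$($User.DisplayName)</b><br>$($User.Title)<br>" +
                    "$($User.Company)<br>Phone: $($User.Phone)</p>"
   Set-MailboxMessageConfiguration -Identity $M.UserPrincipalName `
     -SignatureHTML $SignatureHTML -AutoAddSignature $True `
     -AutoAddSignatureOnReply $False
}
```

Any missing directory property simply leaves a blank in the signature, which is a good argument for validating the directory data before the run.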
You can download a working script that illustrates the principles of centralized management for OWA signatures from the Office365ITPros GitHub repository.
To complete the solution, you could schedule a monthly run of the script to process mailboxes and update signatures. Perhaps every month the script could be updated to allow corporate PR to insert a new cheery catchphrase (or graphic about the latest corporate initiative) into signatures. Or maybe that’s a bad idea.
Another idea is for the script to create a report of missing directory properties and email the report to administrators when the script finishes to help improve the quality of the information in the directory.
Users can edit the signature created for their mailbox through OWA options. However, if you make the signature attractive enough, they’ll probably leave it alone. There’s no out-of-the-box method for administrators to block the option to update signatures, but you could try doing this with a user role assignment policy to remove user access to the Set-MailboxMessageConfiguration cmdlet.
OWA signatures prove the value of holding user signature information in the cloud. It’s so much simpler when administrators can run a PowerShell script to update signatures across an Office 365 tenant on a periodic basis. This doesn’t mean that the ISV market for autosignature products will go away because those products include a heap of functionality that I haven’t touched upon here. And those products are engineered by people who think about nothing but how to manage email signatures.
However, for those who would like to write and maintain their own signature generation code, it would be nice if Microsoft builds on what exists for OWA to have Outlook use the same signature information held in user mailboxes. And it would be even better if Outlook Mobile joined the party too. That might be too much to ask in the first round.
Worried that you can’t quite get your head around using PowerShell to manage Office 365? Subscribe to the Office 365 for IT Pros eBook and learn from the hundreds of examples in the book.
After finishing my article about Microsoft developing cloud signatures for Outlook, I decided to look at what’s involved in updating an Outlook signature with PowerShell. As it turns out, there are quite a few methods suggested in various blogs and articles, mostly on the theme of how to merge information from Active Directory into signatures (here’s an example).
Most of the scripts I found were old and suffered from one problem or another, like failing to support Office ProPlus (click to run) or not using Azure Active Directory. So I decided to explore the topic by putting together my own version.
As noted in my other article, Outlook for Windows stores information about its settings in the system registry. The first issue was to find out from the registry which Azure Active Directory account is used with Outlook. My solution is to fetch the accounts information and parse out the user principal name. I then use the user principal name to fetch account properties from Azure Active Directory:
$UserAccount = Get-ItemProperty -Path HKCU:\Software\Microsoft\Office\Outlook\Settings -Name Accounts | Select -ExpandProperty Accounts
$UserId = (ConvertFrom-Json $UserAccount).UserUpn[0]
# Retrieve the properties of the user from Azure Active Directory
$UserProperties = Get-AzureADUser -ObjectId $UserId
Outlook can have multiple profiles on a PC. Each profile has its own settings, including signatures. The default profile name is Outlook, and it’s the one that you’ll probably encounter most often (based on a limited test). But you can have more profiles and then must get into the business of figuring out how to update which profile with which signature. Given I was doing this on a wet Sunday afternoon, I decided to cheat by:
# Find Outlook profiles in registry
$CommonSettings = $False
$Profiles = (Get-ChildItem HKCU:\Software\Microsoft\Office\16.0\Outlook\Profiles).PSChildName
# This script can only deal with a single (default) profile; more code needed to handle multiple profiles
If ($Profiles -eq $Null -or $Profiles.Count -ne 1) {
   Write-Host "Warning - Applying signature to all Outlook profiles"
   $OutlookProfilePath = "HKCU:\Software\Microsoft\Office\16.0\Common\MailSettings"
   $CommonSettings = $True }
Else { # Path to default profile is elsewhere in the registry
   $OutlookProfilePath = "HKCU:\Software\Microsoft\Office\16.0\Outlook\Profiles\" + $Profiles.Trim() + "\9375CFF0413111d3B88A00104B2A6676\00000001" }
Sometimes the path to the user profile in the registry ends with 00000002 (the first might point to the Outlook address book), so your code should be prepared to handle this situation.
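One defensive way to handle that is to probe the candidate subkeys in order and take the first that exists. This is a sketch: the profile name and GUID path are the defaults from the code above, and Test-Path simply returns nothing useful on a machine without Outlook.

```powershell
# Probe the candidate profile subkeys in order and take the first that exists
$Candidates = '00000001', '00000002'
$ProfileRoot = 'HKCU:\Software\Microsoft\Office\16.0\Outlook\Profiles\Outlook\9375CFF0413111d3B88A00104B2A6676'
$OutlookProfilePath = $Candidates |
    ForEach-Object { Join-Path $ProfileRoot $_ } |
    Where-Object { Test-Path $_ -ErrorAction SilentlyContinue } |
    Select-Object -First 1
```

If neither subkey exists, $OutlookProfilePath is $null and the script should stop rather than write signature values to the wrong place.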
Now that we know which registry path to update, we can proceed to generate the signature file. This is usually an RTF file written to %appdata%\Microsoft\Signatures (English language PCs). An HTML file is also acceptable. Many scripts call Word as a COM object to create or update a signature file. I looked at using the impressive PSWriteWord module (available in the PowerShell Gallery) to do the job with code like this:
Import-Module PSWriteWord
$WordDocument = New-WordDocument $FilePath
$Line = $Null
Add-WordText -WordDocument $WordDocument -Text $Line
$Line = $UserProperties.DisplayName
Add-WordText -WordDocument $WordDocument -Text $Line -Bold $True -FontSize 12 -FontFamily "Segoe UI"
$Line = $UserProperties.JobTitle
Add-WordText -WordDocument $WordDocument -Text $Line -FontSize 12 -FontFamily "Segoe UI"
$Line = "Email: " + $UserProperties.Mail
Add-WordText -WordDocument $WordDocument -Text $Line -FontSize 10 -FontFamily "Segoe UI"
$Line = "Telephone: " + $UserProperties.TelephoneNumber + " Mobile: " + $UserProperties.Mobile
Add-WordText -WordDocument $WordDocument -Text $Line -FontSize 10 -FontFamily "Segoe UI"
$Line = $UserProperties.StreetAddress
Add-WordText -WordDocument $WordDocument -Text $Line -FontSize 10 -FontFamily "Segoe UI"
$Line = $UserProperties.State
Add-WordText -WordDocument $WordDocument -Text $Line -FontSize 10 -FontFamily "Segoe UI"
$Line = $UserProperties.PostalCode
Add-WordText -WordDocument $WordDocument -Text $Line -FontSize 10 -FontFamily "Segoe UI"
### Save document
Save-WordDocument $WordDocument -Language 'en-US'
It’s easy to generate a Word DOCX file. You still must convert the signature file to RTF, which can be done using a Word COM instance, but I ran into some problems when calling Word, apparently due to failure to load a DLL.
$WordDocument = $WordApplication.Documents.Open($FilePath)
You cannot call a method on a null-valued expression.
At line:1 char:1
+ $WordDocument = $WordApplication.Documents.Open($FilePath)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : InvokeMethodOnNull
Not wanting to reinstall Office, I went back to my old backstop of creating formatted HTML text. To get a head start, I used the free email signature generator tool from Code Two Software to get some ideas of what should be in the signature and what the necessary HTML would look like. The code to build the HTML and write out the signature file is:
# Construct a signature file in HTML format using the information fetched from Azure Active Directory
$CompanyLogo = "https://i1.wp.com/office365itpros.com/wp-content/uploads/2020/02/2020EditionVerySmall.jpg"
$HeadingLine = '<title>Signature</title><br>'
$PersonLine = '<table style="font-size: 8pt; color: gray; font-family: ''Segoe UI''"><tbody><tr><td><img src="' + $CompanyLogo + '" border="0"></td><td padding="0"><b>' + $UserProperties.DisplayName + '</b> ' + $UserProperties.JobTitle + '<br>'
$CompanyLine = '<b>' + $UserProperties.CompanyName + '</b> ' + $UserProperties.StreetAddress + ', ' + $UserProperties.City + ', ' + $UserProperties.PostalCode + '<br>' + $UserProperties.TelephoneNumber + '/' + $UserProperties.Mobile + ' Email: ' + $UserProperties.Mail + '<br><br>'
# Facebook and Twitter icons
$IconsLine = '</td></tr><tr><td style="font-size: 10pt; font-family: Arial, sans-serif; padding-bottom: 0px; padding-top: 5px; padding-left: 10px; vertical-align: bottom;" valign="bottom"><span><a href="https://www.facebook.com/Office365itpros/" target="_blank" rel="noopener noreferrer"><img border="0" width="23" alt="facebook icon" style="border:0; height:23px; width:23px" src="https://i0.wp.com/office365itpros.com/wp-content/uploads/2020/02/Facebook.png"></a> </span><span><a href="https://twitter.com/12Knocksinna" target="_blank" rel="noopener noreferrer"><img border="0" width="23" alt="twitter icon" style="border:0; height:23px; width:23px" src="https://i1.wp.com/office365itpros.com/wp-content/uploads/2020/02/Twitter.png"></a></span></td></tr>'
$EndLine = '</tbody></table><br><br>'
# Put everything together and output the HTML file
$SignatureHTML = $HeadingLine + $PersonLine + $CompanyLine + $IconsLine + $EndLine
$SignatureHTML | Out-File $HtmlPath
The final step is to update the registry with details of the new signature file. Here’s how I updated the settings (these settings mean that Outlook inserts the signature in new messages and replies/forwards):
# Update the registry settings where Outlook picks up its signature information
If (Test-Path $TargetForSignatures) {
   Get-Item -Path $OutlookProfilePath | New-ItemProperty -Name "New Signature" -Value $SignatureName -PropertyType String -Force
   Get-Item -Path $OutlookProfilePath | New-ItemProperty -Name "Reply-Forward Signature" -Value $SignatureName -PropertyType String -Force }
The resulting signature is pretty nice (Figure 1), and I am happy with it, even if the code to generate the signature is a bit kludgy. For this to work in production, you’d have to make sure that the script called the Connect-AzureAD cmdlet to connect to Azure Active Directory and add a pile of error checking and other essential pieces. It’s also important to underscore the importance of an accurate directory in this exercise. If your directory isn’t populated with up-to-date information about people, any signature which depends on that information won’t be successful. If you’re uncertain about the accuracy of your directory, maybe a visit to Hyperfish might be a good idea.
If you want to make the script better, you can grab a copy from GitHub. Make sure you let us know what you did to improve things by writing a comment to this post.
My wet afternoon’s coding taught me that the ISVs who build auto-signature products for Office 365 have a lot to cope with. And that Microsoft’s work to put Outlook signatures in the cloud can only be a good thing.
Making sure that users have the right signature is a mixture of client and mailbox management. The Office 365 for IT Pros eBook covers both topics in-depth and at length. You should subscribe!
I’ve written a couple of articles about using Microsoft Graph queries with PowerShell to access data that you can’t normally get to with cmdlets. For instance, this example explains how to report the somewhat bizarre email addresses assigned to Teams channels. When a channel is email-enabled, people can post to the channel by sending email to the assigned address, which is a good way to introduce information into Teams. Apart from being posted as new topics in the channel, messages are captured in the channel’s folder in the SharePoint document library belonging to the team.
But as a comment to the article notes, when you use the Invoke-WebRequest cmdlet to send a Graph command to fetch information about the set of Teams in a tenant, the Graph responds with the first 100 teams. This is by design: the Graph pages results so that responses don’t become too large. And the response is good enough to prove the principle of working with the Graph through PowerShell. However, once you get to production, you probably need to deal with more than 100 teams, so you must be able to fetch and process all the teams in the tenant.
A few days ago, Mike Tilson posted a note in the Office 365 Facebook group asking if it is “possible to run a report (hopefully via PowerShell) to report on what Microsoft Teams apps (third party apps like Polly) are installed across your environment?” As it happens, the esteemed technical editor for the Office 365 for IT Pros eBook, Vasil Michev, had written a script to report on apps and tabs. I took his script and made (in my mind) some improvements, and I gave Mike a copy of the script.
Mike’s response was that the script worked great in a small environment, but his production environment had 2,000+ teams (among 4,000-odd groups) to report on. The code needed to be upgraded to process larger numbers.
The solution is a thing called a nextlink, available when a Microsoft Graph query has more data to provide to clients than is returned to the original request (server-side paging). A page is the set of data returned for a Graph call, and several pages might need to be retrieved to fetch the full set of objects you want to process. The documentation says: “When a result set spans multiple pages, Microsoft Graph returns an @odata.nextLink property in the response that contains a URL to the next page of results.” In effect, the nextlink is the URL an application calls to fetch the next page of data. To be sure that you get all the data, you keep fetching page by page until the nextlink is null. This is called pagination.
If you use the Graph Explorer to play with Microsoft Graph queries, you’ll see that a nextlink turns up in calls like “all groups in my organization.” You know it’s a nextlink because it includes the term “skiptoken” as in https://graph.microsoft.com/v1.0/groups?$skiptoken=X%274453707402 (a real nextlink is much longer). Microsoft Graph queries can generate nextlinks after fetching less than 100 items. For instance, the default number of folders retrieved from a mailbox is 10.
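The pattern is always the same: make the call, consume the page, follow @odata.nextLink until it is null. Here’s the loop distilled with simulated pages so the flow is obvious (Invoke-GraphPage is a hypothetical stand-in for the real Invoke-WebRequest call plus ConvertFrom-Json):

```powershell
# Simulated responses: each "page" holds a Value array and (maybe) a nextlink
$Pages = @(
    [PSCustomObject]@{ Value = 1..3; '@odata.nextLink' = 'page2' },
    [PSCustomObject]@{ Value = 4..5; '@odata.nextLink' = $null }
)
Function Invoke-GraphPage ($Uri) { # hypothetical stand-in for the real Graph call
    If ($Uri -eq 'page2') { $Pages[1] } Else { $Pages[0] } }

$Items = @()
$Response = Invoke-GraphPage 'first'
$Items += $Response.Value
$NextLink = $Response.'@odata.nextLink'
While ($Null -ne $NextLink) {            # keep following nextlinks until none remains
    $Response = Invoke-GraphPage $NextLink
    $Items += $Response.Value
    $NextLink = $Response.'@odata.nextLink' }
$Items.Count  # 5 items gathered across two pages
```

Swap Invoke-GraphPage for a real Invoke-WebRequest against the Graph endpoint and the same loop handles any paged query.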
The uprated code, based on an original solution created by Mike Tilson, is shown below.
$Uri = "https://graph.microsoft.com/V1.0/groups?`$filter=resourceProvisioningOptions/Any(x:x eq 'Team')" [array]$Teams = Invoke-WebRequest -Method GET -Uri $Uri -ContentType "application/json" -Headers $Headers | ConvertFrom-Json If ($Teams.Value.Count -eq 0) { Write-Host "No Teams found - exiting!"; break } $Teams.Value.ForEach( { $TeamsHash.Add($_.Id, $_.DisplayName) } ) $NextLink = $Teams.'@Odata.NextLink' While ($NextLink -ne $Null) { $Teams = Invoke-WebRequest -Method GET -Uri $NextLink -ContentType $ctype -Headers $headers | ConvertFrom-Json $Teams.Value.ForEach( { $TeamsHash.Add($_.Id, $_.DisplayName) } ) $NextLink = $Teams.'@odata.NextLink' }
Another pragmatic solution to the problem of how to fetch all teams in a tenant is to use the Get-Team cmdlet. Using Get-Team is much slower than the code listed above (minutes instead of seconds), but the cmdlet handles the paging for you.
After fetching the set of teams, we can begin to process each team to discover what apps and tabs are installed in it. The Graph calls in the script to fetch channels, tabs, and apps don’t use paging. A team can have up to 200 channels, so I guess the call to fetch channels might need to change to handle such a well-endowed (and possibly confusing) team.
Although we can now fetch all the teams in a tenant, things aren’t perfect yet. I found a problem processing archived teams, where the attempt to retrieve app information fails. I can’t see how to identify an archived team from the information returned, so more research is needed. (Update: check the isArchived property in team settings to find archived teams).
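Following that update, a sketch of the isArchived check looks like this. The team identifier is a hypothetical placeholder, and the live call (commented out) needs a valid access token in $Headers:

```powershell
# Hypothetical team identifier - replace with a real group/team GUID
$TeamId = '00000000-0000-0000-0000-000000000000'
$TeamUri = 'https://graph.microsoft.com/v1.0/teams/' + $TeamId
# The live call needs a valid access token in $Headers:
# $TeamData = Invoke-WebRequest -Method GET -Uri $TeamUri -ContentType 'application/json' -Headers $Headers | ConvertFrom-Json
# If ($TeamData.isArchived -eq $True) {
#     Write-Host 'Skipping archived team' $TeamData.displayName
#     Continue }
```

Checking the setting before trying to read apps avoids the failed calls for archived teams.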
Another issue is that the auth token expires after an hour and stops the script. A refresh token is needed at this point. (Update: the latest version of the Teams and Groups Report script illustrates how to renew an access token).
If you’d like to improve the code and make it even better, you can get the script from GitHub. In the meantime, enjoy using Microsoft Graph queries to process data with PowerShell.
Need help understanding how to use PowerShell to manage Office 365 Groups and Teams? The Office 365 for IT Pros eBook contains a ton of examples to help you get going.
Idly playing with PowerShell on a dull Friday afternoon in winter, I decided to respond to a question in the Office 365 Facebook group about how to be notified when someone deletes a team. Presumably the requirement exists to allow tenant administrators to leap into action to chastise people who delete teams without asking, or something like that.
My initial response was that this is the same problem as you have when someone deletes an Office 365 Group (each team is a group) and directed the questioner to this 2018 Petri article, which describes how to check groups in a soft-deleted state waiting for their 30-day retention period to expire. During this time, you can rescue a soft-deleted group and return it to full working order.
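As a reminder of that rescue path, the AzureAD module can list soft-deleted groups and restore one. This is a sketch: the live cmdlets (commented out) need Connect-AzureAD and a module version that includes the Get-AzureADMSDeletedGroup and Restore-AzureADMSDeletedDirectoryObject cmdlets; the helper function is my own addition for working out remaining retention.

```powershell
# Work out how many whole days remain before a soft-deleted group is permanently removed
Function Get-DaysLeft ([datetime]$DeletedDateTime) {
    $Expiry = $DeletedDateTime.AddDays(30)                      # 30-day retention period
    [int][math]::Floor(($Expiry - (Get-Date)).TotalDays) }

# Live commands (need Connect-AzureAD):
# $DeletedGroups = Get-AzureADMSDeletedGroup
# $DeletedGroups | Select DisplayName, @{n='DaysLeft'; e={ Get-DaysLeft $_.DeletedDateTime }}
# Restore one group by its object identifier:
# Restore-AzureADMSDeletedDirectoryObject -Id $DeletedGroups[0].Id
```

Listing the days left makes it obvious which deleted teams need a decision before their retention period expires.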
The Office 365 Security and Compliance Center includes the ability to create activity alerts (in the Alerts section). These alerts fire when an Office 365 audit record is captured for specific events, like team deletions (Figure 1). When an alert happens, email notifications go to the people specified in the alert to tell them that something’s happened. It all sounds good.
When you access activity alerts in the Security and Compliance Center, you’ll see a banner saying that Microsoft has a better solution (activity policies). Activity alerts have some problems. First, they can fire some time after an event occurs. It all depends on when the audit log ingests events from the workload responsible for the monitored activity. Usually the delay is between 15 and 30 minutes for most Office 365 workloads, but it can be longer. Second, whatever process is responsible for sending the email notifications seems to be asleep for most of the day because the arrival time of the notifications is very unpredictable. You might even say unreliable.
It’s easy to create your own version of activity alerts based on the same data as used by Office 365. First, we look in the Office 365 audit log for team deletion events. Then we distribute the information via email or Teams.
The PowerShell script below searches for team deletion events from the last seven days and stores the information in a list object.
<pre class="lang:ps">CLS; Write-Host "Searching Office 365 Audit Records to find Team deletions"
$StartDate = (Get-Date).AddDays(-7); $EndDate = (Get-Date)
$Records = (Search-UnifiedAuditLog -Operations TeamDeleted -StartDate $StartDate -EndDate $EndDate -ResultSize 1000)
If ($Records.Count -eq 0) {
   Write-Host "No audit records for Team deletions found." }
Else {
   Write-Host "Processing" $Records.Count "team deletion audit records..."
   $Report = [System.Collections.Generic.List[Object]]::new() # Create output list
   # Scan each audit record to extract information
   ForEach ($Rec in $Records) {
      $AuditData = ConvertFrom-Json $Rec.AuditData
      $ReportLine = [PSCustomObject] @{
         TimeStamp = Get-Date($AuditData.CreationTime) -format g
         User      = $AuditData.UserId
         Action    = $AuditData.Operation
         Team      = $AuditData.TeamName }
      $Report.Add($ReportLine) }
}
CLS
Write-Host "All done - Team deletion records for the last 7 days"
$Report | Format-Table TimeStamp, Action, Team, User -AutoSize</pre>
After we know what teams were deleted in the last week, we can use the information stored in the $Report variable to create notifications for administrators that are posted via email or Teams.
Creating and sending email notifications in PowerShell is straightforward (an example is explained here). Remember that the account used to send the message must be enabled for SMTP authentication, as otherwise the Send-MailMessage cmdlet will fail.
Posting to a Teams channel can be done using the incoming webhook connector as described in this article. In some respects, it seems appropriate that notifications about deleted teams should be posted to Teams, but I will let you make your own mind up.
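As a minimal sketch of the webhook approach, the payload is a message card in JSON posted to the connector URI. $WebhookUri stands in for the URI you copy when creating the connector, and the card text here is illustrative:

```powershell
# Build a simple message card and post it to the channel's incoming webhook
$Payload = @{
    '@type'    = 'MessageCard'
    '@context' = 'https://schema.org/extensions'
    summary    = 'Team deletion alert'
    title      = 'A team was deleted'
    text       = 'Team "Projects" was deleted by James.Ryan'     # illustrative content
} | ConvertTo-Json
# Live call (needs the connector URI copied when the webhook was created):
# Invoke-RestMethod -Method Post -Uri $WebhookUri -ContentType 'application/json' -Body $Payload
```

In a real run, the loop over $Report would fill in the card text from each audit record.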
The Office 365 Audit log is stuffed full of interesting information to explain how and when things happen inside a tenant. The Office 365 for IT Pros eBook contains many examples of using the audit log to good effect. Subscribe to receive monthly updates full of Office 365 goodness.
One of the nice things about PowerShell is the ease with which a script can be adapted to meet different circumstances, improve the flow of processing, or simply execute code the way you like code to run. A recent post by Ståle Hansen confirmed this yet again.
Like me, Ståle probably doesn’t regard himself as a professional programmer (I’ve probably offended him now). He spends most of his time thinking about Teams and voice/phone systems, which he covers in chapter 16 of the Office 365 for IT Pros eBook. In his post, Ståle describes how to use PowerShell to send various items of information about Microsoft 365 to a Teams channel using the incoming webhook connector. The idea is to scan for recent updates and post new items as message cards to inform tenant admins about new features.
The original work was done by Einar Asting, who created a series of scripts covering how to extract and post information from the Microsoft 365 roadmap, Office 365 health status, the Office 365 message center, Azure Resource Health, and Office ProPlus updates. Ståle’s twist on the topic is to post items for different technologies to their own channel. For instance, anything to do with SharePoint Online shows up in the SharePoint channel, and so on.
All good stuff. We have covered some of the same ground about posting through the incoming webhook connector here with posts about:
I liked some of the extra touches that Einar had added in his post about extracting Microsoft 365 roadmap updates and posting the items to a Teams channel, like using different colors to highlight whether a roadmap item was in development, rolling out, or generally available.
Each channel needs its own incoming webhook connector. The connector cannot be set up programmatically, but creating a new connector is quickly done through the Connectors link in the channel’s […] menu. The important thing is to copy and store the URI created for the connector because you need it to post to the channel (Figure 1).
We all have our own ideas how code should work. In my case, I tried to make the script more flexible and improve the message cards generated in Teams. After retrieving data from the RSS feed for the Microsoft 365 roadmap, the script processes the information and creates a list that is written out to a CSV file. You can export data from the Microsoft 365 roadmap using a choice in the web site, but it’s always nice to have control over what’s exported. The CSV file can be used for later analysis. For instance, if you only want to review roadmap items relating to Exchange Online and list the items with the latest item first, you can do this with the following command:
$Report | Sort {$_.Date -as [datetime]} -Descending | ?{$_.Technology -eq "Exchange"} | Format-Table FeatureId, Date, Technology, Title

FeatureId Date             Technology Title
--------- ----             ---------- -----
59441     6 Dec 2019 16:00 Exchange   Support for Plus Addressing in Office 365
59438     5 Dec 2019 16:00 Exchange   Message Recall in Exchange Online
59437     5 Dec 2019 08:00 Exchange   Send from proxy addresses (aliases) from OWA
Identifying the technology that a roadmap item belongs to also makes it easier to direct a post to a specific channel using the PowerShell Switch command.
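A sketch of that routing with the Switch statement follows; the webhook URIs are placeholders for the per-channel connector URIs you copy when setting up the connectors:

```powershell
# Route a roadmap item to the right channel webhook based on its technology
Function Get-ChannelUri ($Technology) {
    Switch ($Technology) {
        'Exchange'   { 'https://outlook.office.com/webhook/exchange-channel-id' }   # placeholder URI
        'SharePoint' { 'https://outlook.office.com/webhook/sharepoint-channel-id' } # placeholder URI
        'Teams'      { 'https://outlook.office.com/webhook/teams-channel-id' }      # placeholder URI
        Default      { 'https://outlook.office.com/webhook/general-channel-id' }    # catch-all channel
    }
}
Get-ChannelUri 'Exchange'
```

The Default branch catches technologies without a dedicated channel so no roadmap item goes unposted.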
The original idea was to use Azure Automation to run the script daily to post message cards for new roadmap items created in the last 24 hours. My version does much the same but uses a slightly different approach and format for the message card (Figure 2). Beauty is in the eye of the beholder.
The complete script is too long to post here. If you want a copy, head over to GitHub and grab the code from the Office365ITPros repository.
It’s hard to be truly original and most of the time we build on what has gone before. In this case, I adapted a script to meet my view about how things should work. Feel free to disagree and please go ahead to create your own, even better, version.
The Office 365 for IT Pros eBook contains hundreds of PowerShell examples. Some of the code is even useful! All of it is interesting…
As I have noted before, Teams suffers from having two PowerShell modules. The Skype for Business Online module will eventually go away, but only when Microsoft retires Skype for Business Online in July 2021. Until then, I fear we must cope with the capricious nature of the module.
One of the most irritating aspects of the Skype for Business Online module is its inability to keep a session active for longer than an hour. I have no idea why Microsoft let this situation persist for so long, especially for a cloud service, but at least they have now released the Enable-CsOnlineSessionForReconnection cmdlet to paper over the problem. If you run the cmdlet in a session connected to Skype for Business Online, the session won’t time out and refuse to reconnect as once was the case. Instead, the behavior is slightly better and the session should reconnect, just like any well-mannered PowerShell session should.
Of course, the immediate question that comes to mind is why this cmdlet is necessary at all. Why, for instance, didn’t Microsoft fix the underlying problem in the New-CsOnlineSession cmdlet so that once a session is established with the Skype for Business Online endpoint, it stays stable and usable for as long as is needed?
I suspect that the answer is that Microsoft didn’t want to open up a can of worms that might lurk in New-CsOnlineSession and decided instead to patch the problem with Enable-CsOnlineSessionForReconnection, the logic being that they didn’t want to invest any more engineering effort than necessary in a module that will soon be defunct. The rationale is understandable, even if it is also irritating.
In any case, to take advantage of the fix, download and install the latest version of the Skype for Business Online module (another irritation is that the new module has the same 7.0.0.0 version number as the previous version, perhaps yet another indication of the quick fix solution we see here). You can then insert the cmdlet in your connection scripts. I use a simple function in my PowerShell profile to connect to Skype for Business Online, which I have updated to make sure that Enable-CsOnlineSessionForReconnection is used every time I connect to the endpoint.
Function Connect-SfBO {
   Import-Module SkypeOnlineConnector
   $sfbSession = New-CsOnlineSession -Credential $O365Cred
   Import-PSSession $sfbSession
   Enable-CsOnlineSessionForReconnection
}
Microsoft has published some documentation explaining that once the cmdlet is run, you should see that a Skype for Business Online PowerShell session reconnects after 60 minutes. Testing shows that the reconnection is automatic, and I haven’t experienced any issues, but given the size of Office 365 and the variety of configurations people run under, no guarantee is given that it will work as smoothly in your environment. Always test before deploying anything!
Coping with the oddities and mysteries of Office 365 is our specialty. Or at least we’re used to handling this kind of stuff. Which is why we pack the Office 365 for IT Pros eBook full of information like this.
One of the actions taken when the URL for a SharePoint Online site is renamed is the creation of a redirect site. A redirect site takes the old site URL and uses special headers and logic to redirect browser requests to the new URL for the site. Essentially, the redirect site is a pointer for the old site to make sure that links to the old site URL continue to work.
The documentation explains that redirect sites are created with a special template (REDIRECTSITE#0). This means that you can use the Get-SPOSite PowerShell cmdlet to see a list of redirect sites with the command:
# Find all redirect sites in the tenant
Get-SPOSite -Template "REDIRECTSITE#0"

Url                                                                  Owner Storage Quota
---                                                                  ----- -------------
https://office365itpros.sharepoint.com/sites/europeanoffice365engage       26214400
https://office365itpros.sharepoint.com/sites/askhr                         26214400
https://office365itpros.sharepoint.com/sites/EngineeringExcellence         26214400
If you look at the details of a redirect site, you’ll see that its Lockstate is set to ReadOnly and its Title is RedirectSite. There’s no obvious link to the new (renamed) site URL.
Many PowerShell scripts create a collection of sites for processing by running a command like:
# Form collection of SharePoint Online sites in a tenant
$Sites = Get-SPOSite -Limit All
The collection created will contain redirect sites because you haven’t told Get-SPOSite to exclude them. It might be the case that the presence of the redirect sites won’t cause any problems for the other commands in the script, but it’s probably best to exclude these sites unless you have good reason to process them. Accordingly, the call to Get-SPOSite should be updated. For instance, you could simply filter out the redirect sites:
# Get SharePoint Online sites without redirect sites
$Sites = Get-SPOSite -Limit All | ?{$_.Template -ne "REDIRECTSITE#0"}
The output now excludes redirect sites but includes hub sites, app catalogs, and the sites owned by Teams private channels (which use a template called TEAMCHANNEL#0). In most cases, it’s best not to fetch all sites and instead ask Get-SPOSite to only fetch the sites you want to process. For instance, you might only want team sites (those connected to Office 365 Groups with or without Teams), in which case we’d use the command:
# Get SharePoint Online sites connected to Office 365 Groups
$Sites = Get-SPOSite -Limit All -Template "GROUP#0"
The popularity of the site URL rename feature and the availability of Teams private channels will lead to a lot more types of sites in use. These features are part of the reason why SharePoint Online now supports two million sites per tenant. With that in mind, it’s a good idea to be a lot more precise about how the Get-SPOSite cmdlet is used in scripts.
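A quick way to see what you’re dealing with is to count sites by template. The grouping logic is shown here against sample data; in production, $Sites would come from Get-SPOSite -Limit All, and the URLs below are illustrative:

```powershell
# Sample data standing in for the output of Get-SPOSite -Limit All
$Sites = @(
    [PSCustomObject]@{ Url = 'https://contoso.sharepoint.com/sites/hr';     Template = 'GROUP#0' },
    [PSCustomObject]@{ Url = 'https://contoso.sharepoint.com/sites/oldhr';  Template = 'REDIRECTSITE#0' },
    [PSCustomObject]@{ Url = 'https://contoso.sharepoint.com/sites/eng-pc'; Template = 'TEAMCHANNEL#0' },
    [PSCustomObject]@{ Url = 'https://contoso.sharepoint.com/sites/sales';  Template = 'GROUP#0' }
)
# Count sites per template so you know which templates to include or exclude
$Sites | Group-Object Template | Sort-Object Count -Descending | Format-Table Name, Count
```

Once you see the template mix in a tenant, it’s much easier to decide which templates a script should fetch.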
Worrying about the detail is something we do all the time. It’s this kind of thing that makes the Office 365 for IT Pros eBook invaluable for tenant administrators. Or anyone else with an interest in how Office 365 really works.
A reader asked why the PowerShell examples in the book (and this site) are “just code.” It’s a reasonable question that deserves a reasonable answer.
PowerShell is a fantastic tool for Office 365 administrators. You can create your own solutions for many of the gaps that Microsoft leaves to be filled in the administration of different apps. But before you can fill anything, you need to know how to fill. We want to teach people what’s possible with PowerShell and Office 365 by showing them how to put together scripts to get real work done. The scripts work, but they are incomplete in the sense that they don’t represent professional production-ready code. And that’s just fine by us.
Every Office 365 tenant is different, which makes it awfully difficult to write a script to suit everyone. However, it’s possible to write a fully-functional script that illustrates all of the key functional points that need to come together to solve an operational problem. Our task is done when we publish an example in a blog post or a chapter in the book (and we have literally hundreds of scripts in Office 365 for IT Pros); the task of the reader only starts when they realize that a script might be useful in their environment. The work from that point might include bulletproofing the code to make it work better when errors happen (as they do); making sure that all the requisite modules are loaded and connected to with valid credentials; streamlining scripts by moving some code into functions to make the flow of commands easier to understand; and so on. You might even want to add some extra functionality, or use your preferred method to do something. Or if you’re working with Exchange Online, you might want to use the new REST cmdlets because they are more reliable and run faster.
In a nutshell, we consider our role is to create the basic skeleton of a script to inspire people to expand, enhance, and complete the code. We plant the idea; you take the idea on to completion. It’s a win-win for everyone.
After this post was originally published, we received some great feedback about increasing performance for string handling in reports (you can read about this and other interesting PowerShell tips online). As a result, we changed a bunch of example scripts that will appear in the December 2019 update for the Office 365 for IT Pros eBook. It’s a great example of community feedback in action and underlines yet again the value of the ePublishing model for technical books.
If you’re not convinced that you can deal with PowerShell, you might like to read the thoughtful article by Paul Cunningham explaining how he approaches writing new scripts. Other useful reads are on good PowerShell habits and code layout and formatting. But don’t adopt anyone’s ideas slavishly without thinking things through. Search for PowerShell solutions to different problems and browse some scripts published in repositories like Github to see how they’re put together and the techniques used, and then take those that make sense to you. Your code won’t be great at the start, but will get better with time and practice.
Don’t be too proud to fetch code from the internet and use it as the basis for what you want to get done. But always test code that you didn’t write to make sure that it does what you think it does. And then test it again, just to be sure.
If you want a book to consult, try the Practical PowerShell for Office 365 book by Damian Scoles (MVP). PowerShell books share the common problem of any book covering Office 365: the content changes all the time, so topics discussed in a book might be handled completely differently by the time you read the text. For this reason, look for books published in 2018 or later. Warning: there are many crap PowerShell books out there…
Now that we’ve told you just how horribly we write PowerShell code, let us redress the balance by saying that our code actually works. Which is why we proudly feature it in the Office 365 for IT Pros eBook. And if we find a problem, we fix it in the next monthly update.
Updated May 21, 2020 – see below
Microsoft got itself in quite a mess when it announced that users in Office 365 tenants would be able to make self-service purchases for the Power Platform. Some frantic backtracking resulted in a decision to postpone the introduction of the feature until January 14, 2020 and a commitment to deliver administrative controls to allow tenants to disable self-service purchases. Self-service purchase capabilities are not available for Office 365 Government, Nonprofit, and Education tenants.
Without any fuss, Microsoft quietly updated their self-service FAQ on November 19 with the statement that:
“Admins can also control whether users in their organization can make self-service purchases. For more information see Use AllowSelfServicePurchase for the MSCommerce PowerShell module.”
Subsequently, Microsoft published Office 365 notification MC196205 to announce the news.
Administrative control over self-service purchases is available through the MSCommerce PowerShell module. Version 1.2 of the module is the latest version, released via the PowerShell Gallery on November 15. This isn’t a particularly feature-rich or easy-to-use module, but it gets the job done.
To install the module, start PowerShell as an administrator. Then connect to the MSCommerce endpoint as shown below. You’ll be prompted for credentials: because you’re going to interact with the tenant configuration, make sure to use an account belonging to an Office 365 tenant or billing administrator. After connecting, run Get-Command to see the set of cmdlets loaded by the module.
Install-Module -Name MSCommerce -Scope AllUsers -Force
Import-Module MSCommerce
Connect-MSCommerce

Get-Command *-mscommerce*

CommandType     Name                          Version    Source
-----------     ----                          -------    ------
Function        Connect-MSCommerce            1.2        mscommerce
Function        Get-MSCommercePolicies        1.2        mscommerce
Function        Get-MSCommercePolicy          1.2        mscommerce
The MsCommerce endpoint only supports TLS 1.2, so make sure that your workstation supports this protocol.
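If the connection fails with a transport-level error, the usual cause is that the Windows PowerShell session defaults to an older TLS version. You can force the current session to negotiate TLS 1.2 before connecting:

```powershell
# Force the current session to use TLS 1.2 before connecting to MSCommerce
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
Connect-MSCommerce
```

This setting only affects the current session, so it needs to be run each time you start a new session to manage self-service purchase policies.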
As is the norm for many Office 365 management entities these days, control is exerted through policies. If you run the Get-MSCommercePolicies cmdlet, you’ll find that there’s only one policy defined, called AllowSelfServicePurchase.
Get-MSCommercePolicies | fl

Description  : This policy allows you to manage whether members of your organization can buy specified products using self-service purchasing. You can set this policy on a per-product basis.
PolicyId     : AllowSelfServicePurchase
DefaultValue : Enabled

Get-MSCommercePolicy -PolicyId AllowSelfServicePurchase | fl
Looking at the AllowSelfServicePurchase policy, we find:
Get-MSCommerceProductPolicies -PolicyId AllowSelfServicePurchase

ProductName     ProductId     PolicyId                  PolicyValue
-----------     ---------     --------                  -----------
Power Apps      CFQ7TTC0KP0P  AllowSelfServicePurchase  Enabled
Power BI Pro    CFQ7TTC0L3PB  AllowSelfServicePurchase  Enabled
Power Automate  CFQ7TTC0KP0N  AllowSelfServicePurchase  Enabled
So we know that the three apps in the Power Platform are covered by this policy. There’s no granular disablement possible on an account basis; if you disable self-service purchases for a product, it’s off for everyone in the tenant. With that in mind, the Update-MSCommerceProductPolicy cmdlet is the way to disable self-service purchases. An inconsistency is that the other cmdlets report the enabled status as the PolicyValue property while this cmdlet uses the Enabled boolean as the control.
Update-MSCommerceProductPolicy -PolicyId AllowSelfServicePurchase -ProductId CFQ7TTC0KP0P -Enabled $False

Update policy product success

ProductName  ProductId     PolicyId                  PolicyValue
-----------  ---------     --------                  -----------
Power Apps   CFQ7TTC0KP0P  AllowSelfServicePurchase  Disabled
To disable self-service for all three products, run the command for each product or run:
Get-MSCommerceProductPolicies -PolicyId AllowSelfServicePurchase | ? {$_.PolicyValue -eq "Enabled"} | ForEach {
   Update-MSCommerceProductPolicy -PolicyId AllowSelfServicePurchase -ProductId $_.ProductId -Enabled $False }
Everyone loves a trier and the Microsoft team responsible for self-service purchases of Power Platform licenses are firmly in this category. Rebuffed in their first attempt to make self-service purchases available to all Office 365 tenants, Office 365 notification MC213897 (21 May) announces that in situations where tenants block self-service purchases, users will be able to request purchases of Power Platform licenses and have those requests added to a queue. Administrators can then review the request and assign licenses to users, if some are available in the tenant. If licenses aren’t available, Microsoft hopes that administrators will respond to user demand and buy some licenses. The feature will start rolling out in mid-June and is scheduled for completion in mid-July 2020.
Administration of an Office 365 tenant can be a pain at times. Learn how to work smarter through the Office 365 for IT Pros eBook.
Microsoft released support for private channels on November 4 (also see this note about managing private channels). Support for PowerShell access to private channels is not yet available in the publicly available version of the Teams PowerShell module. Instead, if you want to work with private channels through PowerShell, you must install the latest version of the Teams module from the PowerShell test gallery. You need to run version 1.0.18 or better to manage private channels (Figure 1).
The latest version of the Teams PowerShell module is 2.0.
The following cmdlets are updated to support private channels: New-TeamChannel, Get-TeamChannel, Set-TeamChannel, and Remove-TeamChannel.
Three new cmdlets are available to manage the membership of private channels: Add-TeamChannelUser, Get-TeamChannelUser, and Remove-TeamChannelUser.
Like everything to do with Teams when working through PowerShell, you need to know the group identifier of the team hosting the private channel to use any of these cmdlets. The object identifier for a team is easily fetched. In this example, we fetch the group identifier for the team with the display name “Corporate Acquisition Planning” and store it in the $GroupId variable.
# Get Group Id for the Corporate Acquisition Planning team
$GroupId = (Get-Team -DisplayName "Corporate Acquisition Planning 2020").GroupId
The New-TeamChannel cmdlet creates a new private channel if you specify that the MembershipType parameter is Private. Remember to add an owner selected from the membership of the team.
# Add new private channel to a team
New-TeamChannel -GroupId $GroupId -DisplayName "Project Hydra" -Description "Discussions about the Hydra Project" -MembershipType Private -Owner Tony.Redmond@office365itpros.com
After creating a new private channel, you build out its membership by adding a subset of the members of the team with the Add-TeamChannelUser cmdlet. Specify -Role Owner for the members who will be owners of the private channel. You must first add someone as a member before you can add them as an owner. Everyone in the channel can be an owner, if that’s what you want. Again, the private channel is identified with its display name.
# Add members to the private channel
Add-TeamChannelUser -GroupId $GroupId -DisplayName "Legal Discussions" -User Oisin.Johnston@Office365itpros.com
Add-TeamChannelUser -GroupId $GroupId -DisplayName "Legal Discussions" -User Oisin.Johnston@Office365itpros.com -Role Owner
Use the Get-TeamChannelUser cmdlet to return the membership of a private channel. Note that you identify the private channel using its display name.
# Fetch channel membership
Get-TeamChannelUser -GroupId $GroupId -DisplayName "Legal Discussions"

UserId                               User                               Name           Role
------                               ----                               ----           ----
eff4cd58-1bb8-4899-94de-795f656b4a18 Tony.Redmond@office365itpros.com   Tony Redmond   owner
d36b323a-32c3-4ca5-a4a5-2f7b4fbef31c Kim.Akers@office365itpros.com      Kim Akers      member
c6133be4-71d4-47c4-b109-e37c0c93f8d3 Oisin.Johnston@office365itpros.com Oisin Johnston member
cad05ccf-a359-4ac7-89e0-1e33bf37579e James.Ryan@office365itpros.com     James Ryan     member
Use the Remove-TeamChannelUser cmdlet to remove an owner or member from a private channel:
# Remove member from a private channel
Remove-TeamChannelUser -GroupId $GroupId -DisplayName "Legal Discussions" -User James.Ryan@office365itpros.com
If you run Get-TeamChannel to list the channels in a team, you see all channels without any indication of which are private and which are public, unless you output the MembershipType property:
# List channels for a team
Get-TeamChannel -GroupId $GroupId

Id                                               DisplayName         Description
--                                               -----------         -----------
19:44d9f180c1ea4cd49291fd4607054706@thread.skype General             A team to coordinate the work to id...
19:3b7d26b1253f4eff9014a8fe2c79b586@thread.skype Acquisition Targets
19:85a78f1d2c734c69952215eb631a690c@thread.skype Legal Discussions   Legal debate about our acquisition ...
19:bbac69a0ea1a460fb07b766eac10c63a@thread.skype Project Hydra       Discussions about the Hydra Project

Get-TeamChannel -GroupId $GroupId | Format-Table DisplayName, MembershipType

DisplayName         MembershipType
-----------         --------------
General             Standard
Acquisition Targets Standard
Legal Discussions   Private
Project Hydra       Private
To select a specific type of channel, use the MembershipType parameter to state the kind of channel you want to return:
# List private channels for a team
Get-TeamChannel -GroupId $Groupid -MembershipType Private

Id                                               DisplayName       Description
--                                               -----------       -----------
19:85a78f1d2c734c69952215eb631a690c@thread.skype Legal Discussions Legal debate about our acquisition pl...
19:bbac69a0ea1a460fb07b766eac10c63a@thread.skype Project Hydra     Discussions about the Hydra Project
There’s no way to run a command to change a channel type from Private to Standard or vice versa. All you can do with Set-TeamChannel is update the display name or description.
# Update a channel
Set-TeamChannel -GroupId $GroupId -CurrentDisplayName "Project Hydra" -NewDisplayName "Deep and Dark Secrets" -Description "The place where deep and dark secrets are discussed"
The Remove-TeamChannel cmdlet doesn’t give any warning or seek confirmation when it removes a private channel (and the underlying SharePoint site).
# Remove a private channel
Remove-TeamChannel -GroupId $GroupId -DisplayName "Project Hydra"
Need examples of how to use PowerShell to solve real-life administration challenges with Teams? Check out the “Managing Groups and Teams with PowerShell” chapter in the Office 365 for IT Pros eBook. It’s always easier to create a script based on a working example!
Browsing the personal blog of Vasil Michev, the esteemed technical editor of the Office 365 for IT Pros eBook (I have to call him that as otherwise he gets very vexed and causes problems when he edits a chapter), I found an interesting post about using names specified in a CSV file to remove licenses from Office 365 accounts. In fact, I found a logic bug in Vasil’s PowerShell code, which neatly reversed the normal situation when he criticizes my poor attempts at coding.
In any case, the thought came to me that it would be useful to have a script that reported the license assignments to users and output a CSV file that an Office 365 administrator could either use for their own purposes or as an input to Vasil’s script.
You can find license assignment information in the Billing section of the Office 365 Admin Center (select Licenses – Figure 1).
The Office 365 Admin Center also supports the option of exporting license information, but only after you choose a specific license. Anyway, it’s nice to be able to do your own thing in terms of automating administrative processes, which is what PowerShell is all about.
The quick and dirty PowerShell code in this script fetches details of all licensed accounts in an Office 365 tenant and extracts the license assignment information from each account. The information is written into an array that’s then grouped to calculate a count for each license type. We then write the information out to the CSV file, taking care to sort it by license and display name (just to show how to do multi-property sorts in PowerShell). Here’s the code:
# Quick and dirty code to create a report of license assignments in an Office 365 tenant
$Report = [System.Collections.Generic.List[Object]]::new()
$Users = Get-MsolUser -All | Where {$_.isLicensed -eq $true} | Select UserPrincipalName, DisplayName, Department, IsLicensed
Write-Host "Processing Users"
ForEach ($User in $Users) {
   $SKUs = @(Get-MsolUser -UserPrincipalName $User.UserPrincipalName | Select -ExpandProperty Licenses)
   ForEach ($Sku in $Skus) {
      $ReportLine = [PSCustomObject]@{
         User = $User.UserPrincipalName
         SKU  = $Sku.AccountSkuId.Split(":")[1]
         Name = $User.DisplayName
         Dept = $User.Department }
      $Report.Add($ReportLine) }
}
Cls
# Write out the information
Write-Host "License information"
$Groupdata = $Report | Group-Object -Property SKU
$Groupdata | Sort Count -Descending | Select Name, Count
# Set sort properties so that we get ascending sorts for one property after another
$Sort1 = @{Expression='SKU'; Ascending=$true}
$Sort2 = @{Expression='Name'; Ascending=$true}
$Report | Select SKU, Name, User | Sort-Object $Sort1, $Sort2 | Export-CSV c:\Temp\UserLicenses.CSV -NoTypeInformation
Figure 2 shows what the on-screen output looks like:
While Figure 3 shows how the data appears in the CSV file.
If you want to make the license names clearer (for example, to translate ENTERPRISEPACK to Office 365 E3), you can add some code to replace the license name before writing it to the output array.
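For instance, a simple hashtable lookup can translate SKU part names into friendly license names before the report line is built. This is a sketch: the mappings shown are illustrative, so check the SKU part names actually present in your tenant.

```powershell
# Illustrative lookup table: translate SKU part names into friendly license names
$SkuNames = @{
   "ENTERPRISEPACK"    = "Office 365 E3"
   "ENTERPRISEPREMIUM" = "Office 365 E5"
   "EMS"               = "Enterprise Mobility + Security E3" }
$SkuPart = $Sku.AccountSkuId.Split(":")[1]
$SkuDisplay = $SkuNames[$SkuPart]
# Fall back to the raw SKU part name if no friendly name is defined
If (!$SkuDisplay) { $SkuDisplay = $SkuPart }
```

$SkuDisplay can then be used in place of the raw SKU value when populating the report line.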
Update May 30, 2020: The code in the GitHub version of the script shows how to resolve SKU names to user-friendly license names.
Another idea is to increase the number of account properties written out to the CSV file to make analysis easier or more productive. For example, you could include properties such as UsageLocation (Country), City, or Department to focus in on license usage in those areas.
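A sketch of an expanded report line, assuming that UsageLocation and City are added to the Select statement that fetches the accounts at the top of the script:

```powershell
# Sketch: report line with extra account properties for richer analysis
$ReportLine = [PSCustomObject]@{
   User    = $User.UserPrincipalName
   SKU     = $Sku.AccountSkuId.Split(":")[1]
   Name    = $User.DisplayName
   Country = $User.UsageLocation
   City    = $User.City
   Dept    = $User.Department }
```

The extra columns then flow through to the CSV file without any other changes to the script.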
The intricacies of dealing with Office 365 licenses via PowerShell are explained in the Office 365 for IT Pros eBook. We might even add this script in a future update!
A couple of years ago, retrieving information about OneDrive for Business sites with PowerShell usually involved some gyrations. Then Microsoft updated the Get-SPOSite cmdlet with the IncludePersonalSite switch and things became easier. For instance, a reader asked if it was possible to generate a report listing all the OneDrive for Business sites in a tenant with the storage allocated and used for each site.
No problem, we thought, as we scanned the internet to see if people had already solved the problem. As it happens, several example scripts are available, but we ended up writing our own because it was possible to simplify the code. We also store the output in a CSV file as it’s a very flexible format for reporting or further analysis (like importing into Power BI).
You need to connect to SharePoint Online in a PowerShell session with an admin account. The connection process imports the SharePoint cmdlets from the module. Once a connection is made, you can retrieve the storage data. The basic steps are: fetch the set of OneDrive sites with Get-SPOSite, calculate the total storage used, build a report line for each site, and export the results to a CSV file.
Here’s the code:
# Get a list of OneDrive for Business sites in the tenant sorted by the biggest consumer of quota
Write-Host "Finding OneDrive sites..."
[array]$ODFBSites = Get-SPOSite -IncludePersonalSite $True -Limit All -Filter "Url -like '-my.sharepoint.com/personal/'" |
   Select Owner, Title, URL, StorageQuota, StorageUsageCurrent | Sort StorageUsageCurrent -Descending
If (!($ODFBSites)) { Write-Host "No OneDrive sites found (surprisingly...)" ; break }
# Calculate total used
$TotalODFBGBUsed = [Math]::Round(($ODFBSites.StorageUsageCurrent | Measure-Object -Sum).Sum/1024,2)
# Create list to store report data
$Report = [System.Collections.Generic.List[Object]]::new()
# Store information for each OneDrive site
ForEach ($Site in $ODFBSites) {
   $ReportLine = [PSCustomObject]@{
      Owner       = $Site.Title
      Email       = $Site.Owner
      URL         = $Site.URL
      QuotaGB     = [Math]::Round($Site.StorageQuota/1024,2)
      UsedGB      = [Math]::Round($Site.StorageUsageCurrent/1024,4)
      PercentUsed = [Math]::Round(($Site.StorageUsageCurrent/$Site.StorageQuota * 100),4) }
   $Report.Add($ReportLine) }
$Report | Export-CSV -NoTypeInformation c:\temp\OneDriveSiteConsumption.CSV
# You don't have to do this, but it's useful to view the data via Out-GridView
$Report | Sort UsedGB -Descending | Out-GridView
Write-Host "Current OneDrive for Business storage consumption is" $TotalODFBGBUsed "GB. Report is in C:\temp\OneDriveSiteConsumption.CSV"
Figure 1 shows an example of the CSV file generated by the script. Because the information is in a CSV file, you can sort and organize it in whatever way makes sense for you. Some organizations like to grab information like this and store it in a repository to track the growth in storage consumption over time.
The public health warning is that we’ve not tested the script on very large tenants. It might take some time to run in those conditions, in which case you could break up processing. For instance, you could filter for sites starting with each letter of the alphabet and then combine the results for each letter into a single file.
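One way to batch the work is to run the Get-SPOSite call once per letter and accumulate the results. This is a sketch that assumes the substring filter behaves the same way when a letter is appended to the personal site path; test the filter against your tenant's URL format before relying on it.

```powershell
# Sketch: fetch OneDrive sites in batches by the first letter of the account name
[array]$AllSites = @()
ForEach ($Letter in (97..122 | ForEach-Object {[char]$_})) {
   $Filter = "Url -like '-my.sharepoint.com/personal/$Letter'"
   $AllSites += Get-SPOSite -IncludePersonalSite $True -Limit All -Filter $Filter }
```

The rest of the script can then process $AllSites exactly as it processes $ODFBSites today.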
Need more information about managing OneDrive for Business? Because the same general approach can usually be taken for both SharePoint Online and OneDrive for Business, we cover that topic in the chapter that deals with SharePoint Online management in the Office 365 for IT Pros eBook.
]]>Updated 24 September 2023
One of the first PowerShell scripts created after the launch of Exchange 2007 was to report the quotas assigned to mailboxes and the amount of quota consumed by each mailbox. Scripts of this type are relatively simple and rely on the Get-ExoMailbox and Get-ExoMailboxStatistics cmdlets to provide data about quotas and usage.
Over time, many variations on this report have appeared. The variant shown here is a response to a request in a Facebook group about Office 365 for a script to identify mailboxes whose quota is nearly exhausted. In this case, the Office 365 tenant has many frontline accounts whose Exchange Online mailbox quota are 2 GB instead of the much more generous 100 GB assigned to enterprise accounts. It’s obviously much easier to fill a 2 GB quota, especially if you use Teams and share images in personal chats. The idea therefore is to scan for mailboxes whose usage exceeds a threshold of quota used (expressed as a percentage). Mailbox owners who fall into this category might need to remove some items (and empty the Deleted Items folder) or receive a larger quota.
Two things are notable. First, if you want to do comparisons with the information returned by the cmdlets, you should convert the returned values into numbers (byte count), which is what is done here. This is because the Get-ExoMailbox and Get-ExoMailboxStatistics cmdlets return values like:
Get-ExoMailboxStatistics -Identity Kim.Akers | Format-List TotalItemSize TotalItemSize : 3.853 GB (4,136,899,622 bytes)
It’s hard to do computations on these values, so some processing is needed to ensure that calculations proceed smoothly.
Second, the output is a CSV file sorted by mailbox display name. You could use the output in different ways. For instance, you could use the incoming webhook connector to post information about users whose mailboxes need some attention to Teams or Microsoft 365 Groups (here’s an example).
Here’s the script. As always, no claims are made that this is perfect PowerShell code. It’s entirely up to the reader to improve, enhance, or debug the script to match their needs. You can download the script from GitHub.
# Set threshold % of quota to use as warning level
$Threshold = 85
# Get all user mailboxes
Clear-Host
Write-Host "Finding mailboxes..."
[array]$Mbx = Get-ExoMailbox -ResultSize Unlimited -RecipientTypeDetails UserMailbox -Properties ProhibitSendReceiveQuota |
   Select-Object DisplayName, ProhibitSendReceiveQuota, DistinguishedName
$Report = [System.Collections.Generic.List[Object]]::new()
ForEach ($M in $Mbx) {
   # Find current usage
   Write-Host "Processing" $M.DisplayName
   $ErrorText = $Null
   $MbxStats = Get-ExoMailboxStatistics -Identity $M.DistinguishedName | Select ItemCount, TotalItemSize
   # Return byte count of quota used
   [INT64]$QuotaUsed = [convert]::ToInt64(((($MbxStats.TotalItemSize.ToString().Split("(")[-1]).Split(")")[0]).Split(" ")[0] -replace '[,]',''))
   # Byte count for mailbox quota
   [INT64]$MbxQuota = [convert]::ToInt64(((($M.ProhibitSendReceiveQuota.ToString().Split("(")[-1]).Split(")")[0]).Split(" ")[0] -replace '[,]',''))
   $MbxQuotaGB = [math]::Round(($MbxQuota/1GB),2)
   $QuotaPercentUsed = [math]::Round(($QuotaUsed/$MbxQuota)*100,2)
   $QuotaUsedGB = [math]::Round(($QuotaUsed/1GB),2)
   If ($QuotaPercentUsed -gt $Threshold) {
      Write-Host $M.DisplayName "current mailbox use is above threshold at" $QuotaPercentUsed -ForegroundColor Red
      $ErrorText = "Mailbox quota over threshold" }
   # Generate report line for the mailbox
   $ReportLine = [PSCustomObject]@{
      Mailbox          = $M.DisplayName
      MbxQuotaGB       = $MbxQuotaGB
      Items            = $MbxStats.ItemCount
      MbxSizeGB        = $QuotaUsedGB
      QuotaPercentUsed = $QuotaPercentUsed
      ErrorText        = $ErrorText }
   $Report.Add($ReportLine) }
# Export to CSV
$Report | Sort-Object Mailbox | Export-Csv -NoTypeInformation MailboxQuotaReport.csv
Figure 1 shows an example of the kind of report that the script generates.
The script is reasonably simple PowerShell code so there shouldn’t be much difficulty in tailoring it to fit the needs of a specific organization. The nice thing about PowerShell is that it’s easy to customize and easy to stitch bits of code from different scripts together to create a new solution. Hopefully you’ll be able to use the code presented here for your purposes.
Need more information about how to manage Exchange Online mailboxes? Look no further than the Office 365 for IT Pros eBook, which is filled with practical ideas, suggestions, and lots of PowerShell examples.
The topic of how best to find the URL of someone’s OneDrive for Business account arose in the context of Office 365 content searches. You need to know the URL of any SharePoint Online site or OneDrive for Business account before you can include it in the locations scanned by a content search (Figure 1), eDiscovery case, or Office 365 retention policy.
Finding the URL of a SharePoint site is straightforward, especially if the site is connected to an Office 365 Group (team).
We can find the URL with the SharePoint Online PowerShell module or the Exchange Online module. First, here’s SharePoint Online where we use the filter parameter with the Get-SPOSite cmdlet to find all sites containing “Ben” in the URL:
# Find SPO sites with Ben in the URL
Get-SPOSite -Filter "URL -like 'Ben'"

Url                                              Owner Storage Quota
---                                              ----- -------------
https://tenant.sharepoint.com/sites/benowensteam            26214400
The Get-UnifiedGroup cmdlet in the Exchange Online module can return details of any group-enabled site:
# Get SPO details from group
Get-UnifiedGroup -Identity "Ben Owens Team" | Format-List Share*

SharePointSiteUrl      : https://tenant.sharepoint.com/sites/benowensteam
SharePointDocumentsUrl : https://tenant.sharepoint.com/sites/benowensteam/Shared Documents
SharePointNotebookUrl  :
The OneDrive for Business Admin Center doesn’t list OneDrive accounts: neither does the SharePoint Admin Center. However, we can find the URLs as follows:
PowerShell is probably the easiest method because you can create a list of all OneDrive for Business accounts in the tenant and keep it for easy reference. After connecting to the SharePoint Online PowerShell module with an administrator account, run this command to generate a CSV file with all the links. Figure 3 shows an example of what the CSV file contains.
# Get list of OneDrive for Business accounts and export them to CSV file
Get-SPOSite -IncludePersonalSite $true -Limit All -Filter "Url -like '-my.sharepoint.com/personal/'" |
   Select Owner, URL | Sort Owner | Export-CSV c:\temp\OneDriveSites.csv -NoTypeInformation
Apart from being a useful reference, generating a list of OneDrive accounts also allows you to identify any accounts belonging to long-deleted accounts that should no longer be online (I found a couple from 2013).
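A rough way to flag such orphaned sites is to check whether each owner listed in the CSV file still resolves to an account. This is a sketch that assumes the Owner column holds user principal names and that the MSOnline module is connected.

```powershell
# Sketch: flag OneDrive sites whose owner no longer exists in the tenant
$Sites = Import-Csv c:\temp\OneDriveSites.csv
ForEach ($Site in $Sites) {
   $User = Get-MsolUser -UserPrincipalName $Site.Owner -ErrorAction SilentlyContinue
   If (!$User) { Write-Host "Possible orphaned OneDrive site:" $Site.URL } }
```

Any sites flagged this way deserve a manual check before removal, since owner values can lag behind account renames.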
Tracking down tips like this can be very time-consuming. Wouldn’t it be much better to be able to consult a comprehensive, always up-to-date manual? Something like the Office 365 for IT Pros eBook?
Recently, I posted a Petri.com article to report the availability of some new properties in the Export-MailboxDiagnosticLogs cmdlet. The properties record different kinds of mailbox activity, and I included a script to generate a report based on the properties. The output is a CSV file that can be opened in Excel or imported in Power BI. All is well.
I then had the idea that maybe it would be good to filter the output to find unused mailboxes and post that information to a Teams channel as a form of proactive notification to administrators. Not having endless time, I browsed the web to find a PowerShell script to serve as a starting point and found one that reports inactive Active Directory accounts. That’s not a long way from what I wanted to do, so I grabbed the code and edited it to fit my purpose.
Posting messages to a Teams channel is easily done using the incoming webhook connector, one of the standard connectors available to all Microsoft 365 tenants to bring information sourced from applications into Teams and Microsoft 365 Groups. When you configure the connector for a channel, you get a webhook (unique identifier) to post messages to the channel.
I then filtered the set of mailboxes I created in the table generated by the previous script (see link above) to find mailboxes with no email activity over the last 90 days. If a mailbox has no activity in three months, it’s a good indicator that it is an unused mailbox. I then generate the necessary JSON format payload consumed by Teams and post the resulting message reporting the unused mailboxes to the webhook. Here’s the script:
# Script uses some code from https://www.thelazyadministrator.com/2018/12/11/post-inactive-users-as-a-microsoft-teams-message-with-powershell/
$WebHook = "https://outlook.office.com/webhook/42f6d6b0-c191-496d-85b4-bfd6e63e230b@b662313f-14fc-43a2-9a7a-d2e27f4f3478/IncomingWebhook/62c92a65258a416b90e969980ae4ebb1/eff4cd58-1bb8-4899-94de-795f656b4a18"
$InactiveTable = New-Object 'System.Collections.Generic.List[System.Object]'
$PersonImage = "https://img.icons8.com/cotton/2x/gender-neutral-user--v1.png"
$Today = (Get-Date)
ForEach ($R in $Report) {
   $DaysSinceLastEmail = ((New-TimeSpan -Start $R.LastEmail -End $Today).Days)
   If ($DaysSinceLastEmail -gt 90) {
      $UserData = @{
         ActivityTitle    = "$($R.Mailbox)"
         ActivitySubTitle = "-----------------------------------------------"
         ActivityText     = "$($R.Mailbox)'s last email activity was on $($R.LastEmail)"
         ActivityImage    = $PersonImage
         Facts            = @(
            @{ name = 'Mailbox:'; value = $R.Mailbox },
            @{ name = 'Last Email activity:'; value = $R.LastEmail },
            @{ name = 'Days since last activity:'; value = $DaysSinceLastEmail },
            @{ name = 'Last active time (unreliable):'; value = $R.LastActive } ) }
      $InactiveTable.Add($UserData)
      Write-Host $R.Mailbox $R.LastEmail $DaysSinceLastEmail }
}
$Body = ConvertTo-Json -Depth 8 @{
   Title    = "Possibly Inactive Office 365 Users"
   Text     = "There are $($InactiveTable.Count) users with no detected email activity for 90 days or more"
   Sections = $InactiveTable }
# Only post if we have fewer than 25 items
If ($InactiveTable.Count -lt 25) {
   Write-Host "Posting inactive account information to Teams" -ForegroundColor Yellow
   Invoke-RestMethod -Uri $WebHook -Method Post -Body $Body -ContentType 'application/json' }
Else {
   Write-Host "Too many (" $InactiveTable.Count ") inactive accounts found to post to Teams... Spread the bad news another way" }
The message posted to Teams looks like the example shown in Figure 1:
There’s nothing earth shattering in the code and plenty of similar examples are available online (such is the joy of PowerShell). What is important to note when you post to Teams is that the message is limited to a maximum size of 25 KB. If your message exceeds the limit, Teams responds with a HTTP 413 error similar to:
Microsoft Teams endpoint returned HTTP error 413 with ContextId tcid=982653230009892365,server=DB3PEPF00000461,cv=4LrmcZGylkmtD0uEToTT2g.0.
In my case, it seemed like the error happened if more than 28 or so items were in the list of reported accounts. This will obviously vary depending on how much data you try to post for each item.
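Rather than relying on an item count, you could measure the size of the JSON payload directly before posting. A sketch, using the $Body and $WebHook variables from the script above:

```powershell
# Sketch: check the payload size against the ~25 KB connector limit before posting
$BodySizeKB = [Math]::Round([System.Text.Encoding]::UTF8.GetByteCount($Body)/1KB, 1)
If ($BodySizeKB -lt 25) {
   Invoke-RestMethod -Uri $WebHook -Method Post -Body $Body -ContentType 'application/json' }
Else {
   Write-Host "Payload is" $BodySizeKB "KB - too large for the incoming webhook connector" }
```

Leaving a little headroom below 25 KB is sensible because Teams adds its own overhead when rendering the card.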
For more information about using Teams and Microsoft 365 Groups with connectors, read the Office 365 for IT Pros eBook.
One of the sessions I attended at the recent TEC event covered the topic of attacks on Office 365. A recent session called “Attacking and Defending the Microsoft Cloud” given at the Black Hat USA 2019 conference explained how PowerShell can be used to fetch information from an Office 365 tenant, like a dump of your Azure Active Directory information.
I’m also well aware of how attackers use PowerShell to set forwarding addresses on mailboxes to redirect email. This is usually done to allow an attacker to understand the ebb and flow of messages into a target’s mailbox before launching a Business Email Compromise attack. Understanding what mailboxes have mail forwarding set is one of the reasons why tenant administrators should review the Forwarding report (Figure 1) in the Mail Flow section of the Security and Compliance Center weekly. Obviously anyone on the list should have a valid business reason to forward email outside the organization.
Much the same information can be retrieved with PowerShell. For example:
# Find mailboxes with mail forwarding address set
$Mbx = (Get-ExoMailbox -RecipientTypeDetails UserMailbox, SharedMailbox -ResultSize Unlimited -Properties ForwardingSmtpAddress)
$NumberForwards = 0
ForEach ($M in $Mbx) {
   If ($M.ForwardingSmtpAddress -ne $Null) { # Mailbox has a forwarding address
      $NumberForwards++
      Write-Host $M.DisplayName "is forwarding email to" $M.ForwardingSmtpAddress.Split(":")[1] }
}
If ($NumberForwards -gt 0) {
   Write-Host $NumberForwards "mailboxes found with forwarding addresses" }
In any case, discussing the various techniques used by attackers to exploit PowerShell isn’t the purpose of this post. Instead, I want to point out how useful it is to enable PowerShell logging on workstations. Following the introduction of PowerShell 5.0 in 2016, enabling logging was a popular recommendation in the security community (here’s an example post) with the idea that you can collect the logs generated by PowerShell activity into a repository like Splunk where they could then be queried as necessary.
It’s easy to enable PowerShell logging by updating settings in the local group policy editor (Figure 2).
Alternatively (and possibly more appropriately), because all we’re doing is manipulating system registry settings, we can do the same by running these PowerShell commands in an administrator session.
# Module Logging
$RegistryPath = "HKLM:\SOFTWARE\Wow6432Node\Policies\Microsoft\Windows\PowerShell\ModuleLogging"
$Name = "EnableModuleLogging"
$Value = "1"
If (!(Test-Path $RegistryPath)) { # Key doesn't exist, so create it
   New-Item -Path $RegistryPath -Force | Out-Null
   New-ItemProperty -Path $RegistryPath -Name $Name -Value $Value -PropertyType DWORD -Force | Out-Null }
Else {
   New-ItemProperty -Path $RegistryPath -Name $Name -Value $Value -PropertyType DWORD -Force | Out-Null }
$Name = "ModuleNames"
$Value = "*"
New-ItemProperty -Path $RegistryPath -Name $Name -Value $Value -PropertyType String -Force | Out-Null
# Script Block Logging
$RegistryPath = "HKLM:\SOFTWARE\Wow6432Node\Policies\Microsoft\Windows\PowerShell\ScriptBlockLogging"
$Name = "EnableScriptBlockLogging"
$Value = 1
If (!(Test-Path $RegistryPath)) { # Key doesn't exist, so create it
   New-Item -Path $RegistryPath -Force | Out-Null
   New-ItemProperty -Path $RegistryPath -Name $Name -Value $Value -PropertyType DWORD -Force | Out-Null }
Else {
   New-ItemProperty -Path $RegistryPath -Name $Name -Value $Value -PropertyType DWORD -Force | Out-Null }
# Transcription
$RegistryPath = "HKLM:\SOFTWARE\Wow6432Node\Policies\Microsoft\Windows\PowerShell\Transcription"
$Name = "EnableTranscripting"
$Value = "1"
If (!(Test-Path $RegistryPath)) { # Key doesn't exist, so create it
   New-Item -Path $RegistryPath -Force | Out-Null
   New-ItemProperty -Path $RegistryPath -Name $Name -Value $Value -PropertyType DWORD -Force | Out-Null }
Else {
   New-ItemProperty -Path $RegistryPath -Name $Name -Value $Value -PropertyType DWORD -Force | Out-Null }
$Name = "EnableInvocationHeader"
$Value = "1"
New-ItemProperty -Path $RegistryPath -Name $Name -Value $Value -PropertyType DWORD -Force | Out-Null
$Name = "OutputDirectory"
$Value = "C:\Temp\PSLogs"
New-ItemProperty -Path $RegistryPath -Name $Name -Value $Value -PropertyType String -Force | Out-Null
My favorite setting is the one that enables PowerShell transcripts. The PowerShell cmdlets Start-Transcript and Stop-Transcript are available to generate transcripts manually, but I like to have automatic transcripts because I don’t have to remember to start one each time I do some work in PowerShell.
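For the manual route, a transcript session looks like this (the output path is just an example; pick any folder you keep logs in):

```powershell
# Start capturing all input and output for the current session
Start-Transcript -Path "C:\Temp\PSLogs\ManualSession.txt" -Append
# ... run whatever commands you want recorded ...
Get-Date
# Stop recording and close the transcript file
Stop-Transcript
```

The automatic transcription policy removes the need to remember these two commands, which is exactly why it’s worth enabling.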
Transcripts last for the full duration of a session. In my case, because I have PowerShell open all the time, a session might last a week or more. A separate folder is used for transcripts that start each day, and transcripts are generated by user and system activity (for example, if you run the Windows printer troubleshooter, you’ll see a transcript that captures its work).
Transcripts capture details of everything that happens during a session (Figure 3). They are tremendously useful in terms of identifying what happened during a session and in tracking down why commands didn’t run successfully. As noted earlier, the logs can be copied to a repository and kept there for auditing and search purposes.
Given the value that you can gain from PowerShell logging, it’s something that you should consider enabling on every workstation you use to write code. Or indeed, every workstation in the organization. Hopefully you’ll never have to use the logs to track what an attacker does inside your Office 365 tenant, but at least if you have the logs, some chance exists that you might be able to discover exactly what they did.
Because PowerShell is a great way to automate common Office 365 processes or just get stuff done, we have tons of PowerShell examples in the Office 365 for IT Pros eBook. Some of them even work, which is nice…
A reader asked how easy it might be to find out what SharePoint Online sites are connected to Office 365 Groups and Teams and how the link works. Well, the easy answer is to say that you can get a list of group-enabled sites by running the Get-SPOSite cmdlet as follows:
# Fetch a list of SharePoint Online sites created using the Office 365 Groups template
Get-SPOSite -Template "GROUP#0" -IncludePersonalSite:$False -Limit All
However, the output is a simple list that’s not very interesting. The list also includes sites belonging to deleted groups that are being retained by Office 365 because of a retention policy. Things get a lot more interesting when you use the group identifier for a site to look at some group information. To have SharePoint return the group identifier, you must pass the Detailed parameter to Get-SPOSite. This slows down processing, but it means that you get a group identifier. The group identifier (a GUID) can be used with the Get-AzureADGroup, Get-UnifiedGroup, and Get-Team cmdlets (alas, all in separate PowerShell modules) to fetch different information about the group. You can also use the identifier with other cmdlets to retrieve group member information, and so on.
Equipped with the knowledge of how to get to group information, we can build a script to generate a report. Here’s a rough and ready script to do the job.
# Report Group-enabled SharePoint sites
# Finds all sites created with the Office 365 Groups template and reports them in a CSV file. Checks if a group exists for the site
# and if it is enabled for Microsoft Teams
Cls
Write-Host "Analyzing SharePoint sites created with the Office 365 Groups template..."
$Sites = Get-SPOSite -Template "GROUP#0" -IncludePersonalSite:$False -Limit All
$Report = [System.Collections.Generic.List[Object]]::new()
$ErrorSites = 0
$i = 0
Foreach ($Site in $Sites) {
   $i++
   $ProgressBar = "Processing site " + $Site.Title + " (" + $i + " of " + $Sites.Count + ")"
   Write-Progress -Activity "Checking SharePoint Sites" -Status $ProgressBar -PercentComplete ($i/$Sites.Count*100)
   $GroupId = (Get-SPOSite $Site.Url -Detailed).GroupId.Guid
   # Check if the Office 365 Group exists
   $ErrorText = $Null
   $O365Group = $Null
   $TeamsEnabled = $False
   $O365Group = (Get-UnifiedGroup -Identity $GroupId -ErrorAction SilentlyContinue)
   If ($O365Group -eq $Null) {
      $O365GroupName = "Unknown Office 365 Group"
      $ErrorText = "Failed to find Office 365 Group for site " + $Site.Title + " identifier " + $GroupId
      $O365GroupMembers = 0
      $ErrorSites++ }
   Else {
      $O365GroupName = $O365Group.DisplayName
      $O365GroupMembers = $O365Group.GroupMemberCount
      # Check if site has a team
      $TeamsEnabled = $True
      Try { Get-Team -GroupId $GroupId | Out-Null }
      Catch { $TeamsEnabled = $False } }
   # Generate a line for the report
   $ReportLine = [PSCustomObject]@{
      Site        = $Site.Title
      URL         = $Site.URL
      LastContent = $Site.LastContentModifiedDate
      O365Group   = $O365GroupName
      Members     = $O365GroupMembers
      TeamEnabled = $TeamsEnabled
      Error       = $ErrorText }
   $Report.Add($ReportLine) }
Write-Host "All done" $ErrorSites "sites reported errors"
$Report | Sort Site | Export-CSV c:\temp\Sites.csv -NoTypeInformation
As with all my scripts, there’s lots that could be done to improve the robustness of the code, improve error handling, and so on. Creating a production-ready script is not the point. What I want to illustrate is how to exploit the connection between SharePoint Online and other PowerShell modules.
The report is output as a CSV file. You can add extra columns to the file by adding properties to the table generated by the script, which I’ve limited to site title, URL, the last date for content, the name of the Office 365 group, count of group members, a True/False flag to show if the group is enabled for Teams, and some error text. The date for last content modified is unreliable as content can be modified by background activities, so it’s included as an example of using information returned by Get-SPOSite. Figure 1 shows an example of the CSV file.
The script highlights sites that have a group identifier that no longer points to a valid Office 365 group. These situations are likely to be caused by groups that have been removed but the site remains, probably because an Office 365 retention policy governs the site. You can try to remove the site in the SharePoint Online Admin Center: if you see an error, you know that a retention policy is blocking the deletion.
I look forward to hearing how people amend or expand the script to meet different circumstances. Please post a comment if you do.
For more information about working with SharePoint Online, Groups, and Teams through SharePoint, see the hundreds of code examples included in the Office 365 for IT Pros eBook. We’re bound to have something that will make you think!
The process of migrating Teams tenant management settings has been in progress since Microsoft announced the Teams Admin Center in April 2018. A lot has changed since then and the Teams Admin Center has matured greatly; now we see the final pieces of the puzzle appear with Teams app setup policies (to control the default apps available to users) and Teams app permission policies (to control the apps users are allowed to install and use, including during meetings).
If you’ve already blocked some third-party apps in the Teams settings in the Office 365 Admin Center, you’ll find that the settings are moved across into org-wide app settings in the App Permission Policies section of the Teams Admin Center (Figure 1).
Org-wide app settings (Figure 2) control if third-party or custom apps (app packages developed by your organization) can be installed. If you allow third-party apps to be installed, you can create a list of blocked third-party apps that will never be available to users.
App Permission Policies control the set of Microsoft, third-party, and custom apps available to end users. While org-wide settings apply to everyone in the tenant, app permission policies offer a finer degree of control down to the individual user level. Each policy allows access to its own set of apps (Figure 3). After you assign an app permission policy to a user, they can install any of the apps covered by the policy. An app permission policy can’t override a block set in the org-wide app settings.
A global app permission policy is created automatically within a tenant and applied to all accounts. If you want to allow access to different apps, you can customize the set of apps defined in the global app permission policy or create a new app permission policy and assign it to selected accounts. An app permission policy covers three types of app: Microsoft apps, third-party apps, and custom apps developed within the organization.
For each type of app, you can decide to allow or block individual apps or the complete set.
When you restrict the set of apps available in Teams, the Store filters the set of apps, bots, and connectors it displays to users and team owners. To assign a policy to a user, go to the Users section of the Teams Admin Center, select the user, and edit the policies section of their account to update the assigned app permission policy, which will be the Global (Org-wide default) policy unless it was previously changed for another policy. Due to caching, it can take up to a day before Teams clients respond to a change in the set of apps allowed to users or a change in the policy assigned to an account.
Editing individual accounts to update policies rapidly becomes a boring activity. The cmdlets to work with Teams App Permission Policies are in V2.0 of the Teams PowerShell module. PowerShell makes it easy to assign the same App Permission policy to a group of users, such as the members of a team. In the code snippet below, we find the members of a team and use the membership list to assign the policy to each member.
# Find members of the Human Resources Group and assign them the appropriate Teams App Permissions policy
$HRGroup = Get-Team -DisplayName "Human Resources Group"
$TeamUsers = Get-TeamUser -GroupId $HRGroup.GroupId -Role Member
$TeamUsers | ForEach-Object {
   Grant-CsTeamsAppPermissionPolicy -PolicyName "HR App Policy" -Identity $_.User }
For more information about managing all aspects of Teams, read the several hundred pages of coverage we give to Teams and Office 365 Groups in the Office 365 for IT Pros eBook. You won’t be disappointed.
On July 31, Microsoft announced the deprecation of the AADRM PowerShell module and its replacement by the AipService module. AADRM stands for “Azure Active Directory Rights Management” while AIP is the Azure Information Protection service. The two modules connect to the same back-end service to manage the configuration of the protection service, including the rights management templates that protect content inside and outside Office 365, such as the Office 365 sensitivity labels configured to protect documents and email with encryption. Protection templates can also be applied by Exchange Online mail flow rules to protect selected messages as they pass through the transport pipeline.
The set of cmdlets in the AzureInformationProtection module used to apply (or remove) protection to files (outside Office 365) are unaffected. These cmdlets are available when you install the Azure Information Protection client on a workstation. Either version of the client (classic or unified labeling) installs the cmdlets. In passing, a recent Microsoft blog post explains the current state of the transition to the unified labeling client.
One example of using these cmdlets is to decrypt protected documents found by a GDPR Data Subject Request (DSR) search. A DSR is a special form of Office 365 content search that returns all the information about an individual held within Office 365 repositories.
The deprecation takes effect in July 2020. Before then, you should review any scripts with calls to the AADRM cmdlets and replace them with the equivalents in the AIP module. Microsoft supports aliases for the AADRM cmdlets in the new module, but it’s best to replace the cmdlet names as you don’t know for how long Microsoft will continue support for the aliases. Fortunately, updating scripts is simple as it’s just a matter of replacing the cmdlet prefix with the new name. For example:
Old module: Get-AadrmSuperUser
New module: Get-AipServiceSuperUser
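To find scripts that still reference the old cmdlets, you can scan a folder of .ps1 files for the Aadrm prefix with Select-String (the folder path here is just an example):

```powershell
# Scan a folder of PowerShell scripts for references to the deprecated Aadrm cmdlets
Get-ChildItem -Path "C:\Scripts" -Recurse -Filter *.ps1 |
   Select-String -Pattern '-Aadrm' |
   Select-Object Path, LineNumber, Line
```

The output gives you the file, line number, and matching line for each call that needs the AipService replacement.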
Naturally, you should test scripts thoroughly after updating the cmdlets to make sure that they still work as expected.
A listing of the cmdlets in the AipService module is available online.
For more information about Azure Information Protection, including using PowerShell to manage the service and files, read Chapter 24 of the Office 365 for IT Pros eBook.
Office 365 Notification MC187289 posted on August 5 told us that Microsoft has started the roll-out of the SharePoint “site swap” feature described in Office 365 roadmap item 51259. The original plan for the roll-out called for a measured deployment (some would say “slow”) across different categories of tenants and is due to complete in October 2019. On September 6, Microsoft issued notification MC189866 with news that they had started to deploy to Office 365 tenants with fewer than 10,000 users. Larger tenants will still have to wait for a further update. Fortunately, my tenant was updated in the first part of the deployment and I could swap sites to my heart’s content.
The new feature uses the Invoke-SPOSiteSwap PowerShell cmdlet (part of the SharePoint Online PowerShell module from version 16.0.8812.1200 on). The latest version is 16.0.9119.1200, but I used the cmdlet with version 16.0.9021.1201. The cmdlet swaps an entire site collection.
I’ve been using SharePoint Online since 2011 and my root site was a very old page (Figure 1) that I put together years ago when sites could still be accessed by external users. I haven’t paid any attention to the page for a long time.
To replace the root page, I created a new SharePoint communications site and made some minor changes to it. I then ran the Invoke-SPOSiteSwap cmdlet to swap the new communications site to become the root site using this command:
# Swap a SharePoint site
Invoke-SPOSiteSwap -SourceURL https://office365itpros.sharepoint.com/sites/NewMarketingComms -TargetURL https://office365itpros.sharepoint.com -ArchiveURL https://office365itpros.sharepoint.com/sites/OldMarketingComms
Invoke-SPOSiteSwap starts off a background job to move things around. In this case, it took the old root site (https://office365itpros.sharepoint.com) and moved it to an archived site (https://office365itpros.sharepoint.com/sites/OldMarketingComms) and replaced the root site with the new communications site that I had updated (https://office365itpros.sharepoint.com/sites/NewMarketingComms). After a few minutes (you’ll see a 404 error while the moving around happens), the new root site was available (Figure 2). It was all very easy.
Office 365 captures audit records when you run Invoke-SPOSiteSwap to start the background job (SiteSwapScheduled) and when the job completes (SiteSwapped). These records are visible through the Audit log search in the Security and Compliance Center. They can also be found with the Search-UnifiedAuditLog cmdlet using a command like:
# Find records for SharePoint site swaps
Search-UnifiedAuditLog -Operations SiteSwapped, SiteSwapScheduled -StartDate 7-Aug-2019 -EndDate 8-Aug-2019
Some restrictions exist. The source or target sites can’t be associated with an Office 365 Group (team) or a hub site. The target site can only be the root site or the search center. There are several other notes to read up on in the documentation. Basically, this is a focused cmdlet that does what it says: it invokes a job to swap the location of a site with another site while archiving the original site.
Read more about managing SharePoint Online in the Office 365 for IT Pros eBook, including many other PowerShell examples.
The question arose about the best way to set auto-reply for a shared mailbox to inform external senders that the company is on holiday (public or otherwise). Some suggested using Flow for the job. I, of course, thought of PowerShell. I’m not against Flow: I simply think that PowerShell offers more control and flexibility, especially when multiple shared mailboxes are involved. For instance, you might want to set appropriate auto-reply messages up for all the shared mailboxes in an organization, especially if those mailboxes are used for customer interaction.
Auto-replies, or OOF (Out of Facility) notifications as they are known in the trade, go back to the dawn of email (before Exchange 4.0). Even Teams supports out of office notifications. For Exchange (on-premises and online), it’s easy to manage auto-replies with PowerShell using the Set-MailboxAutoReplyConfiguration cmdlet. The Get-MailboxAutoReplyConfiguration cmdlet reports the current auto-reply state of a mailbox. You can have separate auto-reply messages for internal (any mail-enabled object within the organization) and external senders (anyone else).
The example solution uses a quick and dirty script to find all shared mailboxes in the tenant and set two auto-replies on each mailbox. One (brief) for internal correspondents; the other (less terse and nicer) for external people. Two variables are declared to set the start and end time for the scheduled auto-reply. If you specify a time, remember that Exchange Online runs on UTC so any time you set is in UTC. In other words, you should convert your local time to UTC when you set up the auto-reply. Rather bizarrely, Get-MailboxAutoReplyConfiguration converts the UTC time to local (workstation) time when it reports an auto-reply configuration.
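Converting a local time to the UTC value Exchange Online expects is a one-liner with the DateTime type. A quick sketch (the date shown matches the example schedule below):

```powershell
# Convert a local start time to the UTC value used in the auto-reply schedule
$LocalStart = Get-Date "04-Aug-2019 17:00"
$UtcStart = $LocalStart.ToUniversalTime()
Write-Host "Local time" $LocalStart "is" $UtcStart "UTC"
```

Doing the conversion explicitly avoids the confusion caused by Get-MailboxAutoReplyConfiguration reporting the schedule back in workstation local time.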
# These times are in UTC
$HolidayStart = "04-Aug-2019 17:00"
$HolidayEnd = "6-Aug-2019 09:00"
$InternalMessage = "Expect delays in answering messages to this mailbox due to the holiday between <b>" + $HolidayStart + "</b> and <b>" + $HolidayEnd + "</b>"
$ExternalMessage = "Thank you for your email. Your communication is important to us, but please be aware that some delay will occur in answering messages to this mailbox due to the public holiday between <b>" + $HolidayStart + "</b> and <b>" + $HolidayEnd + "</b>"
[array]$Mbx = (Get-ExoMailbox -RecipientTypeDetails SharedMailbox | Select DisplayName, Alias, DistinguishedName)
ForEach ($M in $Mbx) { # Set auto-reply
   Write-Host "Setting auto-reply for shared mailbox:" $M.DisplayName
   Set-MailboxAutoReplyConfiguration -Identity $M.DistinguishedName -StartTime $HolidayStart -AutoReplyState "Scheduled" -EndTime $HolidayEnd -InternalMessage $InternalMessage -ExternalMessage $ExternalMessage -ExternalAudience 'All' -CreateOOFEvent:$True }
The code above uses the Get-ExoMailbox cmdlet from the Exchange Online management module, which is what you should use in Exchange Online. However, the Get-Mailbox cmdlet will work, and it’s what you use for Exchange on-premises.
Figure 1 shows the result when an external person sends an email to a shared mailbox. You can be as creative as you like with the text when you set the auto-reply on the mailbox. Because Exchange stores the auto-reply message in HTML format, most basic HTML formatting commands work when you set auto-reply for a shared mailbox. I only use bolded text in this example, but you could also include something like a mailto: link to tell people who they should contact if someone is out of the office and unavailable.
The scheduled auto-reply lapses when the end time arrives. If you want to remove the auto-replies from all shared mailboxes, run the command:
# We assume that all shared mailboxes are in $Mbx
ForEach ($M in $Mbx) {
   Set-MailboxAutoReplyConfiguration -Identity $M.DistinguishedName -AutoReplyState "Disabled" }
For more information about working with shared mailboxes, see the Exchange Online chapter in the Office 365 for IT Pros eBook. There’s over a thousand PowerShell examples in the book, including lots of examples of using PowerShell to work with the Microsoft Graph.
Last year, I wrote about how to use events recorded in the Office 365 audit log to find out who deleted a message from an Exchange Online mailbox. Time marches on and we can make some improvements to the script.
This version also uses the techniques explained in Chapter 21 of the Office 365 for IT Pros eBook to fetch audit records with PowerShell and unpack the JSON-format information included in the records to retrieve information of interest. The major changes are:
# Look for hard delete and soft delete records
$Records = (Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-30) -EndDate (Get-Date).AddDays(1) -Operations "HardDelete", "SoftDelete" -ResultSize 5000)
If ($Records.Count -eq 0) {
   Write-Host "No message delete records found." }
Else {
   Write-Host "Processing" $Records.Count "audit records..."
   $Report = @()
   ForEach ($Rec in $Records) {
      $AuditData = ConvertFrom-Json $Rec.AuditData
      If ($AuditData.Folder.Path -ne $Null) {
         $Folder = $AuditData.Folder.Path.Split("\")[1] }
      Else { $Folder = "Unknown" }
      If ($AuditData.LogonType -eq 1) { # Admin deleted the message
         $Mbx = Get-Mailbox -Identity $AuditData.MailboxGuid -ErrorAction SilentlyContinue
         $Msg = "No message identifier"
         $Mailbox = $Mbx.UserPrincipalName }
      Else { # User deleted the message
         $Msg = $AuditData.AffectedItems.InternetMessageId
         $Mailbox = $AuditData.MailboxOwnerUPN }
      $ReportLine = [PSCustomObject]@{
         TimeStamp = Get-Date($AuditData.CreationTime) -format g
         User      = $AuditData.UserId
         Action    = $AuditData.Operation
         Status    = $AuditData.ResultStatus
         Mailbox   = $Mailbox
         Items     = $AuditData.AffectedItems.Subject
         Folder    = $Folder
         MsgId     = $Msg }
      $Report += $ReportLine }
}
$Report | Out-GridView
Remember that control over the capture of audit records for message deletions depends on the audit configuration applied to Exchange Online mailboxes. If the configuration doesn’t include hard and soft deletions, you won’t see events turn up in the Office 365 audit log. In most cases, the audit configuration only captures message deletions by delegates who access shared mailboxes.
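To check whether a mailbox’s audit configuration captures deletions, and to add the delete operations if they’re missing, you can use Get-Mailbox and Set-Mailbox (the mailbox name here is just an example):

```powershell
# Check whether auditing is enabled and which delegate actions are audited
Get-Mailbox -Identity "Customer Services" | Select-Object AuditEnabled, AuditDelegate
# Add soft and hard deletes to the set of audited delegate actions
Set-Mailbox -Identity "Customer Services" -AuditEnabled $True -AuditDelegate @{Add="SoftDelete","HardDelete"}
```

The same @{Add=...} syntax works for the AuditOwner and AuditAdmin parameters if you want to capture deletions performed by the mailbox owner or by administrators.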
The reason to include the internet message identifier in the set of properties returned for deleted messages is that you might have a situation where multiple messages have the same subject and you can’t identify who deleted what copy of the message. The problem can be solved if the mailbox where the messages were stored is on hold. Exchange will keep a copy of the deleted message in the Recoverable Items folder. That copy is discoverable, so we can run a content search to find the message and then download it to check its properties, including the internet message identifier. It would be easier if Microsoft included details of the original message sender in the properties captured in the audit record, but that’s unlikely in the near future.
Update: This article describes a script using Graph APIs to generate a report showing the MFA status for accounts and highlights administrative accounts that aren’t MFA-enabled. Given the deprecation of the MSOL module, you should switch to the Graph version.
If, like me, you were impressed by the case laid out in the July 10 2019 blog entitled Your Pa$$word doesn’t matter by Alex Weinert (Microsoft), you might wonder how to take his advice to “turn on MFA” for accounts. The process can take some time and user education because you can’t really enable MFA for “average users” if you don’t prepare them to deal with the resulting challenges, roll out the Microsoft Authenticator app, and so on. And then there’s the ongoing need to find unprotected Azure AD accounts to coach their owners about the wonders of MFA.
One immediate step you can take is to clamp down on accounts holding one or more Azure Active Directory administrative roles that are not MFA-enabled. Microsoft has an Azure Active Directory usage and insights report about authentication methods to inform tenants about the accounts that are/are not enabled for MFA and self-service password reset (Figure 1), but it doesn’t highlight accounts holding administrative roles.
We discussed how to create a report of Azure AD accounts and their MFA status in a previous post and we can build on the techniques explored there to construct a PowerShell script to report accounts holding an administrative role that need to be protected with MFA. You can grab a copy of the script from GitHub.
The script is imperfect and could do with improvement in terms of optimization and error handling, but it works. Here’s what it does.
Azure Active Directory defines directory roles which can be assigned to accounts to allow those accounts to perform specific tasks. In this case, we’re interested in some of the more highly-permissioned roles like Exchange Admin, so we use the Get-AzureADDirectoryRole cmdlet to grab the GUIDs identifying these roles and put them in variables. We then call the Get-AzureADDirectoryRoleMember cmdlet to populate another set of variables with details of the accounts that hold each role.
Write-Host "Finding Azure Active Directory administrative roles..."
$UserAccountAdmin = Get-AzureADDirectoryRole | Where-Object {$_.DisplayName -eq 'User Account Administrator'} | Select ObjectId
$TenantAdmin = Get-AzureADDirectoryRole | Where-Object {$_.DisplayName -eq 'Global Administrator'} | Select ObjectId
$TeamsAdmin = Get-AzureADDirectoryRole | Where-Object {$_.DisplayName -eq 'Teams Service Administrator'} | Select ObjectId
$ExchangeAdmin = Get-AzureADDirectoryRole | Where-Object {$_.DisplayName -eq 'Exchange Service Administrator'} | Select ObjectId
$SharePointAdmin = Get-AzureADDirectoryRole | Where-Object {$_.DisplayName -eq 'Sharepoint Service Administrator'} | Select ObjectId
# Find out the set of accounts that hold these admin roles in the tenant
$UserAccountAdmins = Get-AzureADDirectoryRoleMember -ObjectId $UserAccountAdmin.ObjectId | Select ObjectId, UserPrincipalName
$TenantAdmins = Get-AzureADDirectoryRoleMember -ObjectId $TenantAdmin.ObjectId | Select ObjectId, UserPrincipalName
$TeamsAdmins = Get-AzureADDirectoryRoleMember -ObjectId $TeamsAdmin.ObjectId | Select ObjectId, UserPrincipalName
$ExchangeAdmins = Get-AzureADDirectoryRoleMember -ObjectId $ExchangeAdmin.ObjectId | Select ObjectId, UserPrincipalName
$SharePointAdmins = Get-AzureADDirectoryRoleMember -ObjectId $SharePointAdmin.ObjectId | Select ObjectId, UserPrincipalName
The script then calls the Get-MsolUser cmdlet to create a collection of Azure Active Directory licensed accounts (yes, there’s an odd mix of the Azure AD V1 and V2 cmdlets in the script; that’s because I can’t work out how to get MFA information using the V2 cmdlets). Using the MFA report code described here, each account is checked to see if it is MFA-enabled. We then create an array of accounts which are not MFA-enabled. These accounts are checked to see if they hold one of the administrative roles we’re interested in. If an account holds one or more of those roles, we capture its details.
# Extract users whose accounts don't have MFA
$MFAUsers = $MFAReport | ? {$_.MFAUsed -ne "Enforced"}
If (!($MFAUsers)) {
   Write-Host "No privileged accounts found without MFA protection" ; break }
Write-Host "Checking MFA status for accounts holding admin roles..."
$i = 0
$Report = [System.Collections.Generic.List[Object]]::new() # Create output list
# Check admin roles if MFA not enabled
ForEach ($User in $MFAUsers) {
   $Roles = $Null
   If ($UserAccountAdmins.ObjectId -Contains $User.ObjectId) {
      Write-Host $User.DisplayName "Account holds the User Account Admin role" -ForegroundColor Red
      $Roles = "Account Admin" }
   If ($TenantAdmins.ObjectId -Contains $User.ObjectId) {
      Write-Host $User.DisplayName "Account holds the Tenant Admin role" -ForegroundColor Red
      If ($Roles -eq $Null) { $Roles = "Tenant Admin" } Else { $Roles = $Roles + "; Tenant Admin" } }
   If ($TeamsAdmins.ObjectId -Contains $User.ObjectId) {
      Write-Host $User.DisplayName "Account holds the Teams Admin role" -ForegroundColor Red
      If ($Roles -eq $Null) { $Roles = "Teams Admin" } Else { $Roles = $Roles + "; Teams Admin" } }
   If ($ExchangeAdmins.ObjectId -Contains $User.ObjectId) {
      Write-Host $User.DisplayName "Account holds the Exchange Admin role" -ForegroundColor Red
      If ($Roles -eq $Null) { $Roles = "Exchange Admin" } Else { $Roles = $Roles + "; Exchange Admin" } }
   If ($SharePointAdmins.ObjectId -Contains $User.ObjectId) {
      Write-Host $User.DisplayName "Account holds the SharePoint Admin role" -ForegroundColor Red
      If ($Roles -eq $Null) { $Roles = "SharePoint Admin" } Else { $Roles = $Roles + "; SharePoint Admin" } }
   If ($Roles -ne $Null) {
      Write-Host "User" $User.DisplayName "is assigned the following roles:" $Roles -ForegroundColor Yellow
      $i++
      $ReportLine = [PSCustomObject]@{
         User  = $User.DisplayName
         UPN   = $User.UserPrincipalName
         Roles = $Roles
         MFA   = $User.MFAUsed }
      $Report.Add($ReportLine) } # End if
}
As the code runs, it generates information about accounts which are not MFA-enabled but hold administrative roles (Figure 2). Apart from anything else, this is a good way to see what accounts hold administrative roles and ask whether they need to hold those roles.
Finally, a CSV file is generated with details of accounts holding Azure AD administrative roles which are not MFA-enabled and exported to a CSV file. Figure 3 shows details of what the file contains as viewed through the Out-GridView cmdlet. It’s easy to pick out the accounts whose security needs to be improved.
As always, we’re happy to hear about other approaches to the problem. Please post your ideas as a comment to this post.
Need more solutions to common Office 365 Admin problems? The Office 365 for IT Pros eBook is packed full of ideas…
]]>We all know that Office 365 is in a state of perpetual change, but you’d imagine that a component that has worked perfectly well since its introduction in the Exchange 2013 on-premises server would remain stable. Alas, that’s not true and someone has broken PowerShell command logging for the Exchange Admin Center (EAC).
Exchange 2007 was the first major Microsoft server to support PowerShell. Because PowerShell was so new, the developers took a risk, both in choosing it as the basis for all Exchange management interfaces and in how customers would take to the new shell. To bridge the knowledge gap, Microsoft introduced the command logging feature in the Exchange Management Console (EMC). Command logging reported the PowerShell commands executed by EMC options. The value of the command log was obvious: administrators could learn PowerShell by reviewing the commands run by Exchange to get work done. Administrators could also copy and reuse the commands in their own scripts. Command logging proved to be a tremendously valuable learning tool to kickstart PowerShell for Exchange.
Over the years, the Exchange administration centers evolved. The browser-based EAC (in the guise of the ECP) arrived in Exchange 2010 and became the prime administrative interface in Exchange 2013. Aside from a little hiccup in Exchange 2013 (fixed in SP1), EAC included command logging in both on-premises and cloud variants and the logging feature continued its popularity within the Exchange administrator community. In the context of Exchange Online, command logging helped administrators understand the differences in the set of PowerShell cmdlets available in the cloud and how these cmdlets are used.
Something changed recently. I don’t know when because I seldom use command logging anymore: after 13 years or so working with PowerShell, I have now reached the “dangerous” stage. Some would say “dangerously inept,” but that’s not important right now. I only look at the command log when I need an example of syntax for a cmdlet that I am unfamiliar with.
In any case, the option to view the command log is still available in EAC. Go to the ? (question mark) menu and you’ll see Show Command Logging (Figure 1).
Clicking the link displays the command log. A perfectly empty command log, bereft of any useful information (Figure 2). To add insult to injury, the Learn more link is perfectly useless too.
Update (July 11): Microsoft says that they have found and fixed the problem that caused command logging to fail. The fix is now rolling out across Office 365.
To be clear, the command logging in EAC is not the same as the audit records captured in the Office 365 Audit log for administrative operations performed against Exchange Online. EAC command logging shows you the exact PowerShell commands (syntax, parameters, and values) used by Exchange to get work done. That’s why the command log is so helpful to understand and learn PowerShell. Audit logging captures records of activities in a normalized format across all Office 365 workloads. You will know that a PowerShell cmdlet was run, but will find it hard to cut and paste the cmdlet and its parameters for reuse. Also, it’s much harder to associate what you see in the audit log with what you just did in EAC because records only show up in the audit log 15 minutes or so after an action is performed. EAC command logging shows you what you just did.
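To see the difference for yourself, you can fetch the audit records for Exchange admin actions with the Search-UnifiedAuditLog cmdlet. A rough sketch follows; the parameters are buried in a JSON payload rather than presented as a ready-to-reuse command, and (as noted above) records only arrive some minutes after the action:

```powershell
# Find recent Exchange admin audit records (e.g., Set-Mailbox runs)
$Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) `
   -RecordType ExchangeAdmin -Operations "Set-Mailbox" -ResultSize 1000
# The cmdlet parameters are inside the AuditData JSON payload
$Records | ForEach-Object { ($_.AuditData | ConvertFrom-Json).Parameters }
```

Compare that output to the command log, which shows the exact command as Exchange ran it.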
In the overall scheme of Office 365, this is a small bug. But it’s hard to understand how something that has worked so long has suddenly experienced a problem. Obviously one of the moving parts in the landscape of Exchange Online or Office 365 caused commands to fail to show up in the log. Let’s hope that Microsoft fixes the problem soon to restore this valuable learning tool.
Need more information about Exchange Online and the rest of Office 365? Look no further than the Office 365 for IT Pros eBook. The command logging feature is so old that we don’t cover it in the book, but we do cover almost everything else.
]]>Updated 17 April 2023
Sometimes Microsoft doesn’t communicate changes made to PowerShell cmdlets that introduce interesting new functionality. There’s so much change in the service that they could be forgiven for an occasional slip-up, unless of course you need to use the specific feature that is undocumented.
Which brings me to the well-known Set-Mailbox cmdlet, which boasts two parameters called ExcludeFromOrgHolds and ExcludeFromAllOrgHolds, a fact highlighted by MVP Vasil Michev in his ongoing crusade to discover what’s hidden in the corners of Office 365.
These parameters allow administrators to exclude some or all org-wide retention holds from inactive mailboxes. Remember that an inactive mailbox is one belonging to a deleted Azure AD account that is kept because a hold exists on the mailbox. The hold can be any form of hold supported by Exchange Online, including litigation holds and those set by Office 365 retention policies. Retention holds come in two flavors: org-wide (holds that apply to all mailboxes) and non-org-wide (holds that apply to selected mailboxes).
Excluding an org-wide hold means that when Exchange evaluates whether to keep an inactive mailbox, it ignores that hold. If all org-wide holds are ignored, the inactive mailbox is only kept if a specific non-org-wide hold exists.
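To see which inactive mailboxes exist in a tenant and what holds keep them around, something like this works (a sketch using the Exchange Online management module):

```powershell
# List inactive mailboxes together with the holds that keep them around
Get-ExoMailbox -InactiveMailboxOnly -ResultSize Unlimited -Properties InPlaceHolds, LitigationHoldEnabled |
   Format-Table DisplayName, LitigationHoldEnabled, InPlaceHolds
```

An inactive mailbox with an empty InPlaceHolds property is kept by org-wide holds alone, so it is exactly the kind of mailbox affected by the exclusions discussed here.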
Why do these parameters exist? Well, Microsoft introduced inactive mailboxes several years ago as a way for organizations to keep mailboxes around for compliance purposes without having to pay for Office 365 licenses. The most common use case is when mailboxes are kept for ex-employees. The idea is that a tenant will apply a hold to keep the mailboxes inactive for the desired period and then release the hold when the mailboxes are no longer needed.
Org-wide holds apply to both active and inactive mailboxes. Over time, it’s possible that a tenant will add new org-wide holds. The effect is that the set of inactive mailboxes is likely to grow because any mailbox that is deleted will become inactive because one or more org-wide holds exist.
Keeping inactive mailboxes is good if intended. It’s not so good if you don’t want or need those mailboxes. One of the principles of data governance in Office 365 is that tenants should be able to decide what data to keep and what to remove, and keeping inactive mailboxes longer than necessary goes against that principle. I imagine that Microsoft introduced these parameters to give tenants the ability to decide which org-wide holds should apply to inactive mailboxes.
Org-wide holds are registered in the Exchange Online organization configuration. To see the set, run the PowerShell command:
# Retrieve org-wide holds from the Exchange Online organization configuration
Get-OrganizationConfig | Select-Object -ExpandProperty InPlaceHolds

mbx15382841af9f497c83f9efe73e51888d:1
mbx9696959111f74ecda8a40aef97edd2c2:1
mbx703105e3b8804a1093bb5cb777638ea8:1
grp703105e3b8804a1093bb5cb777638ea8:1
mbxc1e2d6f1785d4bf8a7746a26e58e5f66:1
grpc1e2d6f1785d4bf8a7746a26e58e5f66:1
mbxf6a1654abdba4712a43c354e28a4d56c:2
grpf6a1654abdba4712a43c354e28a4d56c:2
The holds we’re interested in start with mbx. Those starting with grp apply to Office 365 Groups. The values following are GUIDs that point to the retention policies defining the holds. If you’re interested in understanding how to resolve the GUID to find the retention policy, see the Compliance chapter in the Office 365 for IT Pros eBook.
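As a rough sketch of the resolution technique (the book covers the detail), you can strip the prefix and suffix from a hold entry to recover the GUID of the retention policy, then look the policy up with Get-RetentionCompliancePolicy. This assumes a connection to the compliance endpoint (Connect-IPPSSession):

```powershell
# Turn a hold entry like "mbx9696959111f74ecda8a40aef97edd2c2:1" into a policy GUID
$Hold = "mbx9696959111f74ecda8a40aef97edd2c2:1"
$Guid = [guid]($Hold.Substring(3).Split(":")[0])   # [guid] accepts the dashless form
# Look up the retention policy behind the hold (needs Connect-IPPSSession first)
Get-RetentionCompliancePolicy -Identity $Guid.Guid | Format-Table Name, Enabled
```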
To exclude specific org-wide holds from a mailbox, run the Set-Mailbox cmdlet and pass the GUIDs for the holds you want to exclude in a comma-separated list for the ExcludeFromOrgHolds parameter. Use the same format for the GUIDs as reported by Get-OrganizationConfig. When you run the command, Exchange updates the InPlaceHolds property for the mailbox to note the excluded holds.
# Exclude specific org-wide holds from a mailbox
Set-Mailbox -Identity Kim.Akers -ExcludeFromOrgHolds "mbx9696959111f74ecda8a40aef97edd2c2:1", "mbx19200b9af08442529be070dae2fd54d3:1"
Microsoft recommends that you use the distinguished name or ExchangeGUID property to identify the mailbox. This is to be absolutely sure that a unique value is passed because if you exclude the holds for the wrong inactive mailboxes, you run the risk that Exchange will remove these mailboxes permanently when it evaluates the holds that exist on them.
To remove all org-wide holds from a mailbox, run Set-Mailbox and pass the ExcludeFromAllOrgHolds parameter. Because you’re now removing all org-wide holds, it’s even more important to be certain that you’re processing the right mailboxes.
# Exclude all org-wide holds from the target mailbox
Set-Mailbox -Identity $Mbx.DistinguishedName -ExcludeFromAllOrgHolds
I wrote a script to exclude all org-wide holds from the inactive mailboxes in my tenant. Here’s the relevant code to retrieve org-wide holds from the Exchange Online configuration and exclude inactive mailboxes from the mailbox holds. Figure 1 shows the script running.
# Find the org-wide holds and exclude every inactive mailbox from them
[array]$InPlaceHolds = Get-OrganizationConfig | Select-Object -ExpandProperty InPlaceHolds
$InPlaceHoldsMbx = $InPlaceHolds | Where-Object {$_ -like "*mbx*"}
[array]$InactiveMbx = Get-ExoMailbox -InactiveMailboxOnly -ResultSize Unlimited | Select-Object -ExpandProperty Alias
ForEach ($Mbx in $InactiveMbx) {
   Write-Host ("Excluding inactive mailbox {0} from org-wide holds" -f $Mbx)
   Set-Mailbox -Identity $Mbx -ExcludeFromOrgHolds $InPlaceHoldsMbx
}
As soon as Set-Mailbox processes a mailbox, Exchange evaluates the holds to decide whether to keep the mailbox. After the script finished, the number of inactive mailboxes in my tenant dropped from 39 to 17, which proves that you need to be ultra-careful when you exclude any org-wide hold from an inactive mailbox.
For more information about managing Exchange Online mailboxes, read Chapter 6 in the Office 365 for IT Pros eBook to discover even more valuable tips and techniques.
]]>A contribution to the Microsoft Technical Community offered a solution for how to add all the Teams in an Office 365 tenant to the Office 365 Groups expiration policy. Once teams are added to the policy, the groups behind them expire at the end of the policy lifetime (say, 750 days) and must be renewed by a team owner. If not, the group is soft-deleted for 30 days, during which time it can be recovered. At the end of that period, Office 365 permanently removes the group, the team, and all the associated resources.
In any case, the script worked on the basis of finding all the Office 365 Groups in the tenant and then figuring out which groups are team-enabled before adding those groups to the policy. It’s a valid approach, but a better method is to use the Get-Team cmdlet in the Teams PowerShell module because it only returns the set of teams and you don’t have to fiddle around checking what groups are team-enabled.
Once you have the set of teams, it’s easy to add them to the expiration policy using the Add-AzureADMSLifecyclePolicyGroup cmdlet.
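For a single team, the pattern looks like this (a sketch that assumes the tenant has just one lifecycle policy; the team name is illustrative):

```powershell
# Fetch the expiration policy and add one team to it
$PolicyId = (Get-AzureADMSGroupLifecyclePolicy).Id
$Team = Get-Team -DisplayName "Project X"    # illustrative team name
Add-AzureADMSLifecyclePolicyGroup -Id $PolicyId -GroupId $Team.GroupId
```

The script below wraps this pattern in a loop over all teams and adds a duplicate check.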
A script showing how to add multiple groups to the expiration policy is included in the chapter covering how to manage Groups and Teams in the Office 365 for IT Pros eBook. It was simple to take that script and amend it to process Teams rather than Groups. We can improve the solution further by implementing a check for teams already covered by the policy so we don’t trigger an error when running Add-AzureADMSLifecyclePolicyGroup.
One way to do this is to use one of the fifteen custom attributes available for all mail-enabled objects. In the example, we write the GUID of the policy into CustomAttribute3, meaning that we can check for its existence before trying to add a team to the policy. Here’s the code:
# Add all Teams that aren't already covered by the Groups expiration policy to the policy
$PolicyId = (Get-AzureADMSGroupLifecyclePolicy).Id
$TeamsCount = 0
Write-Host "Fetching list of Teams in the tenant…"
$Teams = Get-Team
ForEach ($Team in $Teams) {
   $CheckPolicy = (Get-UnifiedGroup -Identity $Team.GroupId).CustomAttribute3
   If ($CheckPolicy -eq $PolicyId) {
      Write-Host "Team" $Team.DisplayName "is already covered by the expiration policy" }
   Else {
      Write-Host "Adding team" $Team.DisplayName "to group expiration policy"
      Add-AzureADMSLifecyclePolicyGroup -GroupId $Team.GroupId -Id $PolicyId -ErrorAction SilentlyContinue
      Set-UnifiedGroup -Identity $Team.GroupId -CustomAttribute3 $PolicyId
      $TeamsCount++ }
}
Write-Host "All done." $TeamsCount "teams added to policy"
Another advantage of using a custom attribute is that many cmdlets support server-side filtering against these attributes. For instance, to find the set of teams that have been added to the expiration policy, we can run the command:
Get-UnifiedGroup -Filter {CustomAttribute3 -ne $null} | Format-Table DisplayName, CustomAttribute3
After I wrote the original post, Microsoft updated the Groups expiration policy to be activity based. A side effect of the change was that the Get-UnifiedGroup cmdlet returns the calculated expiration date for groups covered by the policy, meaning that you could use this instead of a custom attribute to figure out what groups are covered by the policy. Thus, we can now base the script on the expiration date as shown below.
$PolicyId = (Get-AzureADMSGroupLifecyclePolicy).Id
Write-Host "Fetching list of Teams in the tenant…"
[array]$Teams = Get-Team
$TeamsCount = 0
ForEach ($Team in $Teams) {
   $CheckPolicy = $Null
   $CheckPolicy = (Get-UnifiedGroup -Identity $Team.GroupId).ExpirationTime
   If ($CheckPolicy -ne $Null) {
      Write-Host "Team" $Team.DisplayName "covered by expiration policy and will expire on" (Get-Date($CheckPolicy) -format g) }
   Else {
      Write-Host "Adding team" $Team.DisplayName "to group expiration policy" -ForegroundColor Red
      Add-AzureADMSLifecyclePolicyGroup -GroupId $Team.GroupId -Id $PolicyId -ErrorAction SilentlyContinue
      Set-UnifiedGroup -Identity $Team.GroupId -CustomAttribute3 $PolicyId
      $TeamsCount++ }
}
Write-Host "All done." $TeamsCount "teams added to policy"
In turn, this means that to find the groups covered by the policy you can do the following (unfortunately the ExpirationTime property is not supported for server-side filtering):
[array]$Groups = Get-UnifiedGroup -ResultSize Unlimited | Where-Object {$_.ExpirationTime -ne $Null}
$Groups | Sort-Object ExpirationTime | Format-Table DisplayName, ExpirationTime
After you add a bunch of groups to the expiration policy, the likelihood exists that some of those groups will expire because they are already older than the expiration period. For this reason, it’s a good idea to prepare team owners to let them know what to do if they see an expiration notice. If the group is team-enabled, the notification appears in the activity feed of the team owner (Figure 1).
They can then extend the lifetime of the team by editing its settings. Select the team expiration option to view the current expiration date and then click Renew now (Figure 2) if needed.
Need more examples of how to manage Teams and Office 365 Groups (and many other things) with PowerShell? Look no further than the Office 365 for IT Pros eBook. At the last count, the text included over a thousand examples.
]]>The Teams app navigation bar is on the left-hand side of the desktop and browser clients and along the bottom of the mobile client. It’s where icons for pinned apps, including the core set of default apps usually thought of as “Teams” (like Files, Chat, and Teams), appear, along with a set of less prominent apps accessed through the ellipsis […] menu. The set of apps shown in the navigation bar and their order are defined in a Teams app setup policy. By default, all tenants have a default app setup policy called Global and another policy suitable for front-line workers called FrontLineWorker, which includes the Shifts app.
You don’t have to go anywhere near app setup policies if you’re happy with the set of apps in the navigation bar. However, if you want to add some apps or change the order, you do so through an app setup policy. You can have multiple app setup policies within a tenant, each of which is customized for specific groups of users.
Teams App Setup policies are part of a set of features designed to make apps more manageable. App setup policies also control if users are allowed to load custom apps into Teams (needed if they want to switch to developer preview) and pin apps to the navigation bar. Teams App Permission policies, which are announced but not yet available, are also in this set.
To create a new app setup policy, open the Teams Admin Center, go to the Teams Apps section, and select Setup policies. Select the choice to add a new policy. Teams populates the policy with the core apps (Activity, Teams, Chat, Files, Calendar, Calling). You can then remove apps, add apps, or move the apps up and down within the navigation bar. In Figure 1, I added the Insights, Tasks by Planner, Yammer Communities, and Stream apps. You can also install apps through the policy, as I have done for the Approvals app.
You can add any app from the Teams store to the navigation bar, or any app you publish to your own tenant app catalog (aka the “Company store”).
After settling on the final set and order of apps, save the new policy.
An app setup policy only becomes effective when you assign it to a user. You can do this individually by selecting users in the Teams Admin Center and editing the set of policies assigned to each user (Figure 2). After a short period, the user’s Teams client picks up the policy change and applies the settings to the navigation bar.
In Figure 3 we can see the effect of applying an app setup policy. The default apps are reordered so that Teams is above the activity feed and three new apps (Planner, Stream, and the Who bot) are included in the bar.
If you remove any of the core apps from the navigation bar, the user can still access them through the ellipsis menu.
When an admin changes the app setup policy assigned to an account, Teams notifies the user that the change happened and advises them that some of their pinned apps might have moved (Figure 4).
The Teams PowerShell module includes cmdlets to work with Teams app setup policies. For instance, to see all the policies in a tenant, run the command:
# Get Teams app setup policies
Get-CsTeamsAppSetupPolicy
and to see the set of apps and their order in a policy, run a command like this:
# Return the list of pinned apps for a selected Teams app setup policy
Get-CsTeamsAppSetupPolicy -Identity "App Policy 2" | Select-Object -ExpandProperty PinnedAppBarApps

Id    : 2a84919f-59d8-4441-a975-2a8c2643b741
Order : 1

Id    : 14d6962d-6eeb-4f48-8890-de55454bb136
Order : 2

Id    : 86fcd49b-61a2-4701-b771-54728cd291fb
Order : 3

Id    : 5af6a76b-40fc-4ba1-af29-8f49b08e44fd
Order : 4

Id    : ef56c0de-36fc-4ef8-b417-3d82ba9d073c
Order : 5

Id    : 20c3440d-c67e-4420-9f80-0e50c39693df
Order : 6

Id    : com.microsoft.teamspace.tab.planner
Order : 7

Id    : com.microsoftstream.embed.skypeteamstab
Order : 8

Id    : fc6b6d20-89ed-45fb-9e62-e4b4ca8fbf3f
Order : 9
It’s usually best to update app setup policies through the GUI of the Teams Admin Center. Where PowerShell comes in very handy is when assigning a policy to a bunch of users, even if the cmdlets involved, which still come from the Skype for Business Online module, badly need to be replaced by new cmdlets in the Teams PowerShell module. In any case, here’s a quick snippet showing how to assign a Teams app setup policy to users from a selected department.
# Assign a Teams app setup policy to users in the Marketing department
$Users = Get-CsOnlineUser -Filter {Department -eq 'Marketing'}
ForEach ($U in $Users) {
   Write-Host "Assigning Teams app setup policy App Policy 2 to" $U.DisplayName
   Grant-CsTeamsAppSetupPolicy -PolicyName "App Policy 2" -Identity $U.UserPrincipalName
}
For more information about Teams, read Chapters 11 and 12 of the Office 365 for IT Pros eBook.