Using Intune to remotely install PowerShell modules on enrolled devices

A few weeks ago I shared a post detailing how you could write the resultant output of an Intune-pushed PowerShell script to Azure Tables (you can read that post here). The use case that drove that post was a customer asking for explicit evidence that a particular Microsoft hotfix had been installed on all devices in their estate.

The main function of that script used the Az module to connect to the Azure table and write the data. However, in that script I made what is, in hindsight, a pretty significant oversight: I assumed, and therefore didn't check, that the Az module had been installed and imported. This meant it failed on the majority of users' devices, as they didn't have the module installed.

Thank you to Nathan Cook, who commented on that post asking that very question and made me realise the mistake; I've now added an addendum to that post advising as such.


To that end, see the below script from Nickolaj Andersen's GitHub repo, which I've adapted to suit being deployed from Intune.

This script can either be used at the start of an individual script to check for the presence of any required modules, or deployed separately, say as part of an Autopilot post-deployment sequence, to push out commonly used modules.

#Start logging
Start-Transcript -Path "C:\Logs\InstallAzNew - $(((Get-Date).ToUniversalTime()).ToString("yyyyMMddThhmmssZ")).log" -Force

# Determine if the Az module needs to be installed
try {
    Write-Host "Attempting to locate Az module"
    $AzModule = Get-InstalledModule -Name Az -ErrorAction Stop -Verbose:$false
    if ($AzModule -ne $null) {
        Write-Host "Authentication module detected, checking for latest version"
        $LatestModuleVersion = (Find-Module -Name Az -ErrorAction Stop -Verbose:$false).Version
        if ($LatestModuleVersion -gt $AzModule.Version) {
            Write-Host "Latest version of Az module is not installed, attempting to install: $($LatestModuleVersion.ToString())"
            $UpdateModuleInvocation = Update-Module -Name Az -Scope CurrentUser -Force -ErrorAction Stop -Confirm:$false -Verbose:$false
        }
    }
}
catch [System.Exception] {
    Write-Host "Unable to detect Az module, attempting to install from PSGallery"
    try {
        # Install NuGet package provider
        $PackageProvider = Install-PackageProvider -Name NuGet -Force -Verbose:$false

        # Install Az module
        Install-Module -Name Az -Scope AllUsers -Force -ErrorAction Stop -Confirm:$false -Verbose:$false
        Write-Host "Successfully installed Az"
    }
    catch [System.Exception] {
        Write-Host "An error occurred while attempting to install Az module. Error message: $($_.Exception.Message)" ; break
    }
}

# Stop logging
Stop-Transcript

This script should be run as the logged-on user; ensure this is set when creating the task in Intune, as below.


Note, if you do use this script to deploy the entire Az module rather than a subset such as Az.Network, be aware that it is pretty big and may take a while to download depending on environmental factors such as available bandwidth.
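If you only need a couple of cmdlets, the Install-Module line in the script above can be narrowed to just the required submodules; a minimal sketch (the module names here are examples, substitute whichever you actually need):

```powershell
# Install only the required Az submodules rather than the full Az roll-up module,
# keeping the download much smaller. Az.Accounts and Az.Network are example names.
Install-Module -Name Az.Accounts, Az.Network -Scope AllUsers -Force -Confirm:$false
```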

The below screenshot shows the output of the transcript log written to the local device; note it shows that neither the Az module nor the NuGet package provider was installed, so both were pulled from the PSGallery and installed.


For confidence during testing, you can see the script installing the many Az modules into C:\Program Files\WindowsPowerShell\Modules.


A few quick tips for troubleshooting, not just for this script but for any you deploy via Intune.

1 > Don’t Wait

The default refresh and pull cycle of Intune (think GP refresh time for AD GPOs) is 60 minutes, but during development you're going to want to push that script out fast. There are several ways to force a sync between Windows 10 and Intune; the quickest is definitely to restart the Microsoft Intune Management Extension service, which forces an immediate sync.
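From an elevated PowerShell prompt on the device, that restart is a one-liner (assuming the default service name of the extension):

```powershell
# Restart the Intune Management Extension service to trigger an immediate sync
Restart-Service -Name IntuneManagementExtension -Force
```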


2 > Run Script Locally

This may sound like an obvious one, but say you're running a script like the one above to install a certain module: you'll want to keep the testing environment the same. That is, you don't want to run it on a different build of Windows 10, or where you have full admin rights etc., as this will increase the likelihood of false positives.

To that end, Intune caches and executes a local copy of the script in C:\Program Files (x86)\Microsoft Intune Management Extension\Policies\Scripts – run that as the locally logged-on user, and maybe add the -WhatIf switch to simulate the results.

Note, the scripts won't have the same friendly and informative name you saved them as; instead, they're given the GUID name of the task in Azure.
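To see what has been cached, and to match file timestamps against your deployments, you can simply list that folder (path as above):

```powershell
# List the locally cached Intune scripts - the file names are the task GUIDs
Get-ChildItem "C:\Program Files (x86)\Microsoft Intune Management Extension\Policies\Scripts" |
    Select-Object Name, LastWriteTime
```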


3 > In The Registry We Trust

If you don't subscribe to the practice of using Start-Transcript and Stop-Transcript for logging, you can use the registry to get the results of the script.

The key is HKLM\Software\Microsoft\MicrosoftIntuneManagementExtension\Policies

Again, as with the locally cached script, the key adopts the name of the task GUID; from there you can view the Result and ResultDetails values.
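You can pull those values for every deployed script in one pass; a sketch assuming the key path above (subkeys are enumerated recursively as they're named by GUID):

```powershell
# Enumerate script result entries under the Intune policies key and show the
# Result and ResultDetails values for each GUID-named subkey
$policies = 'HKLM:\Software\Microsoft\MicrosoftIntuneManagementExtension\Policies'
Get-ChildItem $policies -Recurse | ForEach-Object {
    Get-ItemProperty $_.PSPath | Select-Object PSChildName, Result, ResultDetails
}
```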

This is handy for development; however, I strongly suggest writing the output to verbose logs for production.


QuickFix > Starting and Stopping Azure VMs using a PowerShell Menu

This is another quick fix to help automate a somewhat monotonous task; this time it's using PowerShell to build a simple menu giving the ability to start and stop a set of Azure virtual machines using a Service Principal.

The script is pretty straightforward and can be easily adapted to add additional VMs should you need to power up, or down, a larger set of virtual machines.

For ease, and to keep this as hands-off as possible, this script uses a Service Principal to authenticate to Azure AD and perform the task of setting the VM power state; this saves having to manually enter credentials and respond to MFA challenges.
I've borrowed and adapted the PowerShell commands directly from this Microsoft Windows Virtual Desktop article, which can be used to create a Service Principal if you don't have one already.
# This script creates a Service Principal within Azure AD

# Install and import AzureAD PS Module
Install-Module AzureAD
Import-Module AzureAD

# Connect to Azure AD with a Global Admin account.
$aadContext = Connect-AzureAD

# Create Service Principal
$svcPrincipal = New-AzureADApplication -AvailableToOtherTenants $true -DisplayName "VM Power Menu Service Principal"

# Get Service Principal ID and Key
$svcPrincipalCreds = New-AzureADApplicationPasswordCredential -ObjectId $svcPrincipal.ObjectId

# Return Service Principal App ID
$svcPrincipal.AppId

# Return Service Principal Secret Key
$svcPrincipalCreds.Value

# Return Azure AD Tenant ID
$aadContext.TenantId.Guid

Before proceeding, please ensure that the newly created Service Principal has adequate permissions to start and stop the required virtual machines; this can be set directly on the VMs themselves or on the Resource Group.

Note, if you're setting these permissions in a production environment, or one that is particularly security-sensitive, please be mindful of the principle of least privilege; that is, avoid elevated roles such as Owner or Contributor if a lesser role, such as Virtual Machine Contributor, would suffice.
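If you prefer to script the role assignment too, something like the following should work; a sketch assuming the Az.Resources module is installed and that the $svcPrincipal variable from the creation script above is still in scope (the Resource Group name is a placeholder):

```powershell
# Grant the Service Principal the built-in Virtual Machine Contributor role
# scoped to a single Resource Group - "MyLabRG" is a placeholder name
New-AzRoleAssignment -ApplicationId $svcPrincipal.AppId `
    -RoleDefinitionName "Virtual Machine Contributor" `
    -ResourceGroupName "MyLabRG"
```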

Once the Service Principal has been created, copy and paste the below into your code editor of choice. You will need to set the initial variables in the header; you can use the resultant output from the script above for the Azure AD Tenant ID and Service Principal details. You will also need to populate the two Azure VM variables and the associated hosting Resource Group.

Import-Module Az.Compute

$AADTenant   = "[Azure-AD-Tenant-ID-Here]"
$SPAppID     = "[Service-Principal-App-ID-Here]"
$SPAppSecret = "[Service-Principal-Secret-Key-Here]"
$WVDVM1      = "[First-VM-To-Start-Here]"
$WVDVM2      = "[Second-VM-To-Start-Here]"
$AzRG        = "[Resource-Group-Hosting-VMs-Here]"

$passwd = ConvertTo-SecureString $SPAppSecret -AsPlainText -Force
$pscredential = New-Object System.Management.Automation.PSCredential($SPAppID, $passwd)

Connect-AzAccount -ServicePrincipal -Credential $pscredential -Tenant $AADTenant

function Show-Menu {
     param (
           [string]$Title = 'Start-Stop WVD Lab'
     )
     Write-Host "================ $Title ================"
     Write-Host "1: Press '1' to start WVD."
     Write-Host "2: Press '2' to stop WVD."
     Write-Host "Q: Press 'Q' to quit."
}

do {
     Show-Menu
     $selection = Read-Host "Please make a selection"
     switch ($selection) {
           '1' {
                'Starting WVD'
                Start-AzVM -Name $WVDVM1 -ResourceGroupName $AzRG
                Start-AzVM -Name $WVDVM2 -ResourceGroupName $AzRG
           }
           '2' {
                'Stopping WVD'
                Stop-AzVM -Name $WVDVM1 -ResourceGroupName $AzRG -Force
                Stop-AzVM -Name $WVDVM2 -ResourceGroupName $AzRG -Force
           }
     }
}
until ($selection -eq 'q')

When executed the script presents a simple menu with 3 choices as below.


Note, after executing either option 1 or 2 the script will not exit and close; instead, it will await a response of Q to quit. This was done intentionally as a gentle reminder to the user to stop the VMs they'd originally started.

Lastly, as a matter of good practice: for lab environments that don't need to run around the clock, for subscriptions that use free credit, or if you're just trying to keep costs down, I'd always recommend setting an Auto-Shutdown time on non-production VMs as a backup.

Writing the results from an Intune executed PowerShell script to Azure Table Storage

Edit 23/05/20: The script in this post omits a check for the presence of the PowerShell modules required to run certain cmdlets; please see the new accompanying post for details on how to check for, and remotely install, required modules using Intune.

Anyone who has worked as a SysAdmin over the years managing either server or client estates, or both, and likely remotely, will undoubtedly have had to hack up a script of varying complexity to push out some sort of workaround or fix, be it adding a registry key or deleting old profiles to recoup some disk space.

That was all straightforward in 'traditional' Windows-based server-client environments, where connectivity to the source was over a reliable LAN or WAN and the hardest part was remembering the correct syntax for PSEXEC, but that is no longer the case in the more modern of workplaces, where devices are Azure AD joined and use the internet as their default communication medium.

This is where Intune (Endpoint Manager) comes in. The ability to remotely execute PowerShell scripts on enrolled devices isn't new, and I've used it increasingly over the months to bridge a few small gaps, usually where a particular policy setting I needed wasn't natively available, such as setting the properties of a local account that was created using an Intune policy ('password does not expire', for example), but I'll cover that end-to-end in another post as it's handy to know.

So, as mentioned, the actual pushing out of a script in Intune is pretty straightforward; centrally capturing the output isn't. The Intune portal provides a very basic reporting function that displays whether the script was successfully deployed or not, that is to say, whether it exited with a defined exit code, but you do not get anything remotely verbose in the portal; for example, it does not capture any directed output.

So, in a decentralised environment with no common LAN or server infrastructure to write results to, what are the options?

I did once experiment, unsuccessfully, with writing to Azure Blob Storage, and have contemplated similar using Azure Files and mapping drives within the script, but for now, after consulting the MDM community on Twitter, I came across a post by Travis Roberts that provided a great article on writing the output of a PowerShell script to Azure Table Storage.

Note, the script that follows is directly adapted from Travis Roberts’ original blog post, so please check that out. Thanks, Travis.

Right, so let's quickly cover the actual use case here: why do I need to write a PowerShell script and push it out via Intune, and why am I so concerned with capturing the results?

The answer is I was contacted by a customer, for whom we had deployed Intune as part of a wider Microsoft 365 adoption, asking whether we could use Intune to quickly report on whether a particular Microsoft hotfix had been installed on all of their Windows 10 devices in response to a recent threat. The short answer was no, we couldn't. Yes, we were using Intune to define Windows Updates, but again the reporting was not detailed enough.

The result was the below. This script, when executed, searches for the presence of a defined hotfix by the KB it was wrapped in and writes the results to a very simple Table in Azure, which I could use Storage Explorer to monitor and report from.

In readiness to utilise this script you must have already created a Table within an Azure Storage Account and generated a SAS key for a defined window.

I’ve used a generic Partition Key of KBCheck to satisfy the data integrity constraints.
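For reference, that one-time setup can itself be scripted; a sketch assuming the Az.Storage module and illustrative resource group, storage account, and table names:

```powershell
# Create the table and generate a time-limited account SAS token
# "MyRG", "mystorageacct" and "KBResults" are placeholder names
$ctx = (Get-AzStorageAccount -ResourceGroupName "MyRG" -Name "mystorageacct").Context
New-AzStorageTable -Name "KBResults" -Context $ctx
New-AzStorageAccountSASToken -Service Table -ResourceType Service,Container,Object `
    -Permission "rwau" -ExpiryTime (Get-Date).AddMonths(1) -Context $ctx
```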

# Step 1, use Start-Transcript to capture the execution to a text file and set variables for connecting to Azure Table

Start-Transcript -Path "C:\TempLogs\KB4541338Check-$(((Get-Date).ToUniversalTime()).ToString("yyyyMMddThhmmssZ")).log"

$storageAccountName = 'StorageAccountName'
$tableName = 'TableName'
$sasToken = 'SASKeyHere'
$dateTime = Get-Date
$partitionKey = 'KBCheck'

# Step 2, Connect to Azure Table Storage
$storageCtx = New-AzureStorageContext -StorageAccountName $storageAccountName -SasToken $sasToken
$table = (Get-AzureStorageTable -Name $tableName -Context $storageCtx).CloudTable

# Step 3, Check for presence of hotfix
Write-Host "Checking for KB4541338"

if (Get-HotFix -Id KB4541338 -ErrorAction SilentlyContinue) {
    $PatchCheck = "Installed"
    Write-Host "KB4541338 Installed"
}
else {
    $PatchCheck = "Not Installed"
    Write-Host "KB4541338 Missing"
}

# Step 4, Write data to Table Storage and end transcript.

Write-Host "Writing to Azure Table $tableName"

Add-StorageTableRow -table $table -partitionKey $partitionKey -rowKey ([guid]::NewGuid().ToString()) -property @{
    'LocalHostname' = $env:computername
    'PatchStatus' = $PatchCheck
} | Out-Null

Stop-Transcript


The output of the script, when viewed in Azure Storage Explorer, is very simple: it shows the device hostname and either 'Installed' or 'Not Installed' depending on whether the KB was found.

This script can easily be used as a framework for capturing other simple text-based outputs using Intune and PowerShell; simply update the body of the script in step 3.

Thanks again to Travis Roberts for the original script.

Bulk Updating UPNs

As part of a recent project I had to migrate 1500 user objects between two AD domains and then sync them with Azure AD. I'll cover that entire project and its challenges, of which there were a handful, in another post.

As part of that migration, I needed to bulk update the UPNs of the newly created accounts to align with the custom domain name configured in Azure AD. That begs the question: why didn't I just stand up the new AD domain with the appropriate suffix in the first instance? The honest answer is that this project was extremely fast-moving and we had to make decisions on the fly whilst we were working on the strategic architecture; again, this will become clearer when I cover the project in more depth.

Anyway, with 1500 user objects to update I certainly wasn't doing that manually; I used the below script to target all users in a specific OU and change their UPN.

Note, if you are going to use this, please ensure you have created the new UPN suffix in AD Domains and Trusts first; details on how to do that here.

Import-Module ActiveDirectory

$oldSuffix = "oldsuffix.local"
$newSuffix = ""    # set this to your new UPN suffix (the Azure AD custom domain)
$ou = "OU=New-Accounts,DC=oldsuffix,DC=local"

Get-ADUser -SearchBase $ou -Filter * | ForEach-Object {
    $newUpn = $_.UserPrincipalName.Replace($oldSuffix, $newSuffix)
    $_ | Set-ADUser -UserPrincipalName $newUpn
}