At my employer, we are actively using Windows Autopilot to enroll our Windows devices in Azure AD and Microsoft Intune. Our vendor does the registration of newly ordered devices in our Autopilot service. Default group tagging is handled via a Logic Apps flow as I described in this previous blog post.
Our existing devices are co-managed with SCCM and Intune. We converted most of these devices by assigning an Autopilot deployment profile, so these older devices should also be registered in our Autopilot service.
But sometimes we need to re-register our devices in Autopilot. One reason is that after changing the group tag, the profile assignment sometimes doesn't update for hours (re-importing the device assigns the profile much faster). Another is that a serial number has no associated Azure AD device object.
To avoid having to ask local IT (or even the end-user) to manually export the Autopilot information using PowerShell when we need to re-import it, we have implemented a solution that gathers this information from every Windows device we manage via Intune (including co-managed devices). We store this information on Azure Storage, so we have a dump from every Windows device. Saving the Autopilot information on Azure Storage is comparable to part of this solution by Oliver Kieselbach, but our solution only stores the information on Azure Storage and doesn't automatically import it to Autopilot (as this involves all existing devices).
We only want to import this information on-demand, when there is a need for it. We could export the information from Azure and manually import it to Intune, but that's fun once, not when it's done regularly. So Logic Apps for the win!
The solution
The Logic Apps flow is triggered by creating a new item on a SharePoint List. In that SharePoint List, you enter the serial number and choose the Group TAG which you want to assign to the newly registered serial.
The Logic App flow pulls the CSV file from the Azure Storage container (an Azure Storage file share is also possible). Several HTTP actions then query the Autopilot service, Intune, and Azure AD for existing objects; if found, these objects are deleted.
After that is finished, the information from the CSV file (in which the Autopilot information is stored) is imported to the Autopilot service.
At the end of the flow, information is written back to the SharePoint List with the end-result.
Requirements
We need a SharePoint List and a (service) account with access to the SharePoint List.
The admin enters information in the columns Serial number and Group TAG; the other columns are filled in by the flow after it has finished its job.
We also need an Azure Storage account, in my case with a container, in which the CSV files are stored.
And we need an Azure Managed Identity. The Managed Identity needs read access to the Azure Storage account, which I assigned using the Azure role assignment Storage Blob Data Reader.
Microsoft Graph (application) permissions are needed for the Managed Identity:
Device.Read.All
DeviceManagementServiceConfig.ReadWrite.All
And because there is no support for Graph API application permissions to delete a device from Azure AD, I assigned an Azure AD role to the Managed Identity which holds this role permission:
microsoft.directory/devices/delete
This AAD role could be a built-in role or a custom role.
How to set up such a Managed Identity and assign these permissions is described in more depth in this article.
Setup the first part of the Logic App flow
When we have the prerequisites in place, we can start creating the flow in Azure Logic Apps.
Sign in to the Azure portal and open the Logic App service. I created a blank Logic App of type Consumption.
When the flow is created, click on the name of the flow at the top of the screen, open the Identity section, and on the tab User assigned add your Managed Identity.
Open the Overview tab, which shows a few templates, and choose Blank Logic App.
We need to add our trigger, which is of type SharePoint. Search for SharePoint, select it, then search for When an item is created and select that trigger.
Click Sign in and authenticate with your (service) account which has access to the SharePoint List.
Select the Site address from the drop-down of your List. If the address is not shown, click Enter custom value and enter the address.
Pick the List Name from the drop-down list.
Make your choice of how often the Logic App needs to check the List for new items.
Next, we add our first action, which is of type Azure Blob Storage. Select the Get blob content using path action.
If the Managed Identity isn't used automatically for the connection to the storage, provide a connection name and select Logic Apps Managed Identity as the Authentication type.
Enter the Storage account name.
Enter /containername/ in the Blob path field.
Select Title from the Dynamic content list, which is the serial number from the SharePoint List item and should match the name of your CSV file.
Then append .csv after Title.
Next, we add a Condition, which is a Control action. This condition builds in handling for non-existing CSV files. When a user enters a serial for which no CSV is stored, it is filtered out by this condition.
One thing to note:
Unfortunately, the Azure Blob Storage action is case-sensitive, something to take into account.
Most likely we could use the status code of the previous action, but that requires using an expression. By choosing File Content from the Dynamic content list, we can also get the job done. So add File Content in the left box, choose contains from the drop-down list, and enter Device Serial Number in the right box.
The layout of the CSV file is the same as when manually exporting the Autopilot information:
Device Serial Number,Windows Product ID,Hardware Hash
Click the three dots of the condition action and click on Configure after.
We need to make sure that this action is also run when the previous action errors because no CSV file is found.
Select has failed and click Done.
Under False, we add an Update item action, which is a SharePoint action.
With this action we update the List item and add information about the CSV file which is not found.
Select ID and Title from the Dynamic content list and enter the other information.
Under True we are going to add several Compose actions (type Data operations). We need to process the CSV file until we can use its content as variables in upcoming actions of the flow, and remove unnecessary lines.
Add the first Compose action and as Input add File Content. Rename the action to Compose CSV.
As the flow handles the CSV file as one big string of data, we need to split it into lines.
Enter a Compose action again and rename it to Compose New Line.
Just hit the Enter key to add an empty line to the input box.
With this action, we split the data by line.
Enter a Compose action and rename it to Compose SplitNewLine.
Enter the below as Expression:
split(outputs('Compose_CSV'),outputs('Compose_New_Line'))
This looks like the below.
As the first line of the CSV file contains data we don't process, we remove (skip) it with the next Compose action.
Add a Compose action, rename it to Compose CSVData, and add this as Expression:
skip(outputs('Compose_SplitNewLine'),1)
When we now run the flow, we see that the first Compose action starts with all data from the CSV, and at the end we have one line of data left.
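The chain of Compose actions above can be sketched in Python, just for illustration; the Logic App itself only uses the split and skip expressions, and the function name and sample data here are made up:

```python
# Illustrative Python equivalent of the Compose chain above.
def process_csv(file_content: str) -> list[str]:
    lines = file_content.split("\n")  # Compose SplitNewLine: split by the empty line
    return lines[1:]                  # skip(...) drops the header line

sample = ("Device Serial Number,Windows Product ID,Hardware Hash\n"
          "ABC123,0000-0000,BASE64HASH\n")
print(process_csv(sample))  # ['ABC123,0000-0000,BASE64HASH', '']
```

Note the trailing empty element: that is the empty last line of the CSV, which is filtered out later in the flow.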
Split the data and parse to JSON
In the next part, we’re going to split the data by comma and parse the data to JSON, so we can use it with Graph.
First, add a For each action.
As input, we use the outputs from the Compose CSVData action.
Next, we add a new Compose action, renamed to Compose splitByComma, in which we split the data by a comma.
Add this expression:
split(item(), ',')
In the next Compose action, we are going to define the variables which we want to grab from the CSV file, which we later use in the flow.
We’re first manually creating an empty JSON string (in a text editor) that holds all variables we want to process from the csv:
{
"Device Serial Number": "",
"Windows Product ID": "",
"Hardware Hash": ""
}
Add the JSON in the Inputs field.
We are going to grab the data from the previous action by using Expressions. Every item in the CSV file is indexed by a number, starting with 0, followed by 1 and 2.
The Device Serial Number is indexed 0, Windows Product ID 1, and Hardware hash 2.
To grab the Device Serial, we use this Expression:
outputs('Compose_splitByComma')?[0]
To get the Product ID we use:
outputs('Compose_splitByComma')?[1]
And for the Hardware hash we use:
outputs('Compose_splitByComma')?[2]
Enter these expressions between the quotes on every line.
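The comma split plus the JSON template can be sketched like this in Python (illustrative only; the field names match the CSV header, the function name is made up):

```python
# Illustrative equivalent of the splitByComma Compose action and the JSON template.
def row_to_json(row: str) -> dict:
    parts = row.split(",")  # index 0 = serial, 1 = product ID, 2 = hash
    return {
        "Device Serial Number": parts[0],
        "Windows Product ID": parts[1],
        "Hardware Hash": parts[2],
    }

print(row_to_json("ABC123,0000-0000,BASE64HASH"))
```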
Next, we need to Parse the created JSON with a Parse JSON action.
As Content add Outputs from the last action.
Enter this as schema:
{
"properties": {
"Device Serial Number": {
"type": "string"
},
"Hardware Hash": {
"type": "string"
},
"Windows Product ID": {
"type": "string"
}
},
"type": "object"
}
Add the first part of the real action to the flow
The csv file data still contains an empty last line. This is filtered out by adding a condition in the flow.
This is followed by running our first HTTP action against Microsoft Graph, which is used to check in the Autopilot service whether we're uploading an existing serial number or a new one.
First add a Condition, which is found under Control, to filter out the empty lines of data.
As the first value, we use the Device Serial Number from the Parse JSON action. Choose is not equal to from the drop-down list. Leave the second value empty.
We now add our first HTTP action.
As Method select GET.
As URI enter:
https://graph.microsoft.com/v1.0/deviceManagement/windowsAutopilotDeviceIdentities?$filter=((contains(serialnumber,'[DeviceSerialNumber]')))
[DeviceSerialNumber] needs to be replaced with Device Serial Number from the last Parse JSON action.
Choose Add Parameter and select Authentication.
As Authentication type select Managed identity.
Select your Managed identity from the list.
And add https://graph.microsoft.com as Audience.
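For clarity, how the final URI is composed per serial can be sketched in Python (the helper name is made up; the endpoint and contains() filter are the ones used in the HTTP action above):

```python
from urllib.parse import quote

GRAPH = "https://graph.microsoft.com/v1.0"

def autopilot_lookup_uri(serial: str) -> str:
    # contains() filter on serialNumber, as entered in the HTTP action
    return (f"{GRAPH}/deviceManagement/windowsAutopilotDeviceIdentities"
            f"?$filter=((contains(serialnumber,'{quote(serial)}')))")

print(autopilot_lookup_uri("ABC123"))
```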
Next, we need to add a new Parse JSON action.
As Content, we select Body from the Dynamic content list that is from our HTTP action.
As Schema, we can run the current flow and grab the body from the HTTP action and add it via the Use sample payload option. We can also grab the body when we run the same query via Graph Explorer.
This is the schema:
{
"properties": {
"@@odata.context": {
"type": "string"
},
"@@odata.count": {
"type": "integer"
},
"value": {
"items": {
"properties": {
"addressableUserName": {
"type": "string"
},
"azureActiveDirectoryDeviceId": {
"type": "string"
},
"displayName": {
"type": "string"
},
"enrollmentState": {
"type": "string"
},
"groupTag": {
"type": "string"
},
"id": {
"type": "string"
},
"lastContactedDateTime": {
"type": "string"
},
"managedDeviceId": {
"type": "string"
},
"manufacturer": {
"type": "string"
},
"model": {
"type": "string"
},
"productKey": {
"type": "string"
},
"purchaseOrderIdentifier": {
"type": "string"
},
"resourceName": {
"type": "string"
},
"serialNumber": {
"type": "string"
},
"skuNumber": {
"type": "string"
},
"systemFamily": {
"type": "string"
},
"userPrincipalName": {
"type": "string"
}
},
"required": [
"id",
"groupTag",
"purchaseOrderIdentifier",
"serialNumber",
"productKey",
"manufacturer",
"model",
"enrollmentState",
"lastContactedDateTime",
"addressableUserName",
"userPrincipalName",
"resourceName",
"skuNumber",
"systemFamily",
"azureActiveDirectoryDeviceId",
"managedDeviceId",
"displayName"
],
"type": "object"
},
"type": "array"
}
},
"type": "object"
}
Add the clean-up part to our flow
Now we add a Switch action to our flow. If this is an existing device, from which the object is still registered in the Autopilot service, we are first going to clean up the Autopilot registration, Intune and Azure AD object. If no Autopilot registration is found, we skip this part and immediately start importing the information from the CSV file.
When the previous HTTP action finds a registered Autopilot registration, odata.count will be 1, which means we have an existing serial number and we enter the Case. If odata.count is 0, we will skip this case and move on to the import section.
Add a Switch action to the flow. Add @odata.count from the last Parse JSON action in the On field.
Add 1 in the Case, equals field.
Now we also need to add another Switch with a Case, to delete the Intune object, but only when there is an existing object in Intune. Otherwise, this case can be skipped.
Add a Switch and add enrollmentState in the On field. This will add the Switch in a For each action.
Enter Enrolled in the Case equals field.
Under the Case, we add an HTTP action.
As Method select DELETE.
As URI enter:
https://graph.microsoft.com/beta/deviceManagement/managedDevices/
Add managedDeviceId at the end of the URI.
Add or select the authentication information.
It is important to add the next HTTP action in the correct part of the flow. Close the last HTTP action to get a better overview. Click Add action below the last Case.
With this HTTP action, we query Azure AD for an existing AAD device object. Select GET as Method and enter this URI:
https://graph.microsoft.com/v1.0/devices?$search="deviceId:[azureActiveDirectoryDeviceId]"&$select=displayName,id
Replace [azureActiveDirectoryDeviceId] with azureActiveDirectoryDeviceId from the last Parse JSON action.
As we are using $search, we need to add a special request header:
ConsistencyLevel: eventual
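Sketched as a plain HTTP request in Python (illustrative; the helper name and token are placeholders), the query and its required header look like this:

```python
# Sketch of the $search request against /devices. The ConsistencyLevel: eventual
# header is required for Graph advanced queries ($search/$count).
def build_aad_device_request(aad_device_id: str, token: str):
    uri = ("https://graph.microsoft.com/v1.0/devices"
           f'?$search="deviceId:{aad_device_id}"&$select=displayName,id')
    headers = {
        "Authorization": f"Bearer {token}",
        "ConsistencyLevel": "eventual",  # mandatory when using $search
    }
    return uri, headers
```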
We need to parse the body from the HTTP action.
This is the schema:
{
"properties": {
"@@odata.context": {
"type": "string"
},
"value": {
"items": {
"properties": {
"displayName": {
"type": "string"
},
"id": {
"type": "string"
}
},
"required": [
"displayName",
"id"
],
"type": "object"
},
"type": "array"
}
},
"type": "object"
}
I used a Condition action to determine if we have an existing AAD object or not.
Why not use a Switch again? That didn’t work for me, so I used a Condition with an Expression as value.
Add a Condition action, and as value, in the left field we enter this expression (make sure the Parse JSON action matches my example, or change the Expression):
length(body('Parse_JSON_GET_AAD_Device')?['value'])
From the drop-down list, choose is not equal to and add 0 to the right field.
When the query returns an AAD Device object, the value won’t be 0 and under True we will delete the AAD device object.
Under True add another HTTP action, to delete the AAD object if that exists.
As Method select DELETE.
As URI enter:
https://graph.microsoft.com/v1.0/devices/[id]
Replace [id] with the id from the last Parse JSON action. This will add the HTTP action in a For each action.
Close the HTTP action and make sure to add the next action to the correct place in the flow.
Add an HTTP action, to delete the Autopilot Device identity.
As Method select DELETE.
As URI enter:
https://graph.microsoft.com/v1.0/deviceManagement/windowsAutopilotDeviceIdentities/[id]
Replace [id] with the id from the Parse JSON action Get WindowsAutopilotDeviceIdentities.
We add an Until action to the flow, which is a Control action.
With this action, a Delay action, and an HTTP action, we 'pause' the flow until the Autopilot identity is deleted; otherwise we can't import the identity again.
In the Until action, add a Delay action, which is a Schedule action.
Insert an HTTP action, to query Graph for the existence of the Autopilot object by serial number.
As Method select GET.
As URI enter:
https://graph.microsoft.com/beta/deviceManagement/windowsAutopilotDeviceIdentities?$filter=contains(serialNumber,'[Device Serial Number]')
Replace [Device Serial Number] with the Device Serial Number of the very first Parse JSON action.
We need to Parse the body from this HTTP action.
The Schema is:
{
"properties": {
"@@odata.context": {
"type": "string"
},
"@@odata.count": {
"type": "integer"
},
"value": {
"items": {
"properties": {
"addressableUserName": {
"type": "string"
},
"azureActiveDirectoryDeviceId": {
"type": "string"
},
"displayName": {
"type": "string"
},
"enrollmentState": {
"type": "string"
},
"groupTag": {
"type": "string"
},
"id": {
"type": "string"
},
"lastContactedDateTime": {
"type": "string"
},
"managedDeviceId": {
"type": "string"
},
"manufacturer": {
"type": "string"
},
"model": {
"type": "string"
},
"productKey": {
"type": "string"
},
"purchaseOrderIdentifier": {
"type": "string"
},
"resourceName": {
"type": "string"
},
"serialNumber": {
"type": "string"
},
"skuNumber": {
"type": "string"
},
"systemFamily": {
"type": "string"
},
"userPrincipalName": {
"type": "string"
}
},
"required": [
"id",
"groupTag",
"purchaseOrderIdentifier",
"serialNumber",
"productKey",
"manufacturer",
"model",
"enrollmentState",
"lastContactedDateTime",
"addressableUserName",
"userPrincipalName",
"resourceName",
"skuNumber",
"systemFamily",
"azureActiveDirectoryDeviceId",
"managedDeviceId",
"displayName"
],
"type": "object"
},
"type": "array"
}
},
"type": "object"
}
As long as we get an existing object back from the GET query, the odata.count is 1. As soon as it is 0, the object is deleted. So we use @odata.count from the Parse JSON action and add it in the left field of the Until action. Choose is equal to from the drop-down list and enter 0 in the right field.
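The Until + Delay pattern above can be sketched as a polling loop in Python (get_count stands in for the HTTP GET plus Parse JSON pair and is an assumption of this sketch, not part of the Logic App):

```python
import time

def wait_until_deleted(get_count, delay_seconds=1.0, max_tries=10) -> bool:
    """Poll until @odata.count reports 0, i.e. the Autopilot identity is gone."""
    for _ in range(max_tries):
        if get_count() == 0:       # the Until condition: @odata.count equals 0
            return True
        time.sleep(delay_seconds)  # the Delay action
    return False

# Fake counter that reports the identity as deleted on the third poll:
counts = iter([1, 1, 0])
print(wait_until_deleted(lambda: next(counts), delay_seconds=0))  # True
```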
Add the import part to our flow
Close the existing serial case and add the import part of our flow.
We use a POST HTTP action, to import the data from the CSV file, and get the Group TAG information from the SharePoint list.
As Method select POST.
As URI enter:
https://graph.microsoft.com/v1.0/deviceManagement/importedWindowsAutopilotDeviceIdentities
In the Body, enter the JSON below and add Group TAG from the SharePoint List, and Hardware Hash and Device Serial Number from the first Parse JSON action, between the quotes on the corresponding lines.
{
"assignedUserPrincipalName": "",
"groupTag": "",
"hardwareIdentifier": "",
"serialNumber": ""
}
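The body that is POSTed can be sketched in Python (illustrative; the property names are the importedWindowsAutopilotDeviceIdentity fields used above, the example values are placeholders):

```python
import json

def build_import_body(group_tag: str, hardware_hash: str, serial: str) -> str:
    # assignedUserPrincipalName stays empty, as in the flow above
    return json.dumps({
        "assignedUserPrincipalName": "",
        "groupTag": group_tag,
        "hardwareIdentifier": hardware_hash,
        "serialNumber": serial,
    })

print(build_import_body("VIP-Devices", "BASE64HASH", "ABC123"))
```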
Again we need a Parse JSON action.
Below is the schema:
{
"properties": {
"@@odata.context": {
"type": "string"
},
"assignedUserPrincipalName": {
"type": "string"
},
"groupTag": {
"type": "string"
},
"hardwareIdentifier": {
"type": "string"
},
"id": {
"type": "string"
},
"importId": {
"type": "string"
},
"productKey": {
"type": "string"
},
"serialNumber": {
"type": "string"
},
"state": {
"properties": {
"deviceErrorCode": {
"type": "integer"
},
"deviceErrorName": {
"type": "string"
},
"deviceImportStatus": {
"type": "string"
},
"deviceRegistrationId": {
"type": "string"
}
},
"type": "object"
}
},
"type": "object"
}
We are going to add another Until action with a Delay action in it.
With this, we are going to track the import status of our Autopilot information.
Add an HTTP action.
As Method select GET.
As URI enter:
https://graph.microsoft.com/v1.0/deviceManagement/importedWindowsAutopilotDeviceIdentities/[id]
Replace [id] with the id of the last Parse JSON, which is the ID of the POST import action.
Yes, we need to Parse the output again.
This is the schema:
{
"properties": {
"@@odata.context": {
"type": "string"
},
"assignedUserPrincipalName": {},
"groupTag": {
"type": "string"
},
"hardwareIdentifier": {},
"id": {
"type": "string"
},
"importId": {
"type": "string"
},
"productKey": {
"type": "string"
},
"serialNumber": {
"type": "string"
},
"state": {
"properties": {
"deviceErrorCode": {
"type": "integer"
},
"deviceErrorName": {
"type": "string"
},
"deviceImportStatus": {
"type": "string"
},
"deviceRegistrationId": {
"type": "string"
}
},
"type": "object"
}
},
"type": "object"
}
The HTTP GET action returns the deviceImportStatus, which is unknown while the import is running. When finished, it changes to complete on success, or to an error when it failed. Add deviceImportStatus in the left field of the Until action, choose is not equal to from the drop-down list, and enter unknown in the right field.
Add a Condition action. Based on the deviceImportStatus, we update the SharePoint List item with a success or failure.
Add deviceImportStatus in the left field, choose is equal to from the drop-down and enter complete in the right field.
Under True we add an Update item (SharePoint) action, to write success to the List.
And under False we add an Update item, to write a failure.
The end result
The end result is a pretty long Logic App flow and an easy way to re-register Autopilot information.
As you can see, information is written back to the SharePoint List. It shows the information on success, but also when something is wrong, for example when there is no CSV found, or the serial is already registered in another tenant.
Thanks for reading this blog post!
As soon as I have some spare time, I’ll add the flow to my GitHub.