Today a new blog post about automating Windows Autopilot related tasks with Azure Logic Apps. In this post, I describe the challenge we had with automatically assigning Group TAGs to all our newly ordered Windows hardware and the automation flow we created to solve that challenge.
Our hardware vendor takes care of registering our Windows devices in our Windows Autopilot service. But the vendor is unable to automatically assign a Group TAG to the devices based on the business unit which ordered them. We need those Group TAGs to allow the business units (BUs) to manage their own devices using the Windows Autopilot Lifecycle automation we created.
For every hardware order, we receive an Excel sheet via email, which holds the serial numbers of the devices and some other information like a shipping address and a customer number (each BU has its own customer number). So at least we have something with which we can determine which BU ordered which device.
The solution
With Azure Logic Apps we are able to save the attachment of an email to a SharePoint library. We made sure a copy of the Excel sheet was delivered to a shared mailbox to which a service account has access. And we created a folder to which we can save the file.
We can only further process the data in the sheet when it’s in a table, so the next step is to pick up the Excel sheet and create a table. After this, we list the rows in the table, to process every row.
From every row, we take the customer number and query a SharePoint list to determine which Group TAG needs to be assigned to the serial number from that row.
We use Microsoft Graph to assign the Group TAG to the serial number with an HTTP action. But we can’t assign the Group TAG directly using the serial number; we first have to retrieve the Windows Autopilot device ID. When this ID is retrieved, we are finally able to assign the Group TAG.
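To give an idea of what that looks like, these are the two Graph calls the flow makes (both are covered in detail later in this post), with [SERIAL] and [ID] as placeholders:
GET https://graph.microsoft.com/v1.0/deviceManagement/windowsAutopilotDeviceIdentities?$filter=((contains(serialnumber,'[SERIAL]')))
POST https://graph.microsoft.com/v1.0/deviceManagement/windowsAutopilotDeviceIdentities/[ID]/UpdateDeviceProperties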
I’ve also built some error handling into the flow. It happened a few times that serial numbers weren’t registered in our Autopilot service at all, and that orders were placed under new customer numbers (without us being notified of the new customer number). With all these actions combined, this is my flow. Even if the vendor is able to register the Group TAGs in the future, I might keep part of this flow running to check whether the registration of the serial number is done.
Requirements
For this Logic App to work, we have some requirements.
First of all, we need a mailbox that receives the emails with the Excel sheets as attachments.
We need a SharePoint site with a folder to store the Excel sheet. And we need a SharePoint list that holds at least the customer number with the corresponding Group TAG. If you use something else to determine which TAG needs to be assigned, that information needs to be in the list instead.
My SharePoint list looks like the one below; the Customer column is only for my own reference and isn’t used in the Logic App.
Via HTTP actions, I’m able to use Graph to assign the Group TAG to serial numbers in the Autopilot service. These HTTP actions use an Azure App Registration, which is the next requirement. The App Registration is used for authentication in the HTTP actions and for granting the required permissions.
The minimum required permission for this Logic App is shown below:
DeviceManagementServiceConfig.ReadWrite.All
Instead of using HTTP actions, you could set up a custom connector to work with Graph in Logic Apps. This could replace all the HTTP actions, or you can use them together in the same flow.
To secure the HTTP actions I use Azure Key Vault, which is described in this previous post.
In my case, I used a service account to authenticate from the Logic App to SharePoint and Azure Key Vault, and to retrieve the emails from the mailbox. I also use this account to send emails from a shared mailbox in case something doesn’t run as expected. So we need to set up an account with a mailbox and grant that account rights to the SharePoint list and the shared mailbox.
Note: When I started using Logic Apps, I used App Registrations to authenticate to Graph. But I keep learning, and a while ago I replaced all App Registrations with a Managed Identity (MI). MIs are easier to maintain and are now my preferred way of authenticating. To set up an MI, please read this article.
Set up the first part of the Logic App flow
When we have the prerequisites in place, we can start creating the first part of the flow in Azure Logic Apps.
Sign in to the Azure portal and open the Logic App service. I created a blank Logic App of type Consumption.
We start the flow with an Office 365 Outlook trigger; search for Office 365 Outlook and choose When a new email arrives in a shared mailbox.
At Original Mailbox Address, we need to fill in the mail address of the mailbox which receives the email with the Excel sheet as an attachment. The rest of the fields can be used for additional filtering on which messages should be picked up by the flow. And we need to set how often the Logic App checks for new items.
Next we add the Azure Key Vault action to our flow, so we can get the secret and use it in the upcoming HTTP actions.
Click the Add step button, search for Azure Key Vault and select Get secret.
Enter the Vault name and click Sign in.
After authentication is done, enter the Name of the secret.
We are now going to add a SharePoint action to the flow, which is a Create file action.
With this action, we save the email attachment to a folder on SharePoint. Therefore we enter the SharePoint site address and folder path. As File Name we use the Attachments Name, and as File Content we use the Attachments Content. Both are variables from the previous Outlook action and can be found as dynamic content.
As you can see, the Create file action is automatically added inside a For each action, because an email can contain multiple attachments.
Now we need a second trigger which is a SharePoint trigger; When a file is created or modified in a folder. This will trigger the rest of the flow and with that trigger, we are able to further process the previously saved Excel sheet.
But this trigger needs to be created in a parallel branch. If we don’t use a separate branch, the SharePoint trigger might never get triggered.
Between the Get secret and For each action we click the Plus sign and choose Add a parallel branch.
In that new branch, we’re going to add the SharePoint trigger.
As you can see we now have two branches next to each other, the flow is split up.
In this trigger, we need to enter the Site address and Folder from the location where the Excel sheet is saved.
We need to create a table in the Excel sheet, therefore we add a Create table action to the flow. The Create table action is an Excel Online (Business) action.
Here we fill in the SharePoint site location and Document library which holds the Excel sheet.
To identify the correct Excel sheet in the folder which we need to process, we can use the File identifier, which is dynamic content from the previous SharePoint trigger.
We need to specify a table range.
Which range you need to set really depends on your Excel sheet. As my data is in columns A – D, I enter those columns in the table range. My table starts at row 1, but I don’t know the number of rows in every sheet I receive, so I set a large number (150). We filter out the empty rows in the next actions, so make sure you don’t set the number of rows too low.
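For example, with data in columns A to D and a maximum of 150 rows, the table range looks like this:
A1:D150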
Now we are going to filter out the rows which don’t hold a serial number. For that, we use a Data operation action; Filter array.
In the From field pick Value as dynamic content from the Create table action. Click Edit in advanced mode and enter:
@not(equals(item()?['SerialNumber'], ''))
This expression picks the SerialNumber item from the table, so SerialNumber needs to match the column header in the Excel table.
When you now click Edit in basic mode, it should look like below.
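If your column header is different, adjust the expression accordingly. For example, with a hypothetical header Serial Number (containing a space), the expression would be:
@not(equals(item()?['Serial Number'], ''))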
Next, we’re going to select the values we need from the Excel sheet; CustomerNumber and SerialNumber. We can do that with a Select action, which is also a Data operations action.
In the From field, we enter Body, from the previous Filter array action.
In the left column, we enter CustomerNumber and below that SerialNumber.
To select the data from the previous action, we use expressions. Place the cursor in the right column, click the Expression tab on the right, and enter the expressions below (one on each row):
item()?['CustomerNumber']
item()?['SerialNumber']
This is our Select action.
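With some made-up values, the output of the Select action is an array that looks like this:
[
    {
        "CustomerNumber": "12345",
        "SerialNumber": "1234567890"
    },
    {
        "CustomerNumber": "67890",
        "SerialNumber": "0987654321"
    }
]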
To be able to further process the selected values (data) we first need to parse the data. For this, we use a Parse JSON action (again a Data Operations action).
As content select Output from the Select action.
If you used the exact same values as I did in the Select action, you can use the below schema:
{
    "items": {
        "properties": {
            "CustomerNumber": {
                "type": "string"
            },
            "SerialNumber": {
                "type": "string"
            }
        },
        "required": [
            "SerialNumber",
            "CustomerNumber"
        ],
        "type": "object"
    },
    "type": "array"
}
Our first part of the flow is ready. We triggered the flow when an email is received. We saved the Excel sheet to SharePoint, created a table, and filtered out the rows which don’t hold a serial number. And we selected the data from the sheet and parsed it for further usage in the flow.
Set up the second part of the Logic App flow
In the next part of the flow, we’re going to query Graph using the serial number to get the Windows Autopilot device identity. Once we have the Autopilot ID, we can assign the Group TAG. I also added actions to check if the customer number is known in the SharePoint list and if the serial number is registered in the Autopilot service.
The next action we add is Get items (a SharePoint action), to get the items from the SharePoint list which holds the customer numbers (and Group TAGs).
Enter the Site address and List name of the SharePoint list. From the drop-down of new parameters select Filter query and enter this query:
Customernumber eq ''
Between the quotation marks, we need to add the dynamic content CustomerNumber from the Parse JSON action. This places the Get items action inside a For each action.
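With a made-up customer number filled in, the resolved filter query would look like this:
Customernumber eq '12345'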
This step is optional, but it is used to make sure the customer number is known. I use the Condition action (a Control action). The outcome of the condition (true or false) determines which action follows.
As value on the right I use this expression:
length(body('Get_items')?['value'])
This uses the length of the value from action Get items.
Select is greater than from the drop-down list and enter 0.
Everything greater than 0 will be True. If the customer number from the sheet didn’t match one in the SharePoint list in the previous action, the result will be 0, and the result from the condition will be False.
Under false, I added an action to send an email so we get notified of an unknown customer number. We can fill in the fields to our needs and we can use dynamic content from previous steps.
We continue our flow under True with an HTTP action. With this first HTTP action, we query Graph for the Windows Autopilot Device Identity using the serial number.
Choose GET as Method.
As URI enter:
https://graph.microsoft.com/v1.0/deviceManagement/windowsAutopilotDeviceIdentities?$filter=((contains(serialnumber,'[SERIAL]')))
We need to replace [SERIAL] with SerialNumber from the Parse JSON action found via dynamic content.
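With a made-up serial number filled in, the URI would look like this:
https://graph.microsoft.com/v1.0/deviceManagement/windowsAutopilotDeviceIdentities?$filter=((contains(serialnumber,'1234567890')))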
Choose Add new parameter and check Authentication.
Enter the Tenant ID and Client ID as found on your App Registration.
Enter https://graph.microsoft.com as Audience.
As secret, we add the Dynamic content Value (Value of the secret).
To use the information we received (by executing the previous HTTP action) in the upcoming actions, we need to parse this information with a Parse JSON action.
Add the Parse JSON action to the flow.
We’re going to parse the response body of the previous HTTP action. In the Content box, we add Body, from the previous action.
To fill the schema with the correct information, we can add an example payload. The information to use as an example payload can be ‘generated’ by performing the same GET query (as in the previous HTTP action), but this time using Graph Explorer. Copy the response in Graph Explorer like below.
Click Use sample payload in the Parse JSON action, and paste the information in the new pop-up.
This is our Parse JSON action.
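As a quick reference, below is a trimmed, illustrative example of such a response (made-up values; the real response contains more properties):
{
    "@odata.context": "https://graph.microsoft.com/v1.0/$metadata#deviceManagement/windowsAutopilotDeviceIdentities",
    "@odata.count": 1,
    "value": [
        {
            "id": "00000000-0000-0000-0000-000000000000",
            "groupTag": "",
            "serialNumber": "1234567890",
            "manufacturer": "Contoso",
            "model": "Laptop 15"
        }
    ]
}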
Again I add a Condition action to the flow. This one is used to check if the serial number is found in the Autopilot service. If the previous HTTP action runs and a serial number is found, it returns a lot of data and @odata.count is 1. If no serial number is found, @odata.count is 0. So I use this value to determine whether the serial number is found or not.
On the left select @odata.count as dynamic content from the Parse JSON action. Select is equal to from the drop-down list and enter 1 in the right box.
Under False I add a send email action again.
Under True we add the next HTTP action, this one adds the group TAG to the serial number.
Choose POST as Method.
As URI enter:
https://graph.microsoft.com/v1.0/deviceManagement/windowsAutopilotDeviceIdentities/[ID]/UpdateDeviceProperties
Replace [ID] with the ID from the last Parse JSON action.
In the body field enter:
{
"groupTag": "[GroupTAG]"
}
Replace [GroupTAG] with the group TAG from the Get items action.
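With a made-up Group TAG filled in, the body would look like this:
{
    "groupTag": "BU-Finance"
}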
Fill in the rest of the required fields as we did in the previous HTTP action.
This is what our flow looks like now.
Close the last For each action and add a Move email action to the flow (this is optional).
I use this action to move the emails to a subfolder after the Excel sheet is processed, so we know the email has been handled.
Choose the Message ID from the dynamic content to identify the correct email. Enter the folder to which the messages need to be moved and enter the mailbox address.
Add error handling to the flow
Optionally, we can add some more error handling to the flow, besides sending notification emails when the customer number isn’t found or the serial number isn’t registered. Any action might fail, which causes the flow to fail, and without error handling you won’t get notified by default. In both branches I added email notifications in case the last action in the branch didn’t complete. If that notification is sent, the flow still finishes successfully, because the last action (the notification) did succeed. To change that final status, we can add a Terminate action at the end of the flow.
Of course, we can also use something else to get notified of a failure, for example, a Teams webhook, which I also used in a couple of previous articles.
In both branches, we add an extra action to send an email: one after the For each which holds the Create file action, and one after the Move email action.
Next, click the three dots in the right top corner and click Configure run after.
Remove the checkmark from is successful.
Add checkmarks for has timed out, is skipped, and has failed. Click Done.
When the previous action has failed, timed out, or is skipped, an email notification is sent.
To mark the flow itself as failed after this notification we add a Terminate action at the end of both branches. Set the Status to Failed and optionally enter a code and message.
And that’s it, this is our complete flow.
For example, when the Create file action fails, an email is sent and the flow is marked as failed.
That’s it for this automation blog post. Thanks for reading and if you’re interested in some more automation, have a look at the other automation posts.