
    Build your own user onboarding automation: Send a notification when Windows 365 provisioning is finished

By Peter Klapwijk | February 25, 2025 | Updated: April 13, 2025

Today, I’m sharing another blog post that’s related to user onboarding automation. This time I’m showing how we can build an automation flow that informs our users when their Windows 365 cloud PC has finished provisioning and is ready for sign-in. This saves the user from manually checking whether the cloud PC is ready, providing a better user experience.

The solution in short

We build this automation with the low-code solution Azure Logic Apps. The flow checks every 30 minutes if there are Windows 365 cloud PCs with a status of provisioning and a last modified date/time within the last 30 minutes. This is done by querying Microsoft Graph.
When such a cloud PC is found, the flow keeps checking the status of the machine until the status is no longer equal to provisioning.

When the status has changed, the flow moves on and informs the end user by e-mail and Teams message that the machine is ready for use.

Note: I took 30 minutes as the trigger schedule, as that’s roughly the minimum time it takes for a machine to get provisioned. Change this trigger to your own needs, as provisioning time differs per environment based on various conditions.

    Requirements

    We have some requirements that need to be in place before we can use our Logic Apps flow.

We use HTTP actions in the flow to query Microsoft Graph for the status of the Windows 365 cloud PCs. To be allowed to run these queries, we need an identity to authenticate with and to assign permissions to. I prefer to make use of a Managed Identity, as that is an easy-to-manage and secure option. We have the option to use a system-assigned or a user-assigned managed identity, as you can read in the documentation. In my case, I make use of a system-assigned MI, as that is considered the most secure.

This Managed Identity needs to have (application) permissions to execute all the Graph calls we make.
This is the application permission needed for our flow:
CloudPC.Read.All

Note: I’m not including much additional user information in the email and Teams message. In case you want to add more user information to the message, the Managed Identity should be assigned additional permissions, for example, User.ReadBasic.All or User.Read.All.

Depending on how you want to send out the message, a (service) account is needed. In my case, it needs proper licensing and permissions to send an e-mail from a shared mailbox and to send a Teams chat message.

    I will soon publish a template for easy deployment of the flow to my GitHub repo.

    Setting up the flow

When we have our requirements in place, we can start building the Azure Logic Apps flow.

    Sign in to the Azure portal and open the Logic Apps service. Create a Logic App of type Consumption.
    We need to select a subscription, resource group, and region, and enter the name of the flow before we can create the Logic Apps flow.

    When the flow is created, click on the name of the flow at the top of the screen, and open the Identity section. On the System assigned tab, set the switch to On and confirm this by selecting Yes.

Make sure to assign the application permissions to the system-assigned managed identity.
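The Azure portal doesn’t offer a UI to grant Microsoft Graph application permissions to a managed identity; one way to do it is with an app role assignment via Microsoft Graph itself. Below is a minimal sketch of such a request (for example from Graph Explorer, with sufficient privileges); the object IDs and the appRoleId of CloudPC.Read.All are placeholders you need to look up in your own tenant:

POST https://graph.microsoft.com/v1.0/servicePrincipals/{managed-identity-object-id}/appRoleAssignments
Content-Type: application/json

{
    "principalId": "{managed-identity-object-id}",
    "resourceId": "{microsoft-graph-service-principal-object-id}",
    "appRoleId": "{appRoleId-of-CloudPC.Read.All}"
}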

    Browse to the Overview tab and click on Edit. 
    The first thing we need to add to the flow is a Trigger. Click on Add a trigger and search for Schedule. We need to add a Recurrence trigger to the flow.

    With the recurrence trigger, we trigger the flow to start based on the schedule we create. As written before, I configure the flow to run every 30 minutes. Change this to your own needs.
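For reference, in code view the trigger roughly looks like this (a sketch with my 30-minute schedule; the trigger name is just the default):

"triggers": {
    "Recurrence": {
        "type": "Recurrence",
        "recurrence": {
            "frequency": "Minute",
            "interval": 30
        }
    }
}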

    Next, we add our first HTTP action to run a Graph API call.
With this action, we query Microsoft Graph for Windows 365 cloud PCs with a status of provisioning and a last modified date/time within the last 30 minutes.

    Select GET as Method.
    Next, select Authentication under Advanced parameters.
    As Authentication type select Managed identity.
    Select System-assigned managed identity from the list.
    And add https://graph.microsoft.com as Audience.

Enter the URI below in the action:

    https://graph.microsoft.com/beta/deviceManagement/virtualEndpoint/cloudPCs?$filter=status%20eq%20'provisioning' and lastModifiedDateTime%20gt%20@{formatDateTime(addMinutes(utcNow(),-30),'yyyy-MM-ddTHH:mm:ssZ')}&$select=id,displayName,status,userPrincipalName,lastModifiedDateTime

In the URI you can see fx formatDateTime. This is an expression, used to enter the current date and time, minus 30 minutes, into the flow. The expression is seen on the left of the screenshot and below in plain text:

    formatDateTime(addMinutes(utcNow(),-30),'yyyy-MM-ddTHH:mm:ssZ')

Note: In case you change the flow to run, for example, every 45 minutes, the -30 in the expression should also be changed to -45.

    Our flow now looks like this, with the trigger and our first action.

    Next, we need to add a Parse JSON action, which is a Data operations action. We parse the output of the HTTP action, to be able to use the values later on in the flow.

    Add the Parse JSON action to the flow.
    In the Content field, we add Body, which is found as dynamic content of the HTTP action. Dynamic content can be found by hitting the lightning button.

We don’t have to write the schema ourselves; we can generate it by adding a sample payload. We can get this example payload by running the current flow and grabbing the output from the HTTP action. We can also grab the body when we run the same query via Graph Explorer.
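For illustration, such a sample payload could look something like this (the values are made up; your tenant will return its own IDs and names):

{
    "@odata.context": "https://graph.microsoft.com/beta/$metadata#deviceManagement/virtualEndpoint/cloudPCs(id,displayName,status,userPrincipalName,lastModifiedDateTime)",
    "value": [
        {
            "id": "00000000-0000-0000-0000-000000000000",
            "displayName": "CPC-user-EXAMPLE",
            "status": "provisioning",
            "userPrincipalName": "user@contoso.com",
            "lastModifiedDateTime": "2025-02-25T08:15:30Z"
        }
    ]
}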

    Schema:

    {
        "type": "object",
        "properties": {
            "@@odata.context": {
                "type": "string"
            },
            "value": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "id": {
                            "type": "string"
                        },
                        "displayName": {
                            "type": "string"
                        },
                        "status": {
                            "type": "string"
                        },
                        "userPrincipalName": {
                            "type": "string"
                        },
                        "lastModifiedDateTime": {
                            "type": "string"
                        }
                    },
                    "required": [
                        "id",
                        "displayName",
                        "status",
                        "userPrincipalName",
                        "lastModifiedDateTime"
                    ]
                }
            }
        }
    }

    This is our Parse JSON action;

    We add a second HTTP action to the flow. This action is to get some information on the cloud PC that’s in the provisioning status.
    Select GET as Method and fill in the authentication information.

    Enter the below URI:

    https://graph.microsoft.com/beta/deviceManagement/virtualEndpoint/cloudPCs/[ID]?$select=id,displayName,status,userPrincipalName,lastModifiedDateTime

Replace [ID] with Body ID, found as dynamic content from the Parse JSON action. This will also place the HTTP action inside a For Each action.

In the above URI, you can see I used $select. With $select we only retrieve the values we are interested in, to limit the number of values we see in the output.
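Once the dynamic content is added, the URI in code view resolves to an expression along these lines (a sketch; the loop name used in items() depends on what your For Each action is called):

https://graph.microsoft.com/beta/deviceManagement/virtualEndpoint/cloudPCs/@{items('For_each')?['id']}?$select=id,displayName,status,userPrincipalName,lastModifiedDateTime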

Next, add an Until action to the flow, which is a Control action. Drag the last HTTP action into the Until action.
With the Until action, we keep checking the status of the cloud PC as long as the status is equal to provisioning. As soon as the status is no longer equal to provisioning, we move on with the next steps that come after the Until action.

Add a second Parse JSON action to the flow, under the HTTP action, inside the Until action. In the Content field add the Body of the second HTTP action and create the schema.

    {
        "type": "object",
        "properties": {
            "@@odata.context": {
                "type": "string"
            },
            "id": {
                "type": "string"
            },
            "displayName": {
                "type": "string"
            },
            "status": {
                "type": "string"
            },
            "userPrincipalName": {
                "type": "string"
            },
            "lastModifiedDateTime": {
                "type": "string"
            }
        }
    }

    Inside the Until action, we also add a Delay action. If we don’t add a delay to the flow, the HTTP action to query the status of the cloud PC would run every couple of seconds, which makes no sense. I added a delay of 2 minutes to the flow.

    Now let’s configure the Until action itself.
    By default, the Count is set to 60 and the Timeout to 1 hour.
Add Body status (of the last Parse JSON action) to the left field of the Loop until section. Select is not equal to from the drop-down list and enter provisioning in the right field. The actions in the Until action will run as long as the status of the cloud PC is equal to provisioning, or until the count or timeout limit is reached.
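In code view, the Until loop and its exit condition roughly look like this (a sketch; the HTTP, Parse JSON and Delay actions sit inside it, and the Parse JSON action name will match whatever you named it in your flow):

"Until": {
    "type": "Until",
    "expression": "@not(equals(body('Parse_JSON_status')?['status'], 'provisioning'))",
    "limit": {
        "count": 60,
        "timeout": "PT1H"
    }
}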

Under the Until action, we add a third HTTP action. With this action, we query Microsoft Graph again for information about the cloud PC, but this time we query for some more values. I use this additional HTTP action because during provisioning not all values are available yet; some have a value of null, which could give issues with the Parse JSON action.

Again we select GET as Method.
We use the URI below, with some additional values behind $select:

    https://graph.microsoft.com/beta/deviceManagement/virtualEndpoint/cloudPCs/[ID]?$select=id,displayName,imageDisplayName,provisioningPolicyName,servicePlanName,status,userPrincipalName,lastModifiedDateTime,servicePlanType,managedDeviceName

    Replace [ID] with Body ID found as dynamic content from the last Parse JSON action.

    The HTTP action is followed by a Parse JSON action.
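Again, the schema can be generated from a sample payload. With the extra $select values, the body of this HTTP action could look something like this (example values for illustration only):

{
    "@odata.context": "https://graph.microsoft.com/beta/$metadata#deviceManagement/virtualEndpoint/cloudPCs(id,displayName,imageDisplayName,provisioningPolicyName,servicePlanName,status,userPrincipalName,lastModifiedDateTime,servicePlanType,managedDeviceName)/$entity",
    "id": "00000000-0000-0000-0000-000000000000",
    "displayName": "CPC-user-EXAMPLE",
    "imageDisplayName": "Windows 11 Enterprise + Microsoft 365 Apps",
    "provisioningPolicyName": "W365 provisioning policy",
    "servicePlanName": "Cloud PC Enterprise 2vCPU/8GB/128GB",
    "servicePlanType": "enterprise",
    "status": "provisioned",
    "userPrincipalName": "user@contoso.com",
    "lastModifiedDateTime": "2025-02-25T09:05:12Z",
    "managedDeviceName": "CPC-user-EXAMPLE"
}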

Lastly, we add the actions to the flow with which we inform our users. In my case, I add an action to send an e-mail and one to send a Teams message.

    Add a Send an email from a shared mailbox action. This will ask you to sign in with a (service) account.

    Enter the e-mail address from the shared mailbox in the From field. In the To field, we can add the User Principal Name, assuming this matches the email address of the user.
    Add a subject and write the text in the Body to your own needs. As you can see, we can add variables from the previous actions to the email action like I did.
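For illustration, the body text could look something like this. I show the dynamic content in its underlying expression notation here; Parse_JSON_3 is just an example name, replace it with the name of your own last Parse JSON action:

Hi,

Your Windows 365 cloud PC is ready for use.

Device name: @{body('Parse_JSON_3')?['managedDeviceName']}
Service plan: @{body('Parse_JSON_3')?['servicePlanName']}
Image: @{body('Parse_JSON_3')?['imageDisplayName']}

You can sign in at https://windows365.microsoft.com.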

    We add a parallel branch to the flow, to run both the send email action and the Teams message at the same time. For this, hit the plus button above the send an email action and select Add a parallel branch.

Under the new branch add a Post card in a chat or channel action, which is a Teams action.
There are different types of actions to post a message to a Teams chat; this time I chose the one that posts an adaptive card, which looks a little nicer compared to a normal message.

    Select Flow bot for Post as and Chat with flow bot for Post In.
    Add the User Principal Name in the Recipient field.

    We can easily build an adaptive card for Teams with the online adaptivecards.io/designer.

    Make sure to select Microsoft Teams as the host app.

    Drag TextBlock elements from the right section into the designer in the middle. Those text blocks can be edited to your needs in the right section.

    When you finish designing the adaptive card, copy everything in the card payload editor section.
    Paste the content in the Adaptive card field in the Teams action.

    We can add values from the previous action to the adaptive card body like below.
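For example, a minimal card payload could look like this (a sketch built with the online designer; the Parse_JSON_3 action name is again just an example, use the name of your own last Parse JSON action):

{
    "type": "AdaptiveCard",
    "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
    "version": "1.4",
    "body": [
        {
            "type": "TextBlock",
            "text": "Your Windows 365 cloud PC is ready!",
            "weight": "Bolder",
            "size": "Medium",
            "wrap": true
        },
        {
            "type": "TextBlock",
            "text": "Device name: @{body('Parse_JSON_3')?['managedDeviceName']}",
            "wrap": true
        },
        {
            "type": "TextBlock",
            "text": "You can now sign in to your cloud PC.",
            "wrap": true
        }
    ]
}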

    And our low-code automation flow is finished!

    The end result

    The end result is pretty simple. We inform our users via email when their Windows 365 cloud PC is ready.

    And we do the same via Microsoft Teams.

    Thanks for reading!

    If you’re interested in more user onboarding automation, have a look at these articles as well!


Related Posts

• Build your own user onboarding automation – Day 1: enable the account and create a Temporary Access Pass (February 18, 2025)
• Build your own user onboarding automation – Entra ID user account creation (February 13, 2025)
• Create an application-based Azure AD group with Logic Apps (May 9, 2022)