The AVD workbooks from Microsoft and other sources such as ITProCloud are nice, but they are based on individual host pools. What if you need to know your total session user count over time? The query below does exactly that. Run it in Azure Monitor against the Log Analytics workspace that captures your host pool diagnostic data.
let StartWin = startofday(ago(30d));
let EndWin = endofday(ago(1d));
let lookbackWindow = 1d;
let StartCountDay = StartWin - lookbackWindow;
WVDConnections
| project CorrelationId, State, TimeGenerated, UserName
| as Connections
| where State == "Started"
| extend StartTime = TimeGenerated
| join kind=fullouter (
    Connections
    | where State == "Completed"
    | extend EndTime = TimeGenerated
) on CorrelationId
| extend EndTime = coalesce(EndTime, EndWin) // if the connection has not ended yet, use the last day
| where EndTime >= StartCountDay // chop out connections that ended before our window started
| extend StartTime = coalesce(StartTime, StartCountDay) // if the start aged off, set it to the start of the lookback window
| where StartTime <= EndWin
| extend CorrelationId = coalesce(CorrelationId, CorrelationId1) // fix fields that only came from a completed record
| extend UserName = coalesce(UserName, UserName1)
| project StartTime, EndTime, CorrelationId, UserName // chop down columns to just what we need
| extend StartTime=max_of(StartTime, StartCountDay), EndTime=min_of(EndTime, EndWin) // clip connections to the window
| extend _bin = bin_at(StartTime, 1d, StartCountDay) // #1 start of first day connection appears
| extend _endRange = iff(EndTime + lookbackWindow > EndWin, EndWin,
iff(EndTime + lookbackWindow - 1d < StartTime, StartTime,
iff(EndTime + lookbackWindow - 1d < _bin, _bin, _bin + lookbackWindow - 1d))) // #2 last day connection will appear
| extend _range = range(_bin, _endRange, 1d) // #3 create a start of day timestamp for every day connection existed and/or day it will be counted
| mv-expand _range to typeof(datetime) // #4
| summarize Users = dcount(UserName) by Days=bin_at(_range, 1d, StartCountDay) // #5 sum startofday timestamps
| where Days >= StartWin // #6 chop off days we don't want to display
| sort by Days asc
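To see what the window-clipping and mv-expand steps are doing, here is the same counting logic as a plain-Python sketch. It covers the lookbackWindow = 1d case, where a connection is counted on exactly the days it existed; the function name and input shape are my own for illustration.

```python
from datetime import datetime, timedelta

def daily_user_counts(connections, start_win, end_win):
    """Count distinct users per day from (user, start, end) connection
    intervals. `end` may be None for a connection that has not completed,
    mirroring the coalesce(EndTime, EndWin) step in the query."""
    users_per_day = {}
    for user, start, end in connections:
        # clip the connection to the reporting window (the max_of/min_of step)
        start = max(start, start_win)
        end = min(end if end is not None else end_win, end_win)
        if end < start:
            continue  # connection falls entirely outside the window
        # emit one start-of-day stamp per day the connection existed
        # (the range()/mv-expand steps)
        day = datetime(start.year, start.month, start.day)
        while day <= end:
            users_per_day.setdefault(day, set()).add(user)
            day += timedelta(days=1)
    # dcount(UserName) by day
    return {day: len(users) for day, users in sorted(users_per_day.items())}
```

A user connected across midnight is counted once on each day, and concurrent connections from the same user still count as one, which is exactly why the query uses dcount rather than count.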
Welcome to Part 1 (Frontend) of my multi-part series on automating your project intake request process. This article describes how to automatically roll out AVD host pools based on (M365) user input. I will try to give you a good enough picture to replicate it, but bear in mind: there are so many specifics in how we implement all of these technologies that this will not be an easy-to-copy guide! See it as a blueprint and swap out technologies or steps where needed. I see a lot of folks using ADO as their CI/CD tool, or maybe even Bicep as the IaC tool. All of that can probably work too. By no means is this the perfect solution, but it helped us automate a lot of our work.
I work for quite a big company that engages numerous vendors, who process a variety of tasks for us. The vendors we work with get (A)AD accounts from us but no hardware, and most of them need access to our internal systems. We had been using Horizon View for this for almost a decade, but now Azure Virtual Desktop has become our new standard. When onboarding new vendors, we want to separate them out into their own Resource Group, AVD Host Pool, Storage Accounts and so on. Every vendor has different requirements, so we need to be able to keep them apart.
That’s a rough diagram of our setup. We will focus on the red circle; everything else is already stood up by our enterprise architecture group, and we tap into it.
The red circle will include:
DSC extensions for domain join and host pool join
Let users request their own AVD pools (for either new projects, new vendors that they hired or simply for testing AVD)
These requests need to go through an approval process (cost center owner)
Deployment needs to be consistent with our Azure landing zone policy (tags, naming, etc.)
State files need to be kept in case the requester wants to modify the number of machines or my team wants to clean up the pool again
We want to use IaC automation. Even better, a solution users can self-service. And in the best case (yes, there is more), the whole process should happen in a quality-controlled manner, where changes can be made and tested without impacting production. While our host pools are all connected to the same infrastructure, you can use the concept below to create separate VNets and subnets and put NSGs on them. All possible! Tool-wise, we settled on Gitlab.com as our DevOps tool for pretty much all automation projects. And since we are also a Microsoft shop, I simply made use of the apps that come with that toolkit. So that is what I went with, and those are the requirements for this blog post.
Frontend: SharePoint, PowerApps and Flow
Codebase & CI/CD: Gitlab
First things first. We have to create something that users can enter their data into and that keeps a history of records. While PowerApps is a very complex tool, all we need to start with is a SharePoint list. Set up that list with the columns required for your project. A good set of columns to start with:
ProjectTitle (Single line of text)
Owner (Person or Group)
Justification (Multiple lines of text)
Number of Machines (Number)
CostCenter (Single line of text)
In addition to these, SharePoint automatically adds more columns that we are going to use later (e.g. CreatedBy or ID). We can also add more features, like Spot instances for validation environments, multi-session or single-session deployments, or even a column for choosing the VM size. But that will be part of an advanced section 🙂
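Whatever columns you pick, it pays to sanity-check the user input before it is handed to the deployment pipeline. A hypothetical sketch, with field names matching the example list above and limits that are entirely made up:

```python
def validate_request(item):
    """Sanity-check a submitted list item (as a dict) before it is handed
    to the pipeline. Field names and the machine cap are illustrative."""
    errors = []
    if not str(item.get("ProjectTitle", "")).strip():
        errors.append("ProjectTitle is required")
    machines = int(item.get("NumberOfMachines", 0) or 0)
    if not 1 <= machines <= 10:  # arbitrary cap for illustration
        errors.append("Number of Machines must be between 1 and 10")
    if not str(item.get("CostCenter", "")).strip():
        errors.append("CostCenter is required")
    return errors  # empty list means the request is good to go
```

In our setup this kind of check naturally lives in the flow (or in the pipeline itself), so a bad request never gets as far as Terraform.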
After you have set up the SharePoint list, you can add a PowerApps form frontend. To do that, click “Integrate > Power Apps > Customize forms“. Feel free to add whatever information you need for deploying your project later. Project Title, Owner, Cost Center and Region are enough to start with.
Below is an example of my production form.
Flow will be the frontend brain. When a user submits the form above, a new SharePoint list item is created. Once that item exists, Microsoft Flow triggers and picks up the work. We will use Flow to manage the user-facing interaction through O365 and to hand over to Gitlab using the Pipeline Trigger API. Below is an example of the flow we are using, to give you a quick start.
Watch out: to make API calls, a Flow premium license has to be assigned to the owner of the flow. The good news is you only need one, no matter how many people use the PowerApp that feeds it.
If the manager of the Owner approves, the next flow kicks off and does the actual work.
As you can see, we do some housekeeping during the flow, like updating the Status column or sending out emails to keep users posted on progress. The magic really happens inside the PipelineTrigger step, though. This is where we call the Gitlab API to hand over our payload (the user input) to Terraform. The flow then repeatedly calls the Gitlab API to report on the status of the pipeline. If it succeeds, that’s fine; if not, we send a message to the AVD team to check what’s going on.
Below is the pipeline trigger as an example. The <> values come from dynamic fields in Flow.
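Outside of Flow, the same two calls — firing the trigger and polling the pipeline status — look roughly like this in Python. The endpoints are GitLab's documented trigger and pipelines APIs; the project ID, tokens and variable names are placeholders you would fill from your own setup:

```python
import json
import urllib.parse
import urllib.request

GITLAB_API = "https://gitlab.com/api/v4"  # adjust for a self-hosted Gitlab

def trigger_payload(trigger_token, ref, variables):
    """Build the form body for the trigger call; pipeline variables are
    passed as variables[KEY] form fields."""
    data = {"token": trigger_token, "ref": ref}
    for key, value in variables.items():
        data[f"variables[{key}]"] = value
    return data

def trigger_pipeline(project_id, trigger_token, ref, variables):
    """POST to the pipeline trigger endpoint; the JSON response includes
    the new pipeline's "id", which we need for status polling."""
    body = urllib.parse.urlencode(
        trigger_payload(trigger_token, ref, variables)).encode()
    req = urllib.request.Request(
        f"{GITLAB_API}/projects/{project_id}/trigger/pipeline",
        data=body, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def pipeline_status(project_id, api_token, pipeline_id):
    """What the flow polls repeatedly until it can report success/failure."""
    req = urllib.request.Request(
        f"{GITLAB_API}/projects/{project_id}/pipelines/{pipeline_id}",
        headers={"PRIVATE-TOKEN": api_token})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["status"]  # e.g. "running", "success", "failed"
```

Note the two different tokens: the trigger token is created per project under Settings > CI/CD > Pipeline trigger tokens, while status polling needs a regular API token with read access to the project.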
This should cover the frontend aspect for now. In the next article I’ll explain how to set up the Gitlab pipeline.