The AVD workbooks from Microsoft and other sources like the ITProCloud are nice, but they are based on individual host pools. What if you need to know your total session user count over time? The query below does exactly that. Run it in Azure Monitor against the Log Analytics workspaces that capture your host pool diagnostic data.
let StartWin = startofday(ago(30d));
let EndWin = endofday(ago(1d));
let lookbackWindow = 1d;
let StartCountDay = StartWin - lookbackWindow;
WVDConnections
| project CorrelationId, State, TimeGenerated, UserName
| as Connections
| where State == "Started"
| extend StartTime = TimeGenerated
| join kind=fullouter
(
Connections
| where State == "Completed"
| extend EndTime = TimeGenerated
)
on CorrelationId
| extend EndTime = coalesce(EndTime, EndWin) // if the connection has not ended yet, use the end of the window
| where EndTime >= StartCountDay // chop out connections that ended before our window started
| extend StartTime = coalesce(StartTime, StartCountDay) // if the start record aged off, set it to the start of the lookback window
| where StartTime <= EndWin
| extend CorrelationId = coalesce(CorrelationId, CorrelationId1) // fix fields that only came from a completed record
| extend UserName = coalesce(UserName, UserName1)
| project StartTime, EndTime, CorrelationId, UserName // trim columns down to just what we need
| extend StartTime=max_of(StartTime, StartCountDay), EndTime=min_of(EndTime, EndWin) // chop connections to window
| extend _bin = bin_at(StartTime, 1d, StartCountDay) // #1 start of first day connection appears
| extend _endRange = iff(EndTime + lookbackWindow > EndWin, EndWin,
iff(EndTime + lookbackWindow - 1d < StartTime, StartTime,
iff(EndTime + lookbackWindow - 1d < _bin, _bin, _bin + lookbackWindow - 1d))) // #2 last day connection will appear
| extend _range = range(_bin, _endRange, 1d) // #3 create a start of day timestamp for every day connection existed and/or day it will be counted
| mv-expand _range to typeof(datetime) // #4 expand to one row per day the connection is counted
| summarize Users = dcount(UserName) by Days=bin_at(_range, 1d, StartCountDay) // #5 count distinct users per start-of-day bin
| where Days >= StartWin // #6 chop off days we don't want to display
| sort by Days asc
Today I was looking at running multiple HTTP requests and combining all of the data into one JSON object in Power Automate to respond back to my PowerApp.
I was browsing through various blogs and articles but didn't find an efficient solution that didn't require a lot of steps or complicated data manipulation. I wanted to simply return the object from the Response action, without any kind of Apply to each or other sorcery in between.
My problem, though, was that some of the HTTP return objects had the same schema. In the beginning I was somewhat successful using the union(outputs('HTTP')['body'], outputs('HTTP_2')['body']) function. But that only helped for JSON objects with different schemas, as union simply overwrites properties that share the same name.
After some struggle I thought: why not simply build my own schema and refer to the output of the HTTP calls? And it worked! Power Automate can be so simple. It keeps amazing me 🙂
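To make that concrete, here is a minimal sketch of what the Response (or a Compose) body could look like, assuming two HTTP actions named HTTP and HTTP_2; the property names firstResult and secondResult are just placeholders I chose for illustration:

{
    "firstResult": @{outputs('HTTP')['body']},
    "secondResult": @{outputs('HTTP_2')['body']}
}

Because each HTTP output sits under its own property, identical schemas no longer overwrite each other, and the calling PowerApp can simply read firstResult and secondResult from the response.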
That way I was able to append or combine all JSON objects from my HTTP calls into one response and then properly process it from the calling PowerApp. Cool stuff!
Welcome to Part 1 (Frontend) of my multi-part series on automating your project intake request process. This article describes how to automatically roll out AVD host pools based on (M365) user input. I will try to give you a good enough picture to replicate it, but keep in mind there are so many specifics in how we implement all of these technologies that this will not be an easy copy-and-paste guide! See it as a blueprint and swap out technologies / steps where needed. I see a lot of folks using ADO as their CI/CD tool or even Bicep as their IaC tool; all of that can probably work. By no means is this the perfect solution, but it helped us automate a lot of our work.
My problem
I work for quite a big company that works with numerous vendors, which handle a variety of tasks for us. The vendors we work with get (A)AD accounts from us but no hardware, and most of them need access to our internal systems. We had been using Horizon View for this for almost a decade, but Azure Virtual Desktop has now become our new standard. When onboarding new vendors, we want to separate them into their own Resource Group, AVD Host Pool, Storage Accounts and so on. Every vendor has different requirements, so we need to be able to keep them separate.
The design
That’s a rough diagram of our setup. We will focus on the red circle. Everything else is already stood up by our enterprise architecture group and we tap into it.
The red circle will include:
Resource Group
Host Pool
Virtual Machine(s)
DSC extensions for domain join and host pool join
Our requirements
Let users request their own AVD pools (for either new projects, new vendors that they hired or simply for testing AVD)
These requests need to go through an approval process (cost center owner)
Deployment needs to be consistent with our Azure landing zone policy (tags, naming, etc.)
State files need to be kept in case the requester wants to modify the number of machines or my team wants to clean up the pool again
My solution
We want to use IaC automation. Even better, a solution users can self-service. And in the best case (yes, there is more) the whole process should happen in a quality-controlled manner, where changes can be made and tested without impacting production. While our host pools are all connected to the same infrastructure, you can use the concept below to create separate vnets & subnets and put NSGs on them. All possible! Tool-wise, we defined Gitlab.com as our DevOps tool for pretty much all automation projects. And since we are also a Microsoft shop, I simply made use of the apps that come with that toolkit. So that is what I went with, and these are the requirements for this blog post:
Frontend: Sharepoint, Powerapp and Flow
Codebase & CI/CD: Gitlab
IaC: Terraform
Frontend
Sharepoint
First things first. We have to create something users can enter their data into and that keeps a history of records. While PowerApps is a very complex tool, all we have to start with is creating a Sharepoint list. Set up that list with the columns required for your project (if you prefer scripting the list instead of clicking it together, see the sketch at the end of this section). A good set of columns to start with:
ProjectTitle (Single line of text)
Owner (Person or Group)
Justification (Multiple lines of text)
Region (Choice)
Number of Machines (Number)
CostCenter (Single line of text)
In addition to this list you will have more columns automatically added by Sharepoint that we are going to use later (e.g. CreatedBy or ID). We can also add more features like Spot instances for validation environments, multi-session or single-session deployments, or even a column for choosing the VM_Size. But that will be part of an advanced section 🙂
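As mentioned above, if you would rather script the list than click it together, a rough sketch using PnP PowerShell could look like the following. The site URL, list name and Region choices are placeholders I picked for illustration, so adjust them to your environment:

Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/AVDIntake" -Interactive
New-PnPList -Title "AVD Pool Requests" -Template GenericList
# Columns matching the list above
Add-PnPField -List "AVD Pool Requests" -DisplayName "ProjectTitle" -InternalName "ProjectTitle" -Type Text -AddToDefaultView
Add-PnPField -List "AVD Pool Requests" -DisplayName "Owner" -InternalName "Owner" -Type User -AddToDefaultView
Add-PnPField -List "AVD Pool Requests" -DisplayName "Justification" -InternalName "Justification" -Type Note -AddToDefaultView
Add-PnPField -List "AVD Pool Requests" -DisplayName "Region" -InternalName "Region" -Type Choice -Choices "westeurope","northeurope" -AddToDefaultView
Add-PnPField -List "AVD Pool Requests" -DisplayName "Number of Machines" -InternalName "NumberOfMachines" -Type Number -AddToDefaultView
Add-PnPField -List "AVD Pool Requests" -DisplayName "CostCenter" -InternalName "CostCenter" -Type Text -AddToDefaultView

Either way the end result is the same list; the flow and the PowerApps form do not care how it was created.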
PowerApps
After you have set up the Sharepoint list you can start adding a PowerApps form frontend. To do that, click on "Integrate > Power Apps > Customize forms". Feel free to add whatever information you need for deploying your project later. For a start, Project Title, Owner, Cost Center and Region are enough.
Below is an example of my production form.
Flow
Flow will be the frontend brain. When a user submits the form above, a new Sharepoint list item is created. Once that item is created, Microsoft Flow triggers and picks up the work. We will use Flow to manage the user-facing interaction through O365 and to hand over to Gitlab using the Pipeline Trigger API. Below is an example of the flow we are using, to give you a quick start.
Watch out: to make API calls you need a Flow premium license assigned to the owner of the flow. The good news is you only need one, no matter how many people use the Powerapp that feeds it.
If the Owner's manager approves, the next flow kicks off and does the actual work.
As you can see, we do some housekeeping during the flow, like updating the status column or sending out emails to keep users informed along the way. The magic really happens inside the PipelineTrigger step, though. This is where we call the Gitlab API to hand over our payload (the user input) to Terraform. The flow then repeatedly calls the Gitlab API to report on the status of the pipeline. If it succeeds, that's fine; if not, we send a message to the AVD team to check what's going on.
The pipeline trigger call is shown as an example below. The <> values come from dynamic fields in Flow.
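For reference, this is roughly what the two Gitlab calls look like as plain HTTP, assuming a pipeline trigger token created in the Gitlab project and a personal or project access token for the status check; the pipeline variable names are placeholders and should match whatever your .gitlab-ci.yml and Terraform code expect:

POST https://gitlab.com/api/v4/projects/<project-id>/trigger/pipeline
    token=<trigger-token>
    ref=<branch, e.g. main>
    variables[PROJECT_TITLE]=<ProjectTitle>
    variables[REGION]=<Region>
    variables[VM_COUNT]=<NumberOfMachines>

GET https://gitlab.com/api/v4/projects/<project-id>/pipelines/<pipeline-id>
    PRIVATE-TOKEN: <access-token>

The POST response contains the pipeline id, and the GET call returns a status field (e.g. pending, running, success, failed) that the flow polls until the pipeline is done.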
So, as of yesterday I finally finished something that I had in my mind for such a long time.
My (FPV) Quadcopter X setup!
As I do not have any flying skills yet, I will not get the FPV (First Person View) equipment just yet.
For now I definitely want to practice, otherwise I would just crash my stuff.
This is my parts list, and I am really, really satisfied with the quality / performance of the parts.
You can basically call it a budget build, but it's not trading quality against price. So it's not super, super cheap.
The Afro 20A ESCs have different bullet connectors than the motors, so you need to be aware of that if you do not want to solder things.
Sadly, the Afro 12A ESCs were out of stock at the time I ordered, but the 20A ones do the same job.
Build everything starting from the motors to the “wings” to the frame.
After that, start with the transmitter and flight controller.
Try to imagine the whole setup in the beginning or just check out the internet for similar builds.
One thing to mention is this Youtube playlist that helped me a lot during the whole process:
Keep in mind that you need to check the motor setup (which motor is where) on your flight controller.
At first I built everything counting from 1 to 4, starting at the upper left.
Big mistake! You need to check the layout in your flight controller's configuration software.