Get all the available API versions in Azure

Today a colleague came to me with an ARM template, asking why certain elements did not seem to be processed properly when he deployed the template to Azure.
We came to the conclusion that he was referencing an old apiVersion in his ARM template, one that did not support this element yet.

While we mostly use Visual Studio to build ARM templates (and, being a bit lazy, tend to use the ‘add a new resource’ button), this almost never populates the actual latest available API version for the resource.

Using PowerShell to our advantage, however, we can quickly retrieve all the latest available versions from Azure.

The “Get-AzureRmResourceProvider -ListAvailable” cmdlet will give you all the available AzureRm resource providers.
If you dig a bit deeper into the object, you will notice that the ResourceTypes property contains the resource types, locations, and API versions.

Let’s grab all these details and display them in a readable list.

$Namespaces = (Get-AzureRmResourceProvider -ListAvailable).ProviderNamespace
foreach ($Namespace in $Namespaces) {
    (Get-AzureRmResourceProvider -ProviderNamespace $Namespace).ResourceTypes |
        Select-Object @{l='Namespace';e={$Namespace}},
                      ResourceTypeName,
                      @{l='ApiVersions';e={$_.ApiVersions -join ', '}}
}

Now we can get a full overview of the available AzureRM API versions per resource type.

Tip: do you only need to see the resources you are currently using? Try removing the -ListAvailable switch.
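If you already know which resource type you are after, you can also filter down to just that one. A minimal sketch (Microsoft.Compute/virtualMachines is just an example resource type):

```powershell
# List the available API versions for a single resource type
((Get-AzureRmResourceProvider -ProviderNamespace Microsoft.Compute).ResourceTypes |
    Where-Object ResourceTypeName -eq 'virtualMachines').ApiVersions
```

The first entry returned is typically the most recent API version, which is the one you would normally reference in your ARM template.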

xAzureTempDrive DSC Module

Today I published my xAzureTempDrive DSC Module to the PowerShell Gallery.

Gallery source:
Github source:

This module contains a resource that will change the temporary disk’s default drive letter from D: to whatever you would like it to be.

In the past I have struggled with this, since the temporary disk is always attached as the D-drive by default.
Most application teams and customers, however, would like to use the D-drive for their own data (or other purposes), and would rather have the temporary disk assigned to a T-drive, or maybe even a Z-drive.

To change this, however, you would first need to remove the page file from the volume, then reboot the server, then change the drive letter, and then, if you want to go through the hassle, move the page file back to the temporary drive again.

Well, that is all great, but when Azure performs maintenance, or when you deallocate (stop) the virtual machine and start it again… it automatically gets a new temporary disk, and voilà, it is attached to the first available drive letter once again!

This DSC resource will help you with that part.

Since continuous polling is the nature of DSC, it will keep checking for you (by looking at the assigned page file) which drive letter the temporary disk has been given.
If this is anything other than what you have defined, it will remove the page file, reboot the server, change the temporary disk’s drive letter, and then attach the page file to it again for you.
This way the temporary disk will -always- remain on the drive letter you want it to be!
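Using it from a configuration looks roughly like this. A minimal sketch: the resource and parameter names below are assumptions based on the module’s description, so check the GitHub readme for the exact schema:

```powershell
Configuration TempDriveToT {
    # Assumed resource name and DriveLetter parameter; verify against the module
    Import-DscResource -ModuleName xAzureTempDrive

    Node 'localhost' {
        xAzureTempDrive TempDisk {
            DriveLetter = 'T'
        }
    }
}

# Compile and apply the configuration locally
TempDriveToT -OutputPath .\TempDriveToT
Start-DscConfiguration -Path .\TempDriveToT -Wait -Verbose
```

Since the resource may trigger a reboot, make sure your Local Configuration Manager allows it (RebootNodeIfNeeded = $true).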

I hope you will enjoy this simple resource as much as I do, since I love seeing all those temporary disks with the same drive letter across all the servers in my domain 🙂


Creating goals, mastering challenges, and realising dreams.

Somebody once told me that to live your life to the fullest, you have to chase your dreams.
This is exactly what I have been working towards throughout my entire career, although the biggest step in this might be today.

When I was only 3 years old, my eldest brother introduced me to computers, the Commodore 64 to be exact. I will never forget that time, especially spending hours copying games from my receiver to my disks.
Ever since that moment computers have hypnotized me; I love them, and I made it my hobby to learn everything I wanted to know about them. My goal was to do ‘tricks’ with them, to do the things you weren’t supposed to do, or to just break the software and try to fix it again (and hopefully learn something along the way).

Years passed, I got older, and my goals changed. Somebody asked me what I would like to become when I grew up. I could not put my finger on exactly what I wanted to do, but it had to be something with computers; I wanted to make a living out of my hobby.
Eventually I switched my education to study ICT at the Nova Collega in Hoofddorp, and after that I found my first job as a helpdesk employee for SNT/Wanadoo.

Shortly after this I found my next goal: the old-fashioned way we were working in IT, there had to be a better way than that. My new goal became to automate every action I had to do manually at least 3 times in a row. This did not make all managers happy at the time, since automating would cost precious time without showing direct results, so I made this my new hobby.
Finally, years later, after teaching myself Delphi, Visual Basic, and Java, a new language started becoming a big player on the market: PowerShell.
It was during this period that I had to perform a lot of daily manual actions for Exchange at a customer, and I quickly noticed that writing a few minor scripts made my day a hell of a lot easier.
After I showed this to management, they asked me to do it more often, usually for deployments or virtual servers.

Eventually I got in touch with automating VMware, and later on Hyper-V, and I changed my goals again. I wanted to do more with virtualization, and eventually more with cloud technologies.
Everybody talked about the ‘Cloud’, but what did that really mean? I did not exactly know yet at that time, but I did know I wanted to learn a lot about it and share it with the people around me.
I started combining my new passion for cloud technologies with the scripting knowledge I had been building up. I began to automate deployments, write additional code to manage Hyper-V environments more easily, and eventually wrote scripts to deploy ‘roles’ to servers. Because, be honest, how many people want an empty server? They want specific applications or functions, and, perhaps most importantly, they want every machine to be exactly the same after a deployment (apart from its specifications).

Again I learned quite a lot, and technologies have changed big time these last few years.
This made me review my goals once more. I wanted to share all this knowledge with more people. I loved talking about the new things I had been working on, how you could use them in your daily job, and how to simplify managing your environments.
I started blogging, giving presentations at customers and at some events, and sharing code back to the community on GitHub. This is where I had landed up until now, and what I am still doing on a day-to-day basis.

However, about a year ago a new goal started growing in me. I loved working with automation, new Microsoft cloud solutions, and sharing stuff. But I wanted to do more.
Everywhere I looked around me, the big players and sourcing companies were recruiting and delivering generic systems engineers or generic automation engineers, but nobody positioned themselves on the market as ‘the experts’ for PowerShell or cloud automation. It became my dream to see if I could fill this gap.

At about this same time, I was placed at a customer together with my good friend and colleague Jeff Wouters. We had roughly the same ideas, and eventually we sat together to discuss our ideals and goals and see if we could realise them: create a company that is fully specialised in cloud & PowerShell automation. This is where the new Methos was born.

Since Jeff and I are both very community-minded, it probably won’t surprise you that we are trying to make a difference when it comes to communication between colleagues in the field.
You hire an expert? You don’t only receive the expertise of this individual, but the expertise of the whole Methos group. We believe that nobody knows everything, and that together you know more.
With enough contact between colleagues, people can learn and grow from each other’s expertise. Next to this, we encourage people to go to community events around their own expertise, and we invite customers to internal master classes on different topics.

In the next few years we will be focusing on our new dream and on building Methos. We will do more than our best to make this a successful company, and to become The cloud and datacentre experts in the Netherlands.

Merry X-mas and a Happy New Year!

It’s that time of year already, when the holiday celebrations start and everybody sends each other the best of wishes in cards. I usually do this the traditional way too, but I wanted to do it slightly differently this year 😉

Have a good year!


Activate Windows VMs using Powershell Direct

# This code requires Windows 10 Professional/Enterprise or Windows Server 2016
$vmname = Read-Host "VMName"
$key = Read-Host "Windows Key"
Invoke-Command -VMName $vmname -ScriptBlock {
    param ($key)
    Write-Verbose -Message "Attempting to inject Windows key: $key"
    cscript.exe $env:SystemRoot\System32\slmgr.vbs -ipk $key
    Write-Verbose -Message "Attempting to activate Windows"
    cscript.exe $env:SystemRoot\System32\slmgr.vbs -ato
} -ArgumentList $key


Zipping your DSC Resources

When you are setting up a DSC pull scenario, you will have to zip your DSC resources in the correct format, using the proper tooling.

First you need to zip the module’s primary directory and give the zip file the correct name, in the format ModuleName_Version.zip, and then place it in the “C:\Program Files\WindowsPowerShell\DscService\Modules” directory.

Please be aware that even though it is ‘just a zip file’, not all programs zip the same way. If you use 7-Zip or WinRAR, for example, you can run into Event ID 4104: Failed to extract the module from zipfile.

If you use the Windows native compression engine, however, everything will work out fine.
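On Windows Management Framework 5 you can do both the zipping and the required checksum generation from PowerShell itself, using the built-in Compress-Archive and New-DscChecksum cmdlets. A minimal sketch (the module name and version below are just illustrative):

```powershell
# Illustrative module name/version; adjust to your own resource module
$module  = 'xAzureTempDrive'
$version = '1.0.0'
$dest    = "C:\Program Files\WindowsPowerShell\DscService\Modules\$($module)_$($version).zip"

# Zip the module directory using the native Windows compression engine
Compress-Archive -Path "C:\Program Files\WindowsPowerShell\Modules\$module\*" `
                 -DestinationPath $dest

# Generate the accompanying .checksum file the pull server expects
New-DscChecksum -Path $dest
```

Compress-Archive uses the same compression engine as the Windows shell, so the resulting zip extracts fine on the pull clients, and New-DscChecksum creates the ModuleName_Version.zip.checksum file next to it.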