Azure: The request was aborted: Could not create SSL/TLS secure channel.

Are you running into the following error when trying to log in to Azure?

Add-AzureRmAccount : accessing_ws_metadata_exchange_failed: Accessing WS metadata exchange failed: The request was
aborted: Could not create SSL/TLS secure channel.
At line:5 char:1
+ Add-AzureRmAccount -Credential $AzureAutomationCredential
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Add-AzureRmAccount], AadAuthenticationFailedException
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.Profile.AddAzureRMAccountCommand

This may happen if your company redirects your login and has disabled TLS 1.0/1.1, which the Automation session uses by default.

You can add the following line to the top of your PowerShell code to get around this issue:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
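
For context, here is a minimal sketch of how that line could sit at the top of an Automation runbook, before the login call; the credential asset name is just an example based on the error above:

# Force TLS 1.2 for all web calls in this session
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

# Example only: retrieve a credential asset and log in to Azure
$AzureAutomationCredential = Get-AutomationPSCredential -Name 'AzureAutomationCredential'
Add-AzureRmAccount -Credential $AzureAutomationCredential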

This issue is currently active with the following modules/tools:
– Azure Automation (10/8/2018)
– AzureRm Module version 6.9.0
– AZ Module version 0.2.2

Download from an Azure blob using the Azure REST API

Today a colleague came to me with the question whether it is possible to download a file from Azure Storage onto a Windows server.
Now this would normally be fairly simple, however:

1. We were not allowed to use an external tool (like azcli/azcopy).
2. The server runs Windows PowerShell 4.0 (and an upgrade is not possible at this time).
3. The AzureRm(.Storage) PowerShell module does not support PowerShell 4.0.
4. We had to use a SAS token to download the files from the Azure Storage Account.

I figured that the only option we had left was to use Invoke-WebRequest against the Azure REST API.
Digging into the documentation on Microsoft Docs I found the following article: https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1

If you read the documentation closely, you can see that the SAS token acquired from the Storage Account can simply be appended to the end of the request URI to authenticate.
Because of this, there is no need to add an additional header to the web request, which makes it fairly simple to use.

The result is the following simple function, which we can now use to download files from blobs within Azure from *any* Windows server with access to Azure.

Function Get-AzureBlobFromAPI {
    param(
        [Parameter(Mandatory)]
        [string] $StorageAccountName,
        [Parameter(Mandatory)]
        [string] $Container,
        [Parameter(Mandatory)]
        [string] $Blob,
        [Parameter(Mandatory)]
        [string] $SASToken,
        [Parameter(Mandatory)]
        [string] $File
    )

    # documentation: https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1
    Invoke-WebRequest -Uri "https://$StorageAccountName.blob.core.windows.net/$Container/$($Blob)$($SASToken)" -OutFile $File
}
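
As an example, a call with placeholder values (note that the SAS token is expected to include its leading ‘?’, since the function simply concatenates it onto the URI) could look like this:

# Example only: download a single blob using a SAS token; all values are placeholders
Get-AzureBlobFromAPI -StorageAccountName 'mystorageaccount' `
    -Container 'backups' `
    -Blob 'database.bak' `
    -SASToken '?sv=2017-11-09&ss=b&srt=co&sp=rl&sig=...' `
    -File 'C:\Temp\database.bak'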

Creating goals, mastering challenges, and realising dreams.

Somebody once told me that to live your life to the fullest, you have to chase your dreams.
This is exactly what I have been working on throughout my entire career, although the biggest step might be the one I am taking today.

When I was only 3 years old, my eldest brother introduced me to computers. To be exact, it was the Commodore 64. I will never forget that moment, especially the hours spent copying games from my receiver to my disks.
Ever since that moment, computers have hypnotized me. I love them, and I made it my hobby to learn everything I wanted to know about them. My goal was to do ‘tricks’ with them, to do the things you weren’t supposed to do, or to just break the software and try to fix it again (and hopefully learn something extra along the way).

Years passed and as I grew older, my goals changed. Somebody asked me what I would like to become when I was older. I could not quite put my finger on what exactly I wanted to do, but it had to be something with computers; I wanted to make a living out of my hobby.
Eventually I switched my education to study ICT at the Nova Collega in Hoofddorp, and after that I found my first job as a helpdesk employee for SNT/Wanadoo.

Shortly after this I found my next goal: the old-fashioned way we had been working in IT, there had to be a better way. My new goal was to automate every action I had to do manually at least three times in a row. This did not make all managers happy at the time, since automating would cost precious time without a directly visible result, so I made it my new hobby.
Finally, years later, after teaching myself Delphi, Visual Basic and Java, a new language started becoming a big player on the market: PowerShell.
It was during this period that I had to do a lot of daily manual actions for Exchange at a customer, and I quickly noticed that writing a few minor scripts made my day a hell of a lot easier.
After I showed this to management, they asked me to do this more often, usually for deployments or virtual servers.

Eventually I got in touch with automating VMware, and later on Hyper-V. I changed my goals again: I wanted to do more with virtualization, and eventually more with cloud technologies.
Everybody talked about the ‘Cloud’, but what did that really mean? I did not know exactly at the time, but I did know I wanted to learn a lot about it and share it with the people around me.
I started combining my new passion for cloud technologies with the scripting knowledge I had been building up. I began to automate deployments, write additional code to manage Hyper-V environments in an easier way, and eventually wrote scripts to deploy ‘roles’ to servers. Because, be honest, how many people want an empty server? They want specific applications or functions and, perhaps most importantly, they want every machine to be exactly the same after a deployment (apart from its specifications).

Again I learned quite a lot, and technologies have changed enormously these last few years.
This made me review my goals once more. I wanted to share all this knowledge with more people. I loved talking about the new things I had been working on, how you could use them in your daily job, and how to simplify managing your environments.
I started blogging, giving presentations at customers and at some events, and I also started sharing code back to the community on GitHub. This is where I have landed up until now, and what I am still doing on a day-to-day basis.

However, about a year ago a new goal started growing in me. I loved working with automation, new Microsoft cloud solutions, and sharing what I learned, but I wanted to do more.
Everywhere I looked, big players and sourcing companies were recruiting and delivering generic systems engineers or generic automation engineers, but nobody positioned themselves on the market as ‘the experts’ for PowerShell or cloud automation. It became my dream to see if I could fill this gap.

At about this same time, I was placed at a customer together with my good friend and colleague Jeff Wouters. We had roughly the same ideas, and eventually we sat together to discuss our ideals and goals and to see if we could realise them: create a company that is fully specialised in cloud & PowerShell automation. This is where Methos was born.

Since Jeff and I are both very community-minded, it probably won’t surprise you that we are trying to make a difference when it comes to communication between colleagues in the field.
When you hire an expert, you don’t only receive the expertise of that individual, but the expertise of the whole Methos group. We believe that nobody knows everything, and that together you know more.
If there is enough contact between colleagues, people can learn and grow from each other’s expertise. Next to this, we will encourage people to go to community events around their own areas of expertise, and we will invite customers to internal master classes on different topics.

Over the next few years we will be focusing on our new dream and on building Methos. We will do more than our best to make this a successful company, and to become the cloud and datacentre experts in the Netherlands.

WAP Websites UR9 and External Storage issue (FTP: “550 Command not allowed”)

Over the last few weeks we upgraded the test environment of Windows Azure Pack Websites at a customer to Update Rollup 9, and Azure Pack itself to Update Rollup 10.
Since this update we were seeing strange behavior in the environment when uploading data using FTP.

All the subscriptions and websites that were already running before the upgrade seemed to be fine; however, an issue appeared after creating new websites.
Whenever we connected using FTP, we could list all data, but for some reason we could not upload anything.

The exact error we were receiving was:

“550 Command not allowed”

However, when we uploaded using Git or the publish template, everything worked fine.

After some digging and sending detailed information over to Microsoft, we received an answer from Joaquin Vano.
There appears to be a bug in Azure Pack Websites where quotas are enforced using the wrong method for external file servers.
This check then fails and kills any upload to the share with an access denied message.

You can resolve this issue with the following workaround:
Open “C:\Windows\System32\inetsrv\config\applicationHost.config” on the Publisher servers.

Edit the following key value:

<add key="storageQuotaEnabled" value="True" />

and change it to False:

<add key="storageQuotaEnabled" value="False" />
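
If you have several Publisher servers, something like the sketch below could save some manual editing. It assumes the key is reachable with the XPath shown, so back up the file and verify this in your own environment first:

# Sketch only: flip storageQuotaEnabled to False in applicationHost.config (run elevated)
$configPath = 'C:\Windows\System32\inetsrv\config\applicationHost.config'
[xml]$config = Get-Content -Path $configPath -Raw
$setting = $config.SelectSingleNode("//add[@key='storageQuotaEnabled']")
$setting.value = 'False'
$config.Save($configPath)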

We have been made aware that this issue will be resolved in the next update for Azure Pack Websites.


Create custom Webworkers in Windows Azure Pack Websites

Today a customer came to me with the question whether it would be possible to create your own web workers in Windows Azure Pack Websites.
The reason for this is that they want customers within one subscription to be able to have one website on a “small” scale and, for example, another on a “medium” scale.

With the default instances available this is not possible.
When you install Windows Azure Pack Websites you get two compute modes (Shared and Dedicated) and three SKU modes (Free, Shared and Standard).
Each SKU mode has one or more worker tiers:

Free: Shared
Shared: Shared
Standard: Small, Medium and Large

So to reach this goal, we need to create new dedicated SKUs for the existing tiers, or new tiers in new dedicated SKUs.
When I started looking for documentation I found there was not much available describing this; however, when I searched for cmdlets I found the following:

get-command *sku* -Module websitesdev

CommandType     Name                                               Version    Source
-----------     ----                                               -------    ------
Cmdlet          Get-WebSitesSku                                    1.0        Websitesdev
Cmdlet          New-WebSitesSku                                    1.0        Websitesdev
Cmdlet          Remove-WebSitesSku                                 1.0        Websitesdev
Cmdlet          Set-WebSitesSku                                    1.0        Websitesdev

get-command *tier* -Module websitesdev

CommandType     Name                                               Version    Source
-----------     ----                                               -------    ------
Cmdlet          Get-WebSitesWorkerTier                             1.0        Websitesdev
Cmdlet          New-WebSitesWorkerTier                             1.0        Websitesdev
Cmdlet          Remove-WebSitesWorkerTier                          1.0        Websitesdev
Cmdlet          Set-WebSitesWorkerTier                             1.0        Websitesdev

After this, the next few steps to create a new worker tier and a new SKU were easy:

New-WebSitesWorkerTier -Name "Custom-Small" -ComputeMode "Dedicated" -NumberOfCores '1' -description "Custom Small Tier" -MemorySize '2048'
New-WebSitesSku -SkuName 'Custom' -ComputeMode 'Dedicated' -WorkerTiers 'Custom-Small'
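
To confirm everything registered correctly before heading to the portal, the Get-* counterparts of the cmdlets above can be used to list the worker tiers and SKUs:

# List all worker tiers and SKUs; the new 'Custom-Small' tier and 'Custom' SKU should show up
Get-WebSitesWorkerTier
Get-WebSitesSku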

(Screenshot: CustomWebworker)

Now I was able to add this new worker into my websites cloud, and use it in my Windows Azure Pack portal!