xAzureTempDrive DSC Module

Today I published my xAzureTempDrive DSC Module to the PowerShell Gallery.

Gallery source: https://www.powershellgallery.com/packages/xAzureTempDrive
Github source: https://github.com/DdenBraver/xAzureTempDrive

This module contains a resource that changes the default temporary disk drive letter from the D-drive to whatever you would like it to be.

In the past I have struggled with this, since the temporary disk is always attached to the D-drive by default.
Most application teams and customers, however, would like to use the D-drive for their own data (or other purposes) and would rather have the temporary disk assigned to a T-drive or maybe even a Z-drive.

To change this manually, you would first need to remove the page file from the volume, reboot the server, change the drive letter, and then, if you are willing to go through the hassle, move the page file back to the temporary drive again.

That is all fine, until Azure performs maintenance or you deallocate (stop) the virtual machine and start it again: the VM automatically gets a new temporary disk, and voilà, it is attached to the first available drive letter once again!

This DSC resource will help you with that part.

Because of the nature of DSC, it polls continuously and checks (by looking at the assigned page file) which drive letter the temporary disk currently has.
If this is anything other than what you have defined, it removes the page file, reboots the server, changes the temporary disk drive letter, and then attaches the page file to it again for you.
This way the temporary disk will always remain on the drive letter you want it to be!
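
Using it is a matter of a small configuration; a minimal sketch might look like the one below (the resource and parameter names are assumptions here, so check the examples that ship with the module):

```powershell
# Minimal sketch, assuming the module exposes a resource named xAzureTempDrive
# with a DriveLetter parameter; verify against the module's own examples.
Configuration TempDriveToT
{
    Import-DscResource -ModuleName xAzureTempDrive

    Node 'localhost'
    {
        xAzureTempDrive TempDisk
        {
            DriveLetter = 'T'
        }
    }
}

# Compile and apply; the LCM may need RebootNodeIfNeeded = $true,
# since the resource reboots the server as part of the change.
TempDriveToT -OutputPath 'C:\DSC\TempDriveToT'
Start-DscConfiguration -Path 'C:\DSC\TempDriveToT' -Wait -Verbose -Force
```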

I hope you will enjoy this simple resource as much as I do; I love seeing all those temporary disks using the same drive letter across all the servers in my domain 🙂

~Danny

Disable proxy settings at system level

Today I was facing some issues with the proxy server at the company I was working for.
It seemed that a rule had been applied that made all servers connect outbound through a proxy, instead of only the desktops as intended.

In an attempt to resolve this quickly, I searched the internet.
It is rather easy to find out how to disable the proxy settings using GPO, or at a user level. However, it was not that easy to find out how to disable this at a system level.

It turns out there are two registry values that need to be created (or modified) to do this.
Both live under HKLM:\SOFTWARE\Policies\Microsoft\Windows\CurrentVersion\Internet Settings.

Step 1, disable the user-based proxy settings:
Under HKLM:\SOFTWARE\Policies\Microsoft\Windows\CurrentVersion\Internet Settings there is a DWORD called ProxySettingsPerUser. If this is set to 0, per-user proxy settings are disabled and the system-wide settings are used. Set it back to 1, or remove the value entirely, to enable user-based proxy settings again.

Step 2, disable the "automatically detect proxy settings" checkbox:
Under the same key there is a DWORD called EnableAutoProxyResultCache; set it to 0 to disable it.

Here is a simple script you can use to inject these settings into the registry directly.
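
A minimal sketch of such a script, based only on the two values described above:

```powershell
# Sketch: apply the two proxy policy values described above at system level.
$path = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\CurrentVersion\Internet Settings'

# Create the policy key if it does not exist yet
if (-not (Test-Path -Path $path)) {
    New-Item -Path $path -Force | Out-Null
}

# Step 1: use system-wide proxy settings instead of per-user settings
Set-ItemProperty -Path $path -Name 'ProxySettingsPerUser' -Value 0 -Type DWord

# Step 2: disable the automatic proxy detection result cache
Set-ItemProperty -Path $path -Name 'EnableAutoProxyResultCache' -Value 0 -Type DWord
```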

~Danny

Creating goals, mastering challenges, and realising dreams.

Somebody once told me that to live your life to the fullest, you have to chase your dreams.
This is exactly what I have been doing throughout my entire career, although the biggest step of all might be the one I am taking today.

When I was only 3 years old, it was my eldest brother who introduced me to computers. To be exact, it was the Commodore 64. I will never forget that time, especially spending hours copying games from my receiver to my disks.
Ever since that moment, computers have hypnotized me. I love them, and I made it my hobby to learn everything I wanted to know about them. My goal was to do ‘tricks’ with them, to do the things you weren’t supposed to do, or to just break the software and try to fix it again (hopefully learning something extra along the way).

Years passed, I grew older, and my goals changed. Somebody asked me what I would like to become later in life. I could not put my finger on exactly what I wanted to do, but it had to be something with computers; I wanted to make a living out of my hobby.
Eventually I switched my education to study ICT at the Nova College in Hoofddorp, and after that I found my first job as a helpdesk employee for SNT/Wanadoo.

Shortly after this I found my next goal: there had to be a better way than the old-fashioned way we were working in IT. My new goal was to automate every action I had to do manually at least 3 times in a row. This did not make all managers happy at the time, since automating cost precious time in which they did not see direct results, so I made it my new hobby.
Years later, after teaching myself Delphi, Visual Basic and Java, a new language started becoming a big player on the market: PowerShell.
It was during this period that I had to do a lot of daily manual actions for Exchange at a customer, and I quickly noticed that writing a few minor scripts made my day a hell of a lot easier.
After I showed this to management they asked me to do it more often, usually for deployments or virtual servers.

Eventually I got into automating VMware, and later on Hyper-V. I changed my goals again: I wanted to do more with virtualization, and eventually more with cloud technologies.
Everybody talked about the ‘cloud’, but what did that really mean? I did not exactly know yet at the time, but I did know I wanted to learn a lot about it and share it with the people around me.
I started combining my new passion for cloud technologies with the scripting knowledge I had been building up. I began to automate deployments, write additional code to manage Hyper-V environments in an easier way, and eventually wrote scripts to deploy ‘roles’ to servers. Because, be honest, how many people want an empty server? They want specific applications or functions, and perhaps most importantly, they want every machine to be exactly the same after a deployment (apart from its specifications).

Again I learned quite a lot, and technologies have changed big time these last few years.
This made me review my goals once more. I wanted to share all this knowledge with more people. I loved talking about the new stuff I had been working on, how you could use it in your daily job, and how it could simplify managing your environments.
I started blogging, giving presentations at customers and at some events, and sharing code back to the community on GitHub. This is where I had landed up until now, and what I am still doing on a day-to-day basis.

However, about a year ago a new goal started growing in me. I loved working with automation and new Microsoft cloud solutions, and sharing what I learned, but I wanted to do more.
Everywhere I looked, big players and sourcing companies were recruiting and delivering generic systems engineers or generic automation engineers, but nobody positioned themselves on the market as ‘the experts’ for PowerShell or cloud automation. It became my dream to see if I could fill this gap.

At about the same time, I was placed at a customer together with my good friend and colleague Jeff Wouters. We had roughly the same ideas, and eventually we sat down together to discuss our ideals and goals, and to see if we could realise them: create a company that is fully specialised in cloud & PowerShell automation. This is where Methos was born.

Since Jeff and I are both very community-minded, it probably won’t surprise you that we are trying to make a difference when it comes to communication between colleagues in the field.
You hire an expert? You don’t only receive the expertise of that individual, but the expertise of the whole Methos group. We believe that nobody knows everything, and that together you know more.
If there is enough contact between colleagues, people can learn and grow from each other’s expertise. Next to this, we will encourage people to go to community events in their own areas of expertise, and will invite customers to internal master classes on different topics.

Over the next few years, we will be focusing on our new dream and on building Methos. We will do more than our best to make it a successful company, and to make it the cloud and datacentre experts in the Netherlands.

WAP Websites UR9 and External Storage issue (FTP: “550 Command not allowed”)

Over the last few weeks we upgraded the test environment of Windows Azure Pack Websites at a customer to Update Rollup 9, and Azure Pack itself to Update Rollup 10.
Since this update we were seeing strange behavior in the environment when uploading data using FTP.

All the subscriptions and websites that were already running before the upgrade seemed to be fine; however, an issue appeared on newly created websites.
Whenever we connected using FTP, we could list all data, but for some reason we could not upload anything.

The exact error we were receiving was:

“550 Command not allowed”

However, when we uploaded using Git or the publish template, everything worked fine.

After some digging and sending detailed information over to Microsoft, we received an answer from Joaquin Vano.
There appears to be a bug in Azure Pack Websites where quotas are enforced using the wrong method for external file servers.
This check then fails and kills any upload to the share with an access denied message.

You can resolve this issue with the following work-around:
Go to “C:\Windows\System32\inetsrv\config\applicationHost.config” on the Publisher servers.

Edit the following key value:

and change it to False:

We have been made aware that this issue will be resolved in the next update for Azure Pack Websites.


Create custom Webworkers in Windows Azure Pack Websites

Today a customer came to me asking whether it would be possible to create your own web workers in Windows Azure Pack Websites.
The reason is that they want customers within a single subscription to be able to run one website at a “small” scale and, for example, another at a “medium” scale.

With the default instances available this is not possible.
When you install Windows Azure Pack Websites you get 2 compute modes (Shared and Dedicated) and 3 SKU modes (Free, Shared and Standard).
Each SKU mode has one or more worker tiers:

Free: Shared
Shared: Shared
Standard: Small, Medium and Large

So to reach this goal we need to create dedicated SKUs for the existing tiers, or new tiers in new dedicated SKUs.
When I started looking for documentation I found that there was not much available describing this; however, when I started looking for cmdlets I found the following:
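
On the Websites controller you can track these cmdlets down yourself; a quick discovery sketch (the wildcard patterns are assumptions, adjust them to what your controller exposes):

```powershell
# List cmdlets whose noun mentions SKUs or worker tiers
# (run this on the Windows Azure Pack Websites controller).
Get-Command -Noun *Sku*, *WorkerTier* |
    Sort-Object -Property ModuleName, Name |
    Format-Table -Property ModuleName, Name
```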

After this, it was easy to create a new worker tier and a new SKU in a few steps.

[Screenshot: CustomWebworker]

Now I was able to add this new worker into my websites cloud, and use it in my Windows Azure Pack portal!

New SQLServer DSC Resource cSQLServer

Today we released a new SQL Server DSC resource, created by merging the existing xSQLServer and xSqlPs resources and adding new functionality such as:

– SQL Always-On (with domain credentials)
– SQL Always-On between 2 Clusters
– SQL File streaming
– SQL Always-On with Listener

For more information please check the source at:

cSQLServer:       https://github.com/Solvinity/cSQLServer
cFailoverCluster: https://github.com/Solvinity/cFailoverCluster
Demo Video:       https://youtu.be/l8KwLUtXNB8

A more in-depth article will be published early January!

-Danny

Merry X-mas and a Happy New Year!

It’s that time of year again, when the holiday celebrations start and everybody sends each other their best wishes in cards. I usually do this the traditional way too, but I wanted to do it slightly differently this year 😉

https://github.com/DdenBraver/Xmas-Tree

Have a good year!

-Danny

Activate Windows VMs using Powershell Direct
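
PowerShell Direct, new in Windows 10 and Windows Server 2016 Hyper-V, lets you run commands inside a guest VM straight from the host, without any network connectivity to the VM. A minimal sketch of activating a guest that way (the VM name, credentials and product key below are placeholders):

```powershell
# Run on the Hyper-V host; prompts for credentials of an account inside the guest.
$cred = Get-Credential

# PowerShell Direct: -VMName targets the guest through the host, no network required.
Invoke-Command -VMName 'MyGuestVM' -Credential $cred -ScriptBlock {
    # Install the product key and activate Windows inside the guest
    # (replace the placeholder key with a valid one).
    cscript.exe //B "$env:windir\System32\slmgr.vbs" /ipk 'XXXXX-XXXXX-XXXXX-XXXXX-XXXXX'
    cscript.exe //B "$env:windir\System32\slmgr.vbs" /ato
}
```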

 

DSC Module: cManageCertificates

I just bumped into an issue where I needed to import a certificate onto a server before I could use it in a DSC resource. Since there was no DSC resource available yet to import certificates into, or remove them from, a particular store on a computer, I had to create one myself.

cManageCertificates

The result can be found here: https://github.com/DdenBraver/cManageCertificates

This resource is pretty simple: it uses the PowerShell cmdlet Import-PfxCertificate to import a certificate, and Remove-Item on “Cert:\StoreType\Store\Thumbprint” to remove a certificate if required.
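
Outside of DSC, those two operations look roughly like this (the file path, store and thumbprint are placeholders):

```powershell
# Prompt for the password protecting the PFX file.
$pfxPassword = Read-Host -Prompt 'PFX password' -AsSecureString

# Import a certificate into the local machine's Personal store.
Import-PfxCertificate -FilePath 'C:\certs\example.pfx' `
                      -CertStoreLocation 'Cert:\LocalMachine\My' `
                      -Password $pfxPassword

# Remove a certificate again via the Cert: drive, addressed by its thumbprint (placeholder shown).
Remove-Item -Path 'Cert:\LocalMachine\My\1234567890ABCDEF1234567890ABCDEF12345678'
```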

Using Powershell Packagemanager (OneGet) basics

One of the great new features of WMF5, included in Windows 10 RTM, is PowerShell PackageManagement (previously called OneGet).

I will not go into the details of how it works or what it is (you can find that here!), but I do want to show you how you can use it to simply install some packages from the PowerShell Gallery, or from Chocolatey, which is basically a community-based package source that already includes a lot of applications and tools.

Let’s take a look at the basic commands:
Find-Package: finds the package you are looking for in the registered package sources
Get-Package: lists the packages on your system that have been installed through PackageManagement
Install-Package: installs a package on your system
Uninstall-Package: removes a package from your system that was installed through PackageManagement

When you first kick off Find-Package you may get a prompt saying that the NuGet package provider has not been installed yet and needs to be downloaded and installed.

If you select Yes, you will then (by default) get a list of available packages from the PowerShell Gallery only.

Let’s now install the Chocolatey package provider so that we can use PackageManagement to install the available community packages.
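
One way to pull it in (requesting the provider bootstraps it if it is missing):

```powershell
# Requesting the Chocolatey provider downloads and installs it if it is not present yet;
# -ForceBootstrap skips the confirmation prompt.
Get-PackageProvider -Name Chocolatey -ForceBootstrap
```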

If you now run Find-Package once again, you will suddenly see a lot of additional packages that are available from the source ‘chocolatey’. Please be aware, however, that not all of these packages can be trusted (please read the Chocolatey FAQ).

You can simply install packages by using Install-Package, or pipe Find-Package into Install-Package to also install the dependencies (if any). For example:
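
Something along these lines (the package names are just examples):

```powershell
# Install a package straight from the Chocolatey source
Install-Package -Name 'notepadplusplus' -ProviderName Chocolatey -Force

# Or find it first and pipe the result into Install-Package
Find-Package -Name '7zip' -ProviderName Chocolatey | Install-Package -Force
```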

After the software has been installed you can review what is installed by using Get-Package.
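
For example, to list everything in a compact table:

```powershell
# Show everything that was installed through PackageManagement
Get-Package | Sort-Object -Property Name | Format-Table -Property Name, Version, ProviderName
```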

Here is an example script that you can use to install some software on your local desktop with PackageManagement: oneget_example.ps1

I hope you enjoy this new functionality as much as I do. It could greatly help with package deployment and management in the future; you could, for example, set up your own package repository and deploy your own internal packages using a NuGet server.