Moving resources with ARM

Resource Groups allow you to manage multiple resources within Azure as a logical group. Resource Groups are enabled via the Azure Resource Manager (ARM), which is the new mechanism for managing and securing resources in Azure.

If you have created resources via the old Azure Portal or Service Management APIs, then you may notice that they belong to Default-XXXX Resource Groups when viewed in the new Azure Portal.

Current State

As you can see from the following view of one of my subscriptions in the new Azure Portal, I have a Storage Account that is part of a Resource Group named Default-Storage-WestUS. I originally created this Storage Account in the old Azure Portal but have re-used it for the pbubuntu VM. The VM and its associated Cloud Service were created via the new Azure Portal and were placed into a Resource Group named pbubuntu.

Azure Resources - Before

I’ve been cleaning up a number of orphaned Default-XXXX Resource Groups this week. I did, however, want to keep the portalvhds2knfjr67c7f3q Storage Account, since it held the VHD for my VM. So I went looking for a way to move a resource from one Resource Group to another – specifically, to move the portalvhds2knfjr67c7f3q Storage Account into my pbubuntu Resource Group.

Trying to move the resource

I discovered the Azure PowerShell cmdlet Move-AzureResource, and the Move a resource section of the Using Azure PowerShell with Azure Resource Manager blog post described exactly what I was trying to do. But when I ran the cmdlet it returned immediately, with no error and no effect on my resources.

PS C:\> Get-AzureResource -ResourceId /subscriptions/[MY_SUBSCRIPTION_ID]/resourceGroups/Default-Storage-WestUS/providers/Microsoft.ClassicStorage/storageAccounts/portalvhds2knfjr67c7f3q | Move-AzureResource -DestinationResourceGroupName pbubuntu

When I involved Fiddler in the debugging effort I noticed that there was no traffic being sent across the wire. No REST API calls were being made. Then I discovered this issue on GitHub – #379 Moving Azure resources doesn’t do anything. This was related to version 0.9.1 of the Azure PowerShell cmdlets – which is what I was running.

No problem, I thought – I’ll use the Azure xplat cli. But I quickly found that the move feature was not yet implemented on the resource command …

Ok – back to basics then.

I’d look up the REST API endpoint and payloads and use raw REST calls. I opened up the Azure Resource Manager REST API Reference on MSDN and tried to find the REST API for moving a resource. Nothing … This was not going well.

Diving into cmdlet source code

My next step was to download the Azure PowerShell code from GitHub and attempt to discover the REST API endpoint and payload from the Move-AzureResource cmdlet source code. I built the code and ran the MoveResourceTest test to start debugging the cmdlet.

Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.MoveAzureResourceCommand

I managed to extract the REST API endpoint (managementUri) and infer the payload structure from the ResourceBatchMoveParameters class. I confirmed my findings with Ilya Grebnov, the Architect and Lead Engineer of Azure Resource Manager.

Thanks to Ilya for getting back to me so quickly and for correcting my initial attempt at the payload.

Moving that resource !

ARMClient is a simple command line tool to invoke the Azure Resource Manager API and can be found on GitHub. It is fairly low level and allows you to interact with the raw REST API directly. ARMClient is great in that it manages the Azure authentication tokens for you. Have a look at the ARMClient: a command line tool for the Azure API blog post to understand this tool better.

ARMClient is available via Chocolatey and I installed it as follows:

choco install armclient

I then created the payload in a file named move-resource.json. The targetResourceGroup is the full path to the Resource Group I’d like to move the resource to. In this case it’s the pbubuntu Resource Group in my subscription. The resources collection contains the full path to the resources I’d like to move. My collection consists solely of my portalvhds2knfjr67c7f3q Storage Account.

{
  "targetResourceGroup": "/subscriptions/[MY_SUBSCRIPTION_ID]/resourceGroups/pbubuntu",
  "resources": [
    "/subscriptions/[MY_SUBSCRIPTION_ID]/resourceGroups/Default-Storage-WestUS/providers/Microsoft.ClassicStorage/storageAccounts/portalvhds2knfjr67c7f3q"
  ]
}

You will be required to authenticate yourself to use the ARMClient. You can do that as follows:

PS C:\> armclient login

Then I issued a POST to the REST API endpoint for moving a resource out of my Default-Storage-WestUS Resource Group. I included a reference to my move-resource.json file (you need to include the @ prefix) and switched on verbose mode to get as much detail as possible.

PS C:\> armclient post https://management.azure.com/subscriptions/[MY_SUBSCRIPTION_ID]/resourceGroups/Default-Storage-WestUS/moveResources?api-version=2015-01-01 @C:\Dev\move-resource.json -verbose
---------- Request -----------------------

POST /subscriptions/[MY_SUBSCRIPTION_ID]/resourceGroups/Default-Storage-WestUS/moveResources?api-version=2015-01-01 HTTP/1.1
Host: management.azure.com
Authorization: Bearer eyJ0eXAiOiJKV...
User-Agent: ARMClient/1.0.0.0
Accept: application/json
x-ms-request-id: 586648f9-d5ad-43de-b281-e0269ec3cd48

{
"targetResourceGroup": "/subscriptions/[MY_SUBSCRIPTION_ID]/resourceGroups/pbubuntu",
"resources": [
"/subscriptions/[MY_SUBSCRIPTION_ID]/resourceGroups/Default-Storage-WestUS/providers/Microsoft.ClassicStorage/storageAccounts/portalvhds2knfjr67c7f3q"
]
}
---------- Response (5333 ms) ------------

HTTP/1.1 202 Accepted
Pragma: no-cache
Retry-After: 15
x-ms-ratelimit-remaining-subscription-writes: 1199
x-ms-request-id: acc728c3-7086-4774-9665-e30d15bf6084
x-ms-correlation-request-id: acc728c3-7086-4774-9665-e30d15bf6084
x-ms-routing-request-id: AUSTRALIAEAST:20150526T044311Z:acc728c3-7086-4774-9665-e30d15bf6084
Strict-Transport-Security: max-age=31536000; includeSubDomains
Cache-Control: no-cache
Date: Tue, 26 May 2015 04:43:11 GMT
Location: https://management.azure.com/subscriptions/[MY_SUBSCRIPTION_ID]/operationresults/eyJqb2JJZCI6IlJFU09VUkNFQkFUQ0hNT1ZFSk9CLURFRkFVTFQ6MkRTVE9SQUdFOjJEV0VTVFVTLVdFU1RVUyIsImpvYkxvY2F0aW9uIjoid2VzdHVzIn0?api-version=2015-01-01

After a few seconds, a quick check with the Get-AzureResource cmdlet (filtered to the pbubuntu Resource Group) showed that my Storage Account had been moved.

PS C:\> Get-AzureResource -ResourceGroupName pbubuntu

Name              : pbubuntu
ResourceId        : /subscriptions/[MY_SUBSCRIPTION_ID]/resourceGroups/pbubuntu/providers/Microsoft.ClassicCompute/domainNames/pbubuntu
ResourceName      : pbubuntu
ResourceType      : Microsoft.ClassicCompute/domainNames
ResourceGroupName : pbubuntu
Location          : westus
SubscriptionId    : [MY_SUBSCRIPTION_ID]

Name              : pbubuntu
ResourceId        : /subscriptions/[MY_SUBSCRIPTION_ID]/resourceGroups/pbubuntu/providers/Microsoft.ClassicCompute/virtualMachines/pbubuntu
ResourceName      : pbubuntu
ResourceType      : Microsoft.ClassicCompute/virtualMachines
ResourceGroupName : pbubuntu
Location          : westus
SubscriptionId    : [MY_SUBSCRIPTION_ID]

Name              : portalvhds2knfjr67c7f3q
ResourceId        : /subscriptions/[MY_SUBSCRIPTION_ID]/resourceGroups/pbubuntu/providers/Microsoft.ClassicStorage/storageAccounts/portalvhds2knfjr67c7f3q
ResourceName      : portalvhds2knfjr67c7f3q
ResourceType      : Microsoft.ClassicStorage/storageAccounts
ResourceGroupName : pbubuntu
Location          : westus
SubscriptionId    : [MY_SUBSCRIPTION_ID]

End State

And checking the new Azure Portal also confirms my portalvhds2knfjr67c7f3q Storage Account has indeed been moved into my pbubuntu Resource Group.

Azure Resources - After

I’m hoping that the next release of the Azure PowerShell cmdlets will fix the Move-AzureResource cmdlet and that the Azure xplat cli will implement this functionality in the near future. In the interim, you can use ARMClient and the endpoint and payload as described in this post.
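For completeness, here is a rough sketch of what the same call could look like without ARMClient. This is purely my own illustration rather than part of the walkthrough above: it uses the Python requests library and assumes you have already obtained a valid ARM bearer token by some other means.

# Rough sketch only – not part of the original walkthrough.
# Calls the same moveResources endpoint via the Python requests library.
import requests

subscription_id = "[MY_SUBSCRIPTION_ID]"
token = "BEARER_TOKEN_OBTAINED_ELSEWHERE"  # assumption: acquired out of band

url = ("https://management.azure.com/subscriptions/" + subscription_id +
       "/resourceGroups/Default-Storage-WestUS/moveResources"
       "?api-version=2015-01-01")

payload = {
    "targetResourceGroup": "/subscriptions/" + subscription_id + "/resourceGroups/pbubuntu",
    "resources": [
        "/subscriptions/" + subscription_id +
        "/resourceGroups/Default-Storage-WestUS/providers/"
        "Microsoft.ClassicStorage/storageAccounts/portalvhds2knfjr67c7f3q"
    ]
}

# Expect a 202 Accepted response with a Location header that can be polled,
# just like the ARMClient output above.
response = requests.post(url, json=payload,
                         headers={"Authorization": "Bearer " + token})
print(response.status_code, response.headers.get("Location"))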

Calling Python from PHP in Azure Websites

I was recently asked for help by an ISV. They needed to generate a SAS (Shared Access Signature) to grant time-limited access to resources that they were storing in Azure Blob Storage. They had developed their solution using PHP and deployed it to Azure Websites. I had generated SAS before using C# and .NET but never using PHP. No problem I thought … we have an Azure SDK for PHP and it’s available on GitHub !

Azure SDK for PHP

But after poring over the code I came to the realisation that there is no functionality to generate a SAS for Blob Storage in the PHP SDK. It seems this was first opened as an issue in 2012 but has not been resolved. So what to do?

I didn’t want to implement a solution that would place the burden of keeping up to date with Azure changes on the ISV. I had a quick look at the Python SDK and the weight lifted off my shoulders when I found sharedaccesssignature.py !!

Azure SDK for Python

I verified my approach with the Azure SDK team and got the thumbs up. I was good to go and I could sense the solution would be a matter of minutes away – little did I know …

Test Azure Website

I created a test Website in Azure called callingpythonfromphp and ensured that PHP was enabled (it is by default). I didn’t need to change the Python version setting since Python is installed by default (as are all the other languages). The switches you can see below enable http handlers for the respective languages.

Site Settings for callingpythonfromphp Website

You can check that Python is installed via the Debug Console in Kudu. Here you can see that Python 2.7 is available in D:\Python27.

Kudu - console

I added two files to the D:\home\site\wwwroot folder of my Website – generate-fakesas.py and test-fakesas.php.

Listing of wwwroot

The test-fakesas.php file simply called Python and passed it the generate-fakesas.py script to execute.

> "; system('D:\Pyt">
<?php echo ">> "; system('D:\Python27\python.exe generate-fakesas.py'); echo " <<"; ?>

The generate-fakesas.py script was so named because all it does is print the string “SAS” rather than actually generating one. This output appears in the PHP script’s response. In the final version it can be assigned to a PHP variable and utilised.

print ("SAS")

This was a quick test to check whether or not I could call a Python script from PHP in an Azure Website. The >> and << in the PHP script would allow me to visually confirm that the output of the Python script had been inserted where I expected.

Hitting the test-fakesas.php script via the Debug Console in Kudu gave me the expected result. I could see the SAS being output. I had successfully called into Python from the PHP script.

Execute PHP script

I then hit the test-fakesas.php script from the browser. Hmm – this was not good. The SAS string was nowhere to be seen.

Calling php script from Website

Looking at the php_errors.log file in D:\home\LogFiles I found the following:

PHP Warning: system(): Unable to fork [D:\Python27\python.exe generate-fakesas.py] in D:\home\site\wwwroot\test-fakesas.php on line 1

Errors in php_errors.log

The Fix

So there was a difference in behaviour between the Debug Console and the manner in which php-cgi was being launched. After a few emails back and forth with the Azure Websites team, they discovered that the source of the issue was the fastcgi.impersonate PHP setting. It is set to 1 by default and needed to be switched off (set to 0). This resulted in the following solution on the Kudu Xdt Transform Samples wiki page.

Custom php.ini

This solution basically allows you to deploy a custom php.ini file for your Azure Website and override settings. When you do this, ensure that ALL instances of fastcgi.impersonate=1 are changed to fastcgi.impersonate=0 and are NOT commented out.

Note that the applicationhost.xdt and php.ini files should be deployed to your D:\home\site folder.

Solution deployed

And now the call works from both the Debug Console in Kudu and the browser.

Finally - it works !

Ok – so that’s all great. But I actually needed to generate a SAS.

Generating a genuine SAS

DISCLAIMER – I’m going to preface this entire section by saying that I am not a Python guru.

The recommended mechanism for installing Python packages is pip but when I ran pip via the Debug Console in Kudu to install the Azure SDK …

D:\Python27\Scripts\pip.exe install azure

I got the following error.

error: could not create 'D:\Python27\Lib\site-packages\azure': Access is denied

I found that the only way I could install the Python Azure SDK via pip into the Azure Website was by starting off with Python’s virtualenv. I found a great blog post that got me up and running quickly. I created a new development environment called myapp.

D:\Python27\Scripts\virtualenv --no-site-packages myapp

Create development environment

This creates an entirely new and isolated Python environment and copies the utilities and Python executables into it.

Next activate the development environment.

myapp\Scripts\activate

Activate development environment

And then install the Azure SDK.

pip install azure

Install Azure SDK

I added the test-sas.php file to the D:\home\site\wwwroot folder of my Website and the generate-sas.py file to the D:\home\site\wwwroot\myapp folder.

Python and PHP scripts

The test-sas.php file simply called Python and passed it the generate-sas.py script to execute. You can see that it is using the Python executable in the myapp development environment. This means it will also have access to the Azure SDK I installed.

> "; system('myapp">
<?php echo ">> "; system('myapp\Scripts\python myapp\generate-sas.py'); echo " <<"; ?>

The generate-sas.py script was based on an example from StackOverflow. It will now actually generate a SAS granting read-only access to the images/flower.png blob within my imagesstoragepb storage account. This SAS is then output by the PHP script.

from azure.storage import *
# Access policy: read-only ('r') permission, valid from 8 to 10 January 2015
accss_plcy = AccessPolicy()
accss_plcy.start = '2015-01-08'
accss_plcy.expiry = '2015-01-10'
accss_plcy.permission = 'r'
sap = SharedAccessPolicy(accss_plcy)
# Sign with the storage account name and account key
sas = SharedAccessSignature('imagesstoragepb', 'DPnYqQIKTbNn6iC+nH03wjbvHcpm9htIYVZYkGQJEaWhUbEaGj6koypC0R8AW5Zc6L/g8Sj1tmTPMQlWY8m+NQ==')
# Generate the signed query string for the images/flower.png blob ('b' = blob)
qry_str = sas.generate_signed_query_string('images/flower.png','b', sap)
print (sas._convert_query_string(qry_str))

Running this in the browser led to the successful generation of the expected SAS !!

Generated a SAS from PHP via Python in Azure
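As a side note, the query string printed above can simply be appended to the blob URI to produce the time-limited URL you would hand out. The following lines are my own hypothetical extension to the end of generate-sas.py and are not part of the original script:

# Hypothetical extension (not in the original script): append the signed
# query string to the blob URI to form the full, time-limited SAS URL.
blob_url = ('https://imagesstoragepb.blob.core.windows.net/images/flower.png?'
            + sas._convert_query_string(qry_str))
print (blob_url)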

So don’t be put off if a feature you need is not available in the PHP SDK. Check if you can leverage the Python SDK. And if you need to override the default PHP settings in Azure Websites you also now have a mechanism to do that.

Working with scriptcs in Atom on Windows

Atom is a text editor from the folks at GitHub. I’ve been intrigued, but up until now it has only been available on the OS X platform. And since I currently don’t have a Mac I haven’t yet taken it for a spin. But all that has changed … Atom is now available for Windows.

Atom - text editor

Installing Atom

Installing Atom on Windows is really easy. It is available as a Chocolatey package. If you don’t have Chocolatey on your Windows machine, install it as per the instructions on the Chocolatey website.

Then simply run the following command from the command line to install Atom.

 cinst atom 

And you’ll be greeted by your shiny new text editor when launching Atom.

Atom - welcome

Add C# language support

Support for the C# language is not provided out of the box with Atom, but this is quickly solved with an Atom package.

Atom comes with the Atom Package Manager which is easily launched by issuing the following command at the command line:

 apm 

apm - Atom package manager

The Atom Package Manager allows you to install packages that extend Atom’s functionality. You use the apm install command to install a package, and you can get help for any command by using apm help <command> as shown below.

apm help install

We are interested in the language-csharp Atom package. This adds syntax highlighting and snippets for both C# and scriptcs specific grammars.

language-csharp

Install the package by issuing the following command at the command line:

 apm install language-csharp 

apm install language-csharp

Add support for running scriptcs

So now we have C# language support in Atom, but cannot yet run our C# script files using scriptcs. To enable this we require another Atom package – atom-runner. This package allows you to run code or scripts from within Atom.

atom-runner

Install the package by issuing the following command at the command line:

 apm install atom-runner 

apm install atom-runner

We then need to configure atom-runner to associate csx files with scriptcs, which will allow us to execute our csx files from within Atom. This configuration goes in Atom’s config.cson settings file.

The easiest way to open this file is to use Atom’s command palette. Press ctrl-shift-p to bring up the command palette and then type config. Hit enter to open the config.cson file for your user profile.

command palette

Add the following lines to the end of the file.

'runner': 
  'extensions': 
    'csx': 'scriptcs' 

config.cson

See scriptcs in action

Atom has now been configured to provide syntax highlighting and snippets for C# and scriptcs, and to execute csx files. So let’s see this in action.

Create a csx file and write a simple Console.WriteLine statement. I’ve created a file hello.csx in the C:\Labs folder and added the message “Hello from atom & scriptcs!” to the Console.WriteLine statement. Ensure that the file is saved.

hello.csx

Next bring up the command palette again (ctrl-shift-p) and type runner. Select the Runner: Run item and hit enter. This will invoke the Atom Runner and provide it with the path to the hello.csx file which is the active tab in the editor.

Runner: Run

The csx file will be run by scriptcs and the output captured in the Atom Runner window.

Run scriptcs csx file

Now you can write your scriptcs csx files in Atom with C# syntax highlighting and snippets. You can even execute your csx files from within Atom.

Add keybinding for Atom Runner

Starting the Atom Runner via the command palette just felt like too many keystrokes for me. So I decided to have a look at the keymap functionality within Atom in order to bind a set of keys to the run event of the Atom Runner.

Bring up the command palette again (ctrl-shift-p) and type keymap. Hit enter to open the keymap.cson file for your user profile.

keymap

Add the following lines to the end of the file.

'.platform-win32 .workspace .editor': 
  'ctrl-shift-r': 'runner:run' 

keymap configuration

This maps ctrl-shift-r to the run command of the Atom Runner on the Windows platform, so that key combination is now all you need to execute your csx files.

The Atom Runner has its own keymap file (%UserProfile%\.atom\packages\atom-runner\keymaps\atom-runner.cson) that is used by Atom, but this is currently OS X specific.

Atom Runner - keymap

Acknowledgements

I’d like to thank Adam Ralph for doing the hard yards and documenting the steps on the scriptcs wiki for how to get this up and running quickly.

If you are interested in how to do the same with PowerShell and obtain syntax highlighting, snippets and script execution within Atom, then have a look at Doug Finke's blog post – PowerShell and The Github Atom Editor.

Troubleshooting adventures with scriptcs and Edge

What was the issue ?

Glenn Block released a new scriptcs script pack – ScriptCs.Edge. There was a call on Twitter for people to test it and I was happy to help out. It worked the first time for me. Awesome to combine Tomasz Janczuk's work on Edge with scriptcs !

tweet

But Morten Christensen was having an issue.

tweet

And it seemed like it had something to do with having Visual Studio 2013 installed on the machine – which I had, but Morten didn’t.

tweet

Tomasz confirmed that Edge requires msvcr120.dll to be available on the machine. This DLL is the Microsoft C Runtime library and is installed with Visual Studio 2013. Mystery solved :)

tweet

But I wondered how we may have solved this issue if we hadn’t got a quick reply from Tomasz …

Replicate the issue

First I needed an environment to replicate the issue. I really didn’t feel like uninstalling Visual Studio 2013 from my machine so I created a Windows 8.1 VM on Microsoft Azure. A Windows 8.1 image is now available to MSDN subscribers. It does not have Visual Studio 2013 installed so was perfect.

Windows 8.1 VM image

After installing scriptcs and the ScriptCs.Edge script pack I found that I was getting the same error as Morten. This was expected. So now the question was – how could I figure out what was going wrong?

ScriptCs.Edge failure

From the error one could deduce that something was not being loaded, and given that this worked on a machine with Visual Studio 2013 but not on one without it, it seemed likely we were looking for a missing file.

When you are looking at low level tasks in Windows you can almost guarantee that Mark Russinovich has written some SysInternals tool to help you. And there it was … Process Monitor.

Running Process Monitor on the machine while testing the script pack showed that a specific file (msvcr120.dll) could not be found, just after the edge.node module had been successfully loaded. This matched what we were seeing in the error message. So we had found the culprit.

SysInternals - Process Monitor

Process Monitor displays a firehose of information, so I restricted it via filters. I only displayed file activity via the Show File System Activity button on the toolbar, and I further filtered the entries to only those produced by the scriptcs process by applying a filter as shown below.

Process Monitor - filter

Resolve the issue

To test that having msvcr120.dll available would resolve the issue, I copied it from my local machine (which had Visual Studio 2013 installed) and placed it in the same folder as the edge.node module on the Windows 8.1 VM in Azure. This was one of the folders searched, so I assumed the DLL would be picked up from there.

Missing assembly

Success !

You can see the Node js v0.10.28 welcomes .NET message in the console below. The msvcr120.dll file has also clearly been loaded, as can be seen in the Process Monitor screen.

06-ItWorks-callouts

It was great to see that I could resolve this issue by troubleshooting the process myself. I now have another tool to add to my troubleshooting belt.

And soon Tomasz will be including this DLL in the Edge NuGet package, so there will be no need to copy files around.

My first Pluralsight course is live !

The last few months have been an incredible journey for me. And that journey has resulted in my first Pluralsight course, Introduction to scriptcs, being published on 2 May 2014.

Introduction to scriptcs

Twitter - @Pluralsight

The scriptcs project was started by Glenn Block and was heavily inspired by node.js. It aims to bring a low-friction experience to the world of C# and, even better, to bring that experience to you across Windows, Mac OS X and Linux. If you haven’t looked at it yet, download it and start playing.

I am also extremely grateful for the support from the scriptcs team.

Twitter - @gblock

Twitter - @scriptcsnet

Twitter - @filip_woj

Twitter - @khellang

And for the awesome feedback from Filip and Glenn !

Twitter - @filip_woj

Twitter - @gblock

Twitter - @gblock

Love at first site – scriptcs and WAML at Brisbane Azure User Group

I gave a talk at the Brisbane Azure User Group on the 13th November 2013 titled Love at first site – scriptcs and WAML. Here is my slide deck.

scriptcs is putting C# on a diet and decoupling your favourite language from Visual Studio. The Windows Azure Management Library (WAML) is a C# library that wraps the Windows Azure Management REST APIs. These two belong together !

In this talk I introduced scriptcs and the Windows Azure Management Library, before showing how to combine these two awesome resources to script the management of your Windows Azure assets with the full power of C#.

First Look at Built-in Autoscaling and Alerting at Brisbane Azure User Group

I gave a talk at the Brisbane Azure User Group on the 21st August 2013 titled First Look at Built-in Autoscaling and Alerting. Here is my slide deck.


Autoscaling has finally been built into Windows Azure via Microsoft’s acquisition of MetricsHub. The autoscaling functionality from MetricsHub has been rolled directly into the Windows Azure platform, along with other MetricsHub features such as Availability Monitoring and Alerting.