Understanding the benefits of Windows Azure geo-redundancy in Australia

The Windows Azure family has been extended with deployment regions in Australia. The deployment regions are paired for geo-redundancy and in Australia are located in the New South Wales and Victoria sub-regions.

Windows Azure data centres

Why is geo-redundancy important?

Windows Azure Storage is an important service that underpins a number of other Windows Azure services – Blob Storage, Table Storage, and Virtual Machines (OS and data disks).

Geo-replication is enabled by default in Windows Azure Storage and provides the highest level of storage durability. Data is asynchronously replicated from your primary location to a secondary location within the same geographical region. These locations are guaranteed to be at least 400 km apart to ensure data durability across catastrophic events.

The following picture shows the New South Wales and Victoria sub-regions within the Australia geographical region. Here the NSW deployment region is the primary location and is asynchronously replicating data to the Victoria deployment region which is the secondary location.

Geo-redundancy

In the event of a catastrophe at the primary location from which it cannot be restored, all traffic will be failed over to the geo-replicated secondary location. This ensures business continuity.

What about redundancy within the deployment region?

Within each deployment region there is an additional layer of redundancy. All data is synchronously replicated to 3 different storage nodes across 3 separate fault and upgrade domains. This allows each deployment region to recover from common issues such as disk, node or rack failure.

The following picture demonstrates the 3 copies across 3 separate racks and the creation of a new copy after the failure of a storage node.

image

When geo-replication is enabled, you will effectively have 6 copies of your data distributed across 2 geographically dispersed deployment regions. These multiple layers of redundancy ensure highly durable data and provide business continuity across a range of failure scenarios.

How do these new deployment regions affect Australian businesses?

The most obvious answers would be reduced latency and data storage within Australia.

Even though a Windows Azure CDN has been available out of Sydney for a while, it has only offered lower latency on Blob Storage for very specific read-only workloads. With an Australian region available now, lower latency is available to a wider range of Windows Azure services and this will benefit Australian businesses utilising the Windows Azure platform.

Content in Blob Storage that is produced and consumed within the local Australian market will benefit from the lower latency. A more compelling case can now also be made for cloud integrated storage solutions such as StorSimple. Those customers within Australia that have regulatory pressures preventing them from storing data outside of Australia will also be heartened, as this announcement should remove that hurdle.

Australian businesses utilising Virtual Machines within the Windows Azure Infrastructure Services will also enjoy lower latency when connecting via Remote Desktop or SSH.

This is a truly exciting announcement that will hopefully see more Australian companies begin their journey into the cloud.

Blob Storage Deep Dive at Brisbane Azure User Group

I gave a talk at the Brisbane Azure User Group on the 16th January 2013 titled Windows Azure Blob Storage – A Deep Dive. Here is my slide deck.

Windows Azure Blob Storage - A Deep Dive

The deep dive covered the basics before moving on to advanced, performance and security topics. Fiddler was used extensively to show the actual REST API calls generated by the C# storage library.

I have pushed the demo code to GitHub and will be starting a series of posts where I will walk through each of the demos and the HTTP traffic observed via Fiddler.

image
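As a taste of what the series will cover, here is a minimal blob upload example – a sketch assuming the 2.0 storage client library (the Microsoft.WindowsAzure.Storage NuGet package) and a placeholder connection string. Running it with Fiddler capturing traffic shows the PUT requests issued against the Blob Storage REST API.

using System.IO;
using System.Text;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class BlobUploadDemo
{
    static void Main()
    {
        // Placeholder connection string - substitute your own storage account credentials.
        var account = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...");
        var client = account.CreateCloudBlobClient();

        // Creates the container if required (PUT container) ...
        var container = client.GetContainerReference("deepdive");
        container.CreateIfNotExists();

        // ... and uploads the blob (PUT blob) - both calls are visible in Fiddler.
        var blob = container.GetBlockBlobReference("hello.txt");
        using (var stream = new MemoryStream(Encoding.UTF8.GetBytes("Hello, Blob Storage!")))
        {
            blob.UploadFromStream(stream);
        }
    }
}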

Deploying Linux VMs to Azure from VM Depot

You can currently add Linux Virtual Machines to Windows Azure via the Windows Azure Portal from base OS images of the following distributions: CentOS, SUSE and Ubuntu. But it is then up to you to install and configure any additional software you require.

With the announcement of Microsoft Open Technologies’ VM Depot you now have access to a community-driven catalog of preconfigured operating systems, applications, and development stacks that can easily be deployed on Windows Azure.

VM Depot by Microsoft Open Technologies

What do we need to get started?

The help on VM Depot indicates that the Windows Azure command line tool is the mechanism to use when deploying images from VM Depot to Windows Azure. This tool is available for Mac and Linux but I wanted to deploy from my Windows machine. The only command line tools available for Windows via the Windows Azure Tools Download page are the Windows Azure PowerShell cmdlets, which do not support the new community virtual machine image switch.

Thankfully I happened across Sandrino Di Mattia’s post Using the Windows Azure CLI on Windows and from within Visual Studio which was exactly what I was after.

So – if you are doing the deployment from a Mac or Linux box, download the command line tool from the Windows Azure website. If you are doing the deployment from a Windows box, continue reading to install the Windows version of the command line tool. All steps after this installation will work for all platforms since we will be leveraging a common tool.

Download the command line tools for Windows and install them.

Windows Azure Command Line Tools Setup

This will install the tools to the following location and configure your PATH environment variable.

Installation Location

Firing up a Command Prompt and typing azure will provide usage information for the command line tool.

Azure CLI command prompt

Typing the following at the command line shows that the option we require for creating a virtual machine from a community image is available in the Windows version of the command line tool.

azure vm create --help

Azure CLI vm create with community option

Now we are ready to begin.

Configuring the command line tool

As with all the Windows Azure tools, we’ll need to ensure our management certificates are configured in the tool so that we can interact with the Windows Azure management services.

Type the following at the command line to download your publishsettings file.

azure account download

This file contains all the information for your subscriptions and management certificates. This command will fire up a browser and will require you to authenticate with your Windows Azure credentials if you are not currently logged in. The publishsettings file will then be downloaded.

Azure CLI download publish settings

Once the file has downloaded, type the following at the command line, replacing $FILENAME with the path to the publishsettings file just downloaded. You can see that mine was downloaded to C:\Users\Paul\Downloads\.

azure account import "$FILENAME"

Azure CLI import publish settings

You can see that I have 2 subscriptions, a personal subscription and an MSDN subscription. Next we’ll set the subscription that we wish to work with when creating our virtual machine.

Type the following at the command line, replacing $SUBSCRIPTION with your subscription name. You can see that I have selected my MSDN subscription.

azure account set "$SUBSCRIPTION"

Azure CLI set account

Selecting a VM from the catalog

I selected the Solr 4.0.0-0 (Ubuntu 12.10) image from the catalog available on VM Depot and clicked on the Deployment Script button at the top of the page.

VM Depot Solr 4.0.0 (Ubuntu 12.10) Image

After selecting the Deployment Region from the drop down, I was presented with a command line script to create my Solr virtual machine. Keep this handy.

azure vm create DNS_PREFIX -o vmdepot-30-1-1 -l "West US" USER_NAME [PASSWORD] [--ssh] [other_options]

VM Depot Deployment Script

Create a storage account

We’ll need to create a storage account to hold the virtual machine’s OS disk (vhd). Type the following at the command line, replacing $REGION with the data centre region you have selected and $VMNAME with the DNS name you are giving your virtual machine.

azure account storage create --location "$REGION" "$VMNAME"

Azure CLI create storage account for image and vhd

You can see that I have chosen West US as my region and pbsolr4 as the DNS name for my virtual machine. You can also confirm the creation of the storage account in the portal.

Portal - verify storage account created

To ensure that this storage account is used for your virtual machine, set it as the default storage account for the interactions to follow on the command line.

Type the following at the command line, replacing $STORAGEACCOUNT with the name of your storage account. You can see that my storage account name is pbsolr4.

azure config set defaultStorageAccount "$STORAGEACCOUNT"

Azure CLI set storage account as default

Creating the VM

We’ll now run the deployment script that the VM Depot site gave us. I’ll repeat it here.

azure vm create DNS_PREFIX -o vmdepot-30-1-1 -l "West US" USER_NAME [PASSWORD] [--ssh] [other_options]

I ran the command line script with an additional parameter --vm-size small to specify my virtual machine size. My complete command line script is as follows:

azure vm create pbsolr4 -o vmdepot-30-1-1 -l "West US" paul --ssh --vm-size small

Azure CLI VM created

Unlike the PowerShell cmdlets, which honour my default storage account, it seems as if the Windows version of the command line tools has decided to use the first storage account (paulbouwerbacpac) it found in my subscription and not the storage account I specified! Hope this gets fixed sometime.

You can see the image copied from VM Depot under the Images tab of the Virtual Machines section in the portal. As can be seen from the command line output above, this image is deleted once the virtual machine creation is complete.

Portal - verify image created in storage account

If you have a look at the Disks tab of the Virtual Machines section in the portal, you’ll see the OS Disk (vhd) once your virtual machine has been created.

Portal - verify vhd created in storage account

And finally, the virtual machine is running. This can be confirmed via the portal.

Portal - verify VM running

Create HTTP endpoint

Typing the following at the command line shows that our SSH public endpoint has been created as per the --ssh switch we used when we created our virtual machine.

azure vm endpoint list pbsolr4

Solr is web based so we’ll need to add an HTTP public endpoint too. Typing the following at the command line will result in a public endpoint being created on port 80, routed internally to port 80, and named http.

azure vm endpoint create pbsolr4 80 80 --endpoint-name "http"

Azure CLI create http endpoint

The creation of this new endpoint can be confirmed in the portal by browsing to the Endpoints tab of our virtual machine.

Portal - verify endpoints

It’s running!

Browsing to pbsolr4.cloudapp.net/solr returns the Solr dashboard. Solr is up and running and our public endpoint on port 80 (http) is working.

Solr 4 running

I can also SSH to pbsolr4.cloudapp.net on port 22 (ssh).

SSH into pbsolr4

That was too easy! There are loads of images currently available and I expect lots more to follow. VM Depot and their partners are making it really easy to get up and running quickly with these pre-configured environments. I’m impressed!

ASafaWeb, Excessive Headers and Windows Azure

ASafaWeb - Automated Security Analyser for ASP.NET Websites

Troy Hunt is a Microsoft MVP for Developer Security and regularly blogs and tweets about security principles. He is the creator of ASafaWeb, the Automated Security Analyser for ASP.NET Websites. Troy has observed that around 67% of ASP.NET websites have serious configuration-related security vulnerabilities. ASafaWeb makes scanning for these vulnerabilities dead easy and provides both remediation guidance and additional reading around the resolution.

Troy blogged about excessive response headers in Shhh… don’t let your response headers talk too loudly.

By default, an ASP.NET application running on IIS returns a lot of information about the server and framework via the headers. This information can be used to help exploit vulnerabilities that are present in the technologies the headers identify. If any of the following response headers are returned, ASafaWeb will report a warning for this scan:

  • Server
  • X-Powered-By
  • X-AspNet-Version
  • X-AspNetMvc-Version

I set off to discover how to remove this configuration vulnerability from an ASP.NET Web API application running on Windows Azure.

Create a simple ASP.NET Web API application

Start Visual Studio 2012 as Administrator (we’ll need elevated privileges later when running the Windows Azure Emulator).

Create a new project with File > New > Project and select ASP.NET MVC 4 Web Application. Give the project a name of WebAPI and the solution a name of ExcessiveResponseHeaders. Click OK. Select the Web API template and click OK.

Right click on the WebAPI project in the Solution Explorer and select Add Windows Azure Cloud Service Project. A WebAPI.Azure project will be created with a WebAPI Role associated with the WebAPI Project. The WebAPI.Azure project will be configured as the startup project.

ExcessiveResponseHeaders project - Solution Explorer

The default Web API project created by Visual Studio will contain a simple Values Controller. Throughout this post I’ll be using the GET api/values REST endpoint which will simply return a list of 2 strings [ "value1", "value2" ].

ExcessiveResponseHeaders project - Values Controller GET method
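For reference, the GET action in question looks like this – reproduced from the stock Visual Studio Web API template, so treat it as indicative rather than the exact demo code:

using System.Collections.Generic;
using System.Web.Http;

public class ValuesController : ApiController
{
    // GET api/values
    public IEnumerable<string> Get()
    {
        return new string[] { "value1", "value2" };
    }
}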

Hit F5 and start up the project in the Windows Azure emulator.

Removing the Headers

Using Fiddler and issuing a GET api/values to the WebAPI application running on the Windows Azure emulator we can observe the following response headers:

  • Server: Microsoft-IIS/8.0
  • X-AspNet-Version: 4.0.30319
  • X-Powered-By: ASP.NET

Fiddler - Output for GET

No X-AspNetMvc-Version response header is observed since the api/values route is being managed by the WebAPI framework. If you issued a GET to the following URL: http://127.0.0.1:81/a/a you would observe the MVC header as the MVC framework routing would be brought into play:

  • X-AspNetMvc-Version: 4.0

To remove the Server header we add the following method and code to the Global.asax.cs file in the WebAPI project:

 
protected void Application_PreSendRequestHeaders(object sender, EventArgs e) 
{ 
  HttpContext.Current.Response.Headers.Remove("Server"); 
} 

To remove the X-AspNetMvc-Version header we add the following code to the Application_Start() method in the Global.asax.cs file in the WebAPI project:

 
MvcHandler.DisableMvcResponseHeader = true; 

Add code to Global.asax.cs

To remove the X-AspNet-Version header ensure that the Web.config has the following xml:

 
<configuration>
  <system.web>
    <httpRuntime enableVersionHeader="false" />
  </system.web>
</configuration>

To remove the X-Powered-By header ensure that the Web.config has the following xml:

 
<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <remove name="X-Powered-By" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>

Add config to web config

Hit F5 in Visual Studio again to start up the application with our changes. Using Fiddler and submitting a GET api/values to the WebAPI application now shows that the headers have been suppressed.

Fiddler - No Headers

Deploying this code to Windows Azure (http://pbwebapi.cloudapp.net) confirms that this solution works there too.

In Azure running the default Web API solution (before suppressing the headers):

Headers not suppressed in Azure

In Azure running the modified Web API solution (after suppressing the headers):

Headers suppressed in Azure

Is there a NuGet package for that?

There always is … And its name is NWebsec.

Revert all the changes made previously in the Global.asax.cs and Web.config files. Right click on the WebAPI project and select Manage NuGet Packages … Search for NWebsec and install.

NWebsec NuGet Package

The installation of the NuGet package modifies the Web.config in the WebAPI project by adding a nwebsec sectionGroup and an entry in modules.

NWebsec web config updates

The X-Powered-By header is added by IIS and cannot be removed in the processing pipeline. Therefore NWebsec cannot remove it. To remove the X-Powered-By header ensure that the Web.config has the following xml:

 
<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <remove name="X-Powered-By" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>

To suppress all the other headers ensure that the Web.config contains the following xml:

 
<configuration>
  <nwebsec>
    <httpHeaderModule>
      <suppressVersionHttpHeaders enabled="true" />
    </httpHeaderModule>
  </nwebsec>
</configuration>

NWebsec configuration in web config

That was much better. We only had to add some configuration to the Web.config. There was no code involved.

The NWebsec package does not suppress the Server header but defaults it to a value of Webserver 1.0 – this value can be configured in the Web.config. All other headers are suppressed as can be seen via Fiddler.

Headers suppressed locally

Once the update is deployed to Azure (http://pbwebapi.cloudapp.net) it is confirmed that this solution works there too.

Headers suppressed in Azure

Scanning with ASafaWeb

Scanning the api/values REST endpoint of our default Web API application via the ASafaWeb analyser returns a warning about the same headers we observed via Fiddler.

ASafaWeb - Headers not suppressed

Scanning the api/values REST endpoint of our NWebsec enabled Web API application via the ASafaWeb analyser returns a warning about the same headers we observed via Fiddler.

ASafaWeb - NWebSec

Scanning the api/values REST endpoint of our manually configured Web API application via the ASafaWeb analyser returns a warning about a Server: Microsoft-IIS/8.0 header. This is not what we observed via Fiddler.

ASafaWeb - Manual

But wait … What is that URL that is being scanned?
http://pbwebapi.cloudapp.net/api/values/elmah.axd ?

Sneaky! That URL resulted in a 404 being raised by IIS since the file did not exist. This bypassed the processing pipeline and our attempts to suppress the headers.

Closing the 404 – Not Found hole

We’ll be continuing with the manual version of our header-suppression code – I’m aiming to suppress all headers and the NWebsec package still sends down a Server header.

Again there is a NuGet package that comes to the rescue. NotFoundMVC automatically installs itself during web application start-up. It handles all the different ways a 404 HttpException is usually thrown by ASP.NET MVC. This includes a missing controller, action and route.

Right click on the WebAPI project and select Manage NuGet Packages … Search for NotFoundMVC and install.

NotFoundMVC NuGet Package Install

The NotFoundMVC package will make the following modifications to the Web.config file in the WebAPI project. The Error.cshtml placed under Views\Shared may be modified to provide customised 404 pages.

NotFoundMVC updates to Web Config

Deploying to Azure and scanning the api/values REST endpoint of our manually configured Web API application via the ASafaWeb analyser returns a PASS.

ASafaWeb - Pass

But wait – there’s more …

You’d think I would be happy with the PASS! But I had seen something in one of the URLs attempted by the ASafaWeb analyser: it had thrown an illegal character into the URL. So I tried the following URL locally:

HTTP.SYS Bad Request

Why is that Server header back? And why is it now reporting Microsoft-HTTPAPI/2.0?

When you see a Server header value containing Microsoft-HTTPAPI instead of the IIS version, it means the HTTP.SYS driver handled the request and it never traversed all the way up to the IIS user mode process.

So how do we get rid of that header now? This header is being added right down at the kernel level. Time to break out the trusty regedit utility. Add a DisableServerHeader REG_DWORD with a value of 1 under the following key:

  • HKLM\SYSTEM\CurrentControlSet\Services\HTTP\Parameters

HTTP.SYS Registry Editor

This requires the HTTP service (HTTP.SYS) to be restarted. I battled with service dependencies but a reboot of my machine ensured the restart. Once that was done, the illegal URL no longer exposed the Server header.

HTTP.SYS Bad Request with suppressed header

And finally – modifying HTTP.SYS in Windows Azure

Trying the URL with the illegal character produced the same result in Windows Azure.

HTTP.SYS Server Header - Azure

To edit the registry on the web role in our cloud service in Windows Azure we’ll need to write a batch file. Right click the WebAPI project and click Add > New Item … Add a Text File with name Configure_HTTP.SYS.cmd and populate with the following:

 
@echo off
setlocal
set regpath=HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\HTTP\Parameters
reg query "%regpath%" /v "DisableServerHeader"
if errorlevel 1 (
   reg add %regpath% /v DisableServerHeader /t REG_DWORD /d 00000001
   shutdown /r /t 0
) 

This will add the registry entry if it does not exist and reboot the instance. The add and reboot will only occur again when the instance is re-provisioned, since only then will the registry entry no longer be set.

Make sure that the Build Action is set to None and the Copy to Output Directory is set to Copy always for the batch file.

Add command to project

Ensure that the batch file is configured as a startup task and is run with elevated privileges. Add the following xml to the ServiceDefinition.csdef file in the WebAPI.Azure project:

 
<Startup>
  <Task commandLine="Configure_HTTP.SYS.cmd" executionContext="elevated" />
</Startup>

Add to ServiceDefinition.csdef

Deploying to Azure and trying the URL with the illegal character now results in all headers being suppressed.

HTTP.SYS Suppressed Server Header - Azure

Conclusion

With a bit of work you can suppress all the headers in your ASP.NET application running in Windows Azure.

Be mindful, though, that removing these excessive headers is not a silver bullet that will magically make your web application immune to attack. But by adding these additional layers you can at least deter attackers who look at specific response headers to determine whether or not you are an interesting target.

StreamInsight at Brisbane Azure User Group

As I have mentioned before, I am a big fan of StreamInsight. I gave a talk at the Brisbane Azure User Group on the 17th October 2012 titled Extracting Realtime Business Value with StreamInsight. Here are my slides.

Slides - Extracting Realtime Business Value With StreamInsight

Paul presenting

I also strongly recommend looking at the Hitchhiker’s Guide to StreamInsight 2.1 Queries from the StreamInsight Team. It includes the PDF Guide and a Visual Studio Solution containing all the examples. It is an excellent tool to familiarise yourself with some of the concepts in StreamInsight.

I also got to demo StreamInsight in Azure (codename Austin) to the user group. The discussions around the possibilities with StreamInsight and Complex Event Processing definitely got the creative juices flowing.

StreamInsight on Azure

For the 2 days prior to TechEd Australia 2012 I experienced my first Mexia Code Camp down the Gold Coast, where I got to enjoy geeking out with the rest of the Mexia team. We stayed in luxury beachfront apartments with amazing views of the beach and the Gold Coast coastline. The Code Camp was all about exploring new technology that excited us. This was definitely the coolest environment I’d ever written code in!

La Grande Oceanfront Luxury Apartments

Mexia’s Ben Simmonds and I are both big fans of StreamInsight, Microsoft’s Complex Event Processing Engine and are lucky enough to be part of Microsoft’s StreamInsight Advisory Group which has afforded us early access to StreamInsight in Azure (codename Austin). I’m really excited by the possibilities of StreamInsight and complex event processing and really enjoyed exploring the technology at the Code Camp.

StreamInsight covers the Velocity dimension of Gartner’s 3Vs of Big Data – Volume, Velocity and Variety. Its beauty lies in its ability to extract relevant knowledge from one or more large streams of data. Big data can be analysed in real-time and events can be raised when something relevant is detected within the large stream of information. For example, if a monitoring heartbeat on some resource was not detected, this would constitute a relevant signal we are interested in amid the irrelevant noise of the regular heartbeat.

Big Data: Velocity and Volume dimensions (source: http://blogs.msdn.com/b/microsoft_business_intelligence1/archive/2012/02/22/big-data-hadoop-and-streaminsight.aspx)

The project that I tackled involved the following:

  1. Integrating with Ben‘s StreamInsight Austin instance to host a custom IObservable that monitored a heartbeat running against a Windows Azure Service Bus subscription.
  2. Implementing a LINQ query to find missing heartbeats and drop an event into Ben’s Windows Azure SQL Database sink.
  3. Implementing an ASP.NET MVC application to display the events dropped into the SQL Database sink from those queries bound to it – Ben’s maximum and average latency events and my missing heartbeat events.

First up I created an observable that generated point events for the regular heartbeat coming from some code checking whether or not a Windows Azure Service Bus subscription was available. This was turned into a stream and then I created a LINQ query over the stream that filtered on events where the subscription was not available. This filtered set of events was then grouped by subscription and passed through a 10 second tumbling window to get a count of missed heartbeats within each window.

var serviceBusHeartbeatMissingStream = myApp.DefineStreamable(() => serviceBusInputPointObservable);
var serviceBusHeartbeatWindow = from e in serviceBusHeartbeatMissingStream
                                where e.IsAvailable == 0
                                group e by e.Subscription into gs
                                from win in gs.TumblingWindow(TimeSpan.FromSeconds(10))
                                select new ServiceBusDown
                                {
                                    Id = null,
                                    Subscription = gs.Key,
                                    DateTime = DateTime.UtcNow,
                                    Count = win.Count()
                                };

This small block of code is doing something amazing. Every 10 seconds it will aggregate the last 10 seconds of events that have passed through the engine and output an event of type ServiceBusDown if that aggregate contains any events where the heartbeat was missed. Extracting the relevant information from a torrent of data – it’s a beautiful thing!
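For context, here is a minimal sketch of the heartbeat event shape and observable source that a query like this assumes – the names and the probing interval are illustrative, not the actual demo code:

using System;
using System.Reactive.Linq;

// Illustrative event shape - matches the IsAvailable / Subscription fields used in the query above.
public class ServiceBusHeartbeat
{
    public string Subscription { get; set; }
    public int IsAvailable { get; set; }   // 1 = heartbeat observed, 0 = heartbeat missed
}

public static class HeartbeatSource
{
    public static IObservable<ServiceBusHeartbeat> Create(string subscription)
    {
        // Probe the subscription every 2 seconds and emit a heartbeat event for each check.
        return Observable.Interval(TimeSpan.FromSeconds(2))
                         .Select(_ => new ServiceBusHeartbeat
                         {
                             Subscription = subscription,
                             IsAvailable = IsSubscriptionAvailable(subscription) ? 1 : 0
                         });
    }

    private static bool IsSubscriptionAvailable(string subscription)
    {
        // Hypothetical check - in the real demo this wrapped a call against the
        // Windows Azure Service Bus subscription in a try/catch.
        return true;
    }
}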

These events were bound to the Windows Azure SQL Database sink.

serviceBusHeartbeatWindow.Bind(sqlSinkForServiceBusDown(targetSqlConnectionString, false)).Run();

This meant that they were available for reporting against in a dashboard.

A simple ASP.NET MVC application returned data from the Windows Azure SQL Database sink and displayed the values of these business events in real time (a sketch of one such controller action follows below):

  • Maximum Latency (in ms)
  • Average Latency (in ms)
  • Service Bus Up/Down

Simple dashboard
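The controller side of the dashboard needs nothing fancy. Something along these lines is enough to surface the latest events as JSON for the page to poll – an illustrative sketch only, with an assumed table name, column set and connection string rather than the actual schema of the SQL Database sink:

using System.Collections.Generic;
using System.Data.SqlClient;
using System.Web.Mvc;

public class DashboardController : Controller
{
    // Assumed schema: a ServiceBusDown table with Subscription, DateTime and Count columns.
    public JsonResult LatestServiceBusDownEvents()
    {
        var events = new List<object>();

        using (var connection = new SqlConnection("<your SQL Database connection string>"))
        using (var command = new SqlCommand(
            "SELECT TOP 10 Subscription, [DateTime], [Count] FROM ServiceBusDown ORDER BY [DateTime] DESC",
            connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    events.Add(new
                    {
                        Subscription = reader.GetString(0),
                        DateTime = reader.GetDateTime(1),
                        Count = reader.GetInt32(2)
                    });
                }
            }
        }

        return Json(events, JsonRequestBehavior.AllowGet);
    }
}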

Even though these were really simple examples of what is possible with StreamInsight, I am blown away by the technology. As the sheer volume of data we are processing increases, it becomes critical to ensure that we have the capabilities to extract business value from the latent information filling up our data stores or washing over our systems.

I’m presenting a session at the Brisbane Azure User Group this month (October 2012) on StreamInsight titled Extracting Realtime Business Value with StreamInsight.

Windows Azure Training Workshop

I work for Mexia as a consultant and we are currently partnering with Microsoft in Australia to deliver the FREE 1 Day Windows Azure Dev Camp and the 3 Day Windows Azure Foundation Training Workshop. Last week I delivered the 3 Day Workshop in Melbourne with Mexia’s Ben Simmonds. This took the form of an instructor led training workshop and was based on the August 2012 Refresh of the Windows Azure Training Kit.

Windows Azure Training Kit - August 2012 REFRESH

There was plenty of content to get foundational concepts across to beginners and a number of excellent hands on labs for cementing these concepts.

Windows Azure Training - Melbourne

Make sure to check the Mexia website for date confirmations – we’ll be delivering the 1 Day Windows Azure Dev Camp and the 3 Day Windows Azure Foundation Training Workshop in Sydney and Brisbane in Nov/Dec this year.