Working with scriptcs in Atom on Windows

Atom is a text editor from the folks at GitHub. I’ve been intrigued, but up until now it has only been available on the OS X platform. And since I currently don’t have a Mac, I haven’t yet taken it for a spin. But all that has changed … Atom is now available for Windows.

Atom - text editor

Installing Atom

Installing Atom on Windows is really easy. It is available as a Chocolatey package. If you don’t have Chocolatey on your Windows machine, install it as per the instructions on the Chocolatey website.

Then simply run the following command from the command line to install Atom.

 cinst atom 

And you’ll be greeted by your shiny new text editor when launching Atom.

Atom - welcome

Add C# language support

Support for the C# language is not provided out of the box with Atom, but this is quickly solved with an Atom package.

Atom comes with the Atom Package Manager which is easily launched by issuing the following command at the command line:

 apm 

apm - Atom package manager

The Atom package manager allows you to install Atom packages that can be used to extend the functionality of Atom. You use the apm install command to install packages. You can get help for any command by using apm help <command> as shown below.

apm help install

We are interested in the language-csharp Atom package. This adds syntax highlighting and snippets for both the C# and scriptcs-specific grammars.

language-csharp

Install the package by issuing the following command at the command line:

 apm install language-csharp 

apm install language-csharp

Add support for running scriptcs

So now we have C# language support in Atom, but we cannot yet run our C# script files using scriptcs. To enable this we require another Atom package – atom-runner. This package allows you to run code or scripts from within Atom.

atom-runner

Install the package by issuing the following command at the command line:

 apm install atom-runner 

apm install atom-runner

We then need to configure atom-runner to associate csx files with scriptcs. This will allow us to execute our csx files from within Atom. We need to add this configuration to Atom’s config.cson settings file.

The easiest way to open this file is to use Atom’s command palette. Press ctrl-shift-p to bring up the command palette and then type config. Hit enter to open the config.cson file for your user profile.

command palette

Add the following lines to the end of the file.

'runner': 
  'extensions': 
    'csx': 'scriptcs' 
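The scriptcs value is the command that atom-runner will invoke for files with a csx extension, so the scriptcs executable needs to be available on your PATH.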

config.cson

See scriptcs in action

Atom has now been configured to provide syntax highlighting and snippets for C# and scriptcs. It is also now capable of executing csx files. So let’s see this in action.

Create a csx file and write a simple Console.WriteLine statement. I’ve created a file hello.csx in the C:\Labs folder and added the message “Hello from atom & scriptcs!” to the Console.WriteLine statement. Ensure that the file is saved.
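The entire contents of hello.csx is this single statement:

 Console.WriteLine("Hello from atom & scriptcs!");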

hello.csx

Next bring up the command palette again (ctrl-shift-p) and type runner. Select the Runner: Run item and hit enter. This will invoke the Atom Runner and provide it with the path to the hello.csx file which is the active tab in the editor.

Runner: Run

The csx file will be run by scriptcs and the output captured in the Atom Runner window.

Run scriptcs csx file

Now you can write your scriptcs csx files in Atom with C# syntax highlighting and snippets. You can even execute your csx files from within Atom.

Add keybinding for Atom Runner

Starting the Atom Runner via the command palette just felt like too many keystrokes for me. So I decided to have a look at the keymap functionality within Atom in order to bind a set of keys to the run event of the Atom Runner.

Bring up the command palette again (ctrl-shift-p) and type keymap. Hit enter to open the keymap.cson file for your user profile.

keymap

Add the following lines to the end of the file.

'.platform-win32 .workspace .editor': 
  'ctrl-shift-r': 'runner:run' 

keymap configuration

This will map ctrl-shift-r to the Run event of the Atom Runner on the Windows platform. This is now all you need to execute your csx files.

The Atom Runner has its own keymap file (%UserProfile%\.atom\packages\atom-runner\keymaps\atom-runner.cson) that is used by Atom, but this is currently OS X specific.

Atom Runner - keymap

Acknowledgements

I’d like to thank Adam Ralph for doing the hard yards and documenting the steps on the scriptcs wiki on how to get this up and running quickly.

If you are interested in how to do the same with PowerShell and obtaining syntax highlighting, snippets and script execution within Atom then have a look at Doug Finke’s blog post – PowerShell and The Github Atom Editor.

Troubleshooting adventures with scriptcs and Edge

What was the issue?

Glenn Block released a new scriptcs script pack – ScriptCs.Edge. There was a call on twitter for people to test it and I was happy to help out. It worked first time for me. Awesome to combine Tomasz Janczuk’s work on Edge with scriptcs!

tweet

But Morten Christensen was having an issue.

tweet

And it seemed like it had something to do with having Visual Studio 2013 installed on the machine – which I had, but Morten didn’t.

tweet

tweet

Tomasz confirmed that Edge requires msvcr120.dll to be available on the machine. This assembly is the Microsoft Visual C++ 2013 runtime library and is installed with Visual Studio 2013. Mystery solved :)

tweet

But I wondered how we may have solved this issue if we hadn’t got a quick reply from Tomasz …

Replicate the issue

First I needed an environment to replicate the issue. I really didn’t feel like uninstalling Visual Studio 2013 from my machine, so I created a Windows 8.1 VM on Microsoft Azure. A Windows 8.1 image is now available to MSDN subscribers. It does not have Visual Studio 2013 installed, so it was perfect.

Windows 8.1 VM image

After installing scriptcs and the ScriptCs.Edge script pack I found that I was getting the same error as Morten. This was expected. So now the question was – how could I figure out what was going wrong?

ScriptCs.Edge failure
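The test script was essentially the canonical Edge hello sample, which produces the welcome message we will see later. Here is a sketch – I am assuming the script pack exposes Edge via Require<Edge>():

// ask scriptcs for the Edge script pack context (assumed API)
var edge = Require<Edge>();

// compile an inline Node.js function that greets the caller
var hello = edge.Func(@"
    return function (data, callback) {
        callback(null, 'Node.js ' + process.version + ' welcomes ' + data);
    }");

Console.WriteLine(hello(".NET").Result);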

From the error one could deduce that something was not being loaded. And given that this worked on a machine with Visual Studio 2013 but not on one without it, it seemed likely we were looking for a missing file.

When you are looking at low-level tasks in Windows, you can almost guarantee that Mark Russinovich has written some SysInternals tool to help you. And there it was … Process Monitor.

Running Process Monitor on the machine while testing the script pack showed that a specific file (msvcr120.dll) could not be found just after the edge.node module had been successfully loaded. This matched what we were seeing in the error message. We had found the culprit.

SysInternals - Process Monitor

Process Monitor displays a firehose of information, so I restricted it via filters. I displayed only file activities via the Show File System Activity button on the toolbar, and further filtered the entries to only those produced by the scriptcs process by applying a filter as shown below.
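For reference, the filter was along these lines (assuming the process shows up as scriptcs.exe):

 Process Name   is   scriptcs.exe   then   Include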

Process Monitor - filter

Resolve the issue

To test that having the msvcr120.dll assembly available would resolve the issue, I copied it from my local machine (which had Visual Studio 2013 installed) and placed it in the same folder as the edge.node module on the Windows 8.1 VM in Azure. This was one of the folders searched, so I assumed the assembly would be picked up from here.
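A simple copy from the command line does the trick – replace $EDGE_NODE_FOLDER with the path to the folder containing the edge.node module on your machine:

 copy msvcr120.dll "$EDGE_NODE_FOLDER"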

Missing assembly

Success!

You can see the Node.js v0.10.28 welcomes .NET message in the console below. The msvcr120.dll assembly is also clearly loaded, as can be seen in the Process Monitor screen.

It works - console output and Process Monitor

It was great to resolve this issue by troubleshooting the process myself. I now have another tool that I can add to my troubleshooting belt.

And Tomasz will soon be including this assembly in the Edge NuGet package, so there will be no need to copy assemblies around.

My first Pluralsight course is live!

The last few months have been an incredible journey for me. And that journey has resulted in my first Pluralsight course, Introduction to scriptcs, being published on 2 May 2014.

Introduction to scriptcs

Twitter - @Pluralsight

The scriptcs project was started by Glenn Block and was heavily inspired by node.js. It aims to introduce a low-friction experience to the world of C# and, even better, bring that experience to you across Windows, Mac OS X and Linux. If you haven’t looked at it yet, download it and start playing.
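If you want a quick taste, running scriptcs with no arguments starts a REPL where you can evaluate C# immediately, and passing a csx file executes the script:

 scriptcs

 scriptcs hello.csx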

I am also extremely grateful for the support from the scriptcs team.

Twitter - @gblock

Twitter - @scriptcsnet

Twitter - @filip_woj

Twitter - @khellang

And for the awesome feedback from Filip and Glenn!

Twitter - @filip_woj

Twitter - @gblock

Twitter - @gblock

Love at first site – scriptcs and WAML at Brisbane Azure User Group

I gave a talk at the Brisbane Azure User Group on the 13th November 2013 titled Love at first site – scriptcs and WAML. Here is my slide deck.

scriptcs is putting C# on a diet and decoupling your favourite language from Visual Studio. The Windows Azure Management Library (WAML) is a C# library that wraps the Windows Azure Management REST APIs. These two belong together!

In this talk I introduced scriptcs and the Windows Azure Management Library, before showing how to combine these two awesome resources to script the management of your Windows Azure assets with the full power of C#.
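To give a flavour of the combination, here is a minimal csx sketch (not the talk’s demo code) that lists the cloud services in a subscription. The subscription id, certificate file and password are placeholders, and it assumes the WAML NuGet packages have been installed for the script:

using System;
using System.Security.Cryptography.X509Certificates;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Management.Compute;

// placeholders - your subscription id and management certificate
var credentials = new CertificateCloudCredentials(
    "$SUBSCRIPTION_ID",
    new X509Certificate2("management.pfx", "$PASSWORD"));

// create a compute client and list the cloud services in the subscription
using (var compute = CloudContext.Clients.CreateComputeManagementClient(credentials))
{
    foreach (var service in compute.HostedServices.List())
    {
        Console.WriteLine(service.ServiceName);
    }
}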

First Look at Built-in Autoscaling and Alerting at Brisbane Azure User Group

I gave a talk at the Brisbane Azure User Group on the 21st August 2013 titled First Look at Built-in Autoscaling and Alerting. Here is my slide deck.

image - slide deck

Autoscaling has finally been built into Windows Azure via Microsoft’s acquisition of MetricsHub. The Autoscaling functionality from MetricsHub has been rolled directly into the Windows Azure platform. Other features that have also been rolled into the Windows Azure platform from MetricsHub include Availability Monitoring and Alerting.

Win an Aston Martin with your MSDN Subscription!

Most Microsoft developers have an MSDN subscription. Yet not many know that you get up to $150 of Windows Azure credits per month with your MSDN subscription. Activate the Windows Azure benefits included with your MSDN subscription and you could win an Aston Martin V8 Vantage!

Simply activate your Windows Azure MSDN benefit before 30 September 2013 and deploy at least one Web Site or Virtual Machine. What are you waiting for?

Activate your Windows Azure MSDN Benefit!

Still need convincing?

You no longer need to provide credit card details when activating your Windows Azure MSDN benefit!

MSDN Professional Subscribers receive $50/month worth of Windows Azure monetary credits, MSDN Premium Subscribers $100/month, and MSDN Ultimate Subscribers $150/month. These credits can be applied towards any Windows Azure resource being used for Dev/Test purposes. It is up to you to decide how you would like to use them.

Here are some examples of how an MSDN Premium Subscriber could use their monthly monetary credits.

image - example uses of the monthly credits

Get started now! Activate your Windows Azure MSDN benefit.

Why you’ll love Windows Azure SDK 2.0 at the Brisbane Azure User Group

I gave a talk at the Brisbane Azure User Group on the 15th May 2013 titled Why you’ll love Windows Azure SDK 2.0. Here is my slide deck.

Why you'll love the Windows Azure SDK 2.0

This major refresh of the Windows Azure SDK was released on 30th April with some really great new features and enhancements. In the talk I explored the new capabilities in version 2.0 using Scott Guthrie’s post and Damir Dobric’s excellent Service Bus 2.0 post and sample code as guidance.

We looked at the following:

Web Sites
- Improved Visual Studio Tooling around Publishing
- Management Support within the Visual Studio Server Explorer
- Streaming Diagnostic Logs

Cloud Services
- Support for new High Memory VM Instances
- Faster Deployment Support with Simultaneous Update Option
- Visual Studio Tooling for configuring, managing and viewing Diagnostics Data

Storage
- Storage Client 2.0 included in New Projects
- Visual Studio Table Explorer

Service Bus
- Updated Client Library
- Message Browse Support
- New Message Pump Programming Model (see the sketch after this list)
- Auto-delete for Idle Messaging Entities

PowerShell
- PowerShell 3.0
- PowerShell Remoting
- New Automation Commands for Web Sites, Cloud Services, Virtual Machines, Service Bus, Windows Azure Store, Storage, and Scaffolding cmdlets for Web/Worker Roles
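Here is that sketch of the new message pump programming model. It is a minimal example rather than the talk’s demo code – the queue name orders and the connection string are placeholders:

using System;
using Microsoft.ServiceBus.Messaging;

// placeholders - your connection string and queue name
var client = QueueClient.CreateFromConnectionString("$CONNECTION_STRING", "orders");

// the message pump pushes messages to the callback instead of you polling Receive()
client.OnMessage(message =>
{
    Console.WriteLine("Received message " + message.MessageId);
    message.Complete();
});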

Understanding the benefits of Windows Azure geo-redundancy in Australia

The Windows Azure family has been extended with deployment regions in Australia. The deployment regions are paired for geo-redundancy and in Australia are located in the New South Wales and Victoria sub-regions.

Windows Azure data centres

Why is geo-redundancy important?

Windows Azure Storage is an important service that underpins a number of other Windows Azure services – Blob Storage, Table Storage, and Virtual Machines (OS and data disks).

Geo-replication is enabled by default in Windows Azure Storage and provides the highest level of storage durability. Data is asynchronously replicated from your primary location to a secondary location within the same region. These locations are guaranteed to be at least 400 km apart to ensure data durability across catastrophic events.

The following picture shows the New South Wales and Victoria sub-regions within the Australia geographical region. Here the NSW deployment region is the primary location and is asynchronously replicating data to the Victoria deployment region which is the secondary location.

Geo-redundancy

In the event of a catastrophic event in the primary location where the primary location cannot be restored, all traffic will be failed over to the geo-replicated secondary location. This ensures business continuity.

What about redundancy within the deployment region?

Within each deployment region there is an additional layer of redundancy. All data is synchronously replicated to 3 different storage nodes across 3 separate fault and upgrade domains. This allows each deployment region to recover from common issues such as disk, node or rack failure.

The following picture demonstrates the 3 copies across 3 separate racks and the creation of a new copy after the failure of a storage node.

image - 3 copies across 3 racks, with a new copy created after a node failure

When geo-replication is enabled, you will effectively have 6 copies of your data distributed across 2 geographically dispersed deployment regions. These multiple layers of redundancy and replication mechanisms ensure highly durable data and provide business continuity across a number of scenarios.

How do these new deployment regions affect Australian businesses?

The most obvious answers would be reduced latency and data storage within Australia.

Even though a Windows Azure CDN has been available out of Sydney for a while, it has only offered lower latency on Blob Storage for very specific read-only workloads. With an Australian region now available, lower latency extends to a wider range of Windows Azure services, and this will benefit Australian businesses utilising the Windows Azure platform.

Content in Blob Storage that is produced and consumed within the local Australian market will benefit from the lower latency. A more compelling case can now also be made for cloud-integrated storage solutions such as StorSimple. Those customers within Australia with regulatory pressures preventing them from storing data outside of Australia will also be heartened – this announcement should remove that hurdle.

Australian businesses utilising Virtual Machines within the Windows Azure Infrastructure Services will also enjoy lower latency when connecting via Remote Desktop or SSH.

This is a truly exciting announcement that will hopefully see more Australian companies begin their journey into the cloud.

Blob Storage Deep Dive at Brisbane Azure User Group

I gave a talk at the Brisbane Azure User Group on the 16th January 2013 titled Windows Azure Blob Storage – A Deep Dive. Here is my slide deck.

Windows Azure Blob Storage - A Deep Dive

The deep dive covered the basics before diving into advanced, performance and security based topics. Fiddler was used extensively to show the actual REST API calls generated by the C# storage library.
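As a taste of the sort of traffic Fiddler captures, here is a minimal sketch (not the demo code itself) using the 2.0 storage client. The container and blob names are mine, the connection string is a placeholder, and the comments note the REST calls you would expect to see:

using System;
using System.IO;
using System.Text;
using Microsoft.WindowsAzure.Storage;

// placeholder - your storage account connection string
var account = CloudStorageAccount.Parse("$CONNECTION_STRING");
var container = account.CreateCloudBlobClient().GetContainerReference("demo");

// PUT https://<account>.blob.core.windows.net/demo?restype=container
container.CreateIfNotExists();

var blob = container.GetBlockBlobReference("hello.txt");

// PUT https://<account>.blob.core.windows.net/demo/hello.txt
using (var stream = new MemoryStream(Encoding.UTF8.GetBytes("Hello Blob Storage")))
{
    blob.UploadFromStream(stream);
}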

I have pushed the demo code to GitHub and will be starting a series of posts where I will walk through each of the demos and the HTTP traffic observed via Fiddler.

image - demo code on GitHub

Deploying Linux VMs to Azure from VM Depot

You can currently add Linux Virtual Machines to Windows Azure via the Windows Azure Portal from base OS images of the following distributions: CentOS, SUSE and Ubuntu. But it is then up to you to install and configure any additional software you require.

With the announcement of Microsoft Open Technologies’ VM Depot you now have access to a community-driven catalog of preconfigured operating systems, applications, and development stacks that can easily be deployed on Windows Azure.

VM Depot by Microsoft Open Technologies

What do we need to get started?

The help on VM Depot indicates that the Windows Azure command line tool is the mechanism to use when deploying images from VM Depot to Windows Azure. This tool is available for Mac and Linux, but I wanted to deploy from my Windows machine. The only command line tools available for Windows via the Windows Azure Tools Download page are the Windows Azure PowerShell cmdlets, which do not support the new community virtual machine image switch.

Thankfully I happened across Sandrino Di Mattia’s post Using the Windows Azure CLI on Windows and from within Visual Studio which was exactly what I was after.

So – if you are doing the deployment from a Mac or Linux box, download the command line tool from the Windows Azure website. If you are doing the deployment from a Windows box, continue reading to install the Windows version of the command line tool. All steps after this installation will work for all platforms since we will be leveraging a common tool.

Download the command line tools for Windows and install them.

Windows Azure Command Line Tools Setup

This will install the tools to the following location and configure your PATH environment variable.

Installation Location

Firing up a Command Prompt and typing azure will provide usage information for the command line tool.

Azure CLI command prompt

Typing the following at the command line shows that the option we require for creating a virtual machine from a community image is available in the Windows version of the command line tool.

azure vm create --help

Azure CLI vm create with community option

Now we are ready to begin.

Configuring the command line tool

As with all the Windows Azure tools, we’ll need to ensure our management certificates are configured in the tool so that we can interact with the Windows Azure management services.

Type the following at the command line to download your publishsettings file.

azure account download

This file contains all the information for your subscriptions and management certificates. This command will fire up a browser and will require you to authenticate with your Windows Azure credentials if you are not currently logged in. The publishsettings file will then be downloaded.

Azure CLI download publish settings

Once the file has downloaded, type the following at the command line, replacing $FILENAME with the path to the publishsettings file just downloaded. You can see that mine was downloaded to C:\Users\Paul\Downloads\.

azure account import "$FILENAME"

Azure CLI import publish settings

You can see that I have 2 subscriptions, a personal subscription and an MSDN subscription. Next we’ll set the subscription that we wish to work with when creating our virtual machine.

Type the following at the command line, replacing $SUBSCRIPTION with your subscription name. You can see that I have selected my MSDN subscription.

azure account set "$SUBSCRIPTION"

Azure CLI set account

Selecting a VM from the catalog

I selected the Solr 4.0.0-0 (Ubuntu 12.10) image from the catalog available on VM Depot and clicked on the Deployment Script button at the top of the page.

VM Depot Solr 4.0.0 (Ubuntu 12.10) Image

After selecting the Deployment Region from the drop down, I was presented with a command line script to create my Solr virtual machine. Keep this handy.

azure vm create DNS_PREFIX -o vmdepot-30-1-1 -l "West US" USER_NAME [PASSWORD] [--ssh] [other_options]

VM Depot Deployment Script

Create a storage account

We’ll need to create a storage account that will hold the virtual machine’s OS Disk (vhd). Type the following at the command line, replacing $REGION with the data centre region you have selected and $VMNAME with the DNS name you are giving your virtual machine.

azure account storage create --location "$REGION" "$VMNAME"

Azure CLI create storage account for image and vhd

You can see that I have chosen West US as my region and pbsolr4 as the DNS name for my virtual machine. You can also confirm the creation of the storage account in the portal.

Portal - verify storage account created

To ensure that this storage account is used for your virtual machine, set it as the default storage account for the interactions to follow on the command line.

Type the following at the command line, replacing $STORAGEACCOUNT with the name of your storage account. You can see that my storage account name is pbsolr4.

azure config set defaultStorageAccount "$STORAGEACCOUNT"

Azure CLI set storage account as default

Creating the VM

We’ll now run the deployment script that the VM Depot site gave us. I’ll repeat it here.

azure vm create DNS_PREFIX -o vmdepot-30-1-1 -l "West US" USER_NAME [PASSWORD] [--ssh] [other_options]

I ran the command line script with an additional parameter --vm-size small to specify my virtual machine size. My complete command line script is as follows:

azure vm create pbsolr4 -o vmdepot-30-1-1 -l "West US" paul --ssh --vm-size small

Azure CLI VM created

Unlike the PowerShell cmdlets, which honour my default storage account, it seems the Windows version of the command line tool decided to use the first storage account (paulbouwerbacpac) it found in my subscription and not the storage account I specified! I hope this gets fixed sometime.

You can see the image copied from VM Depot under the Images tab of the Virtual Machines section in the portal. As can be seen from the command line output above, this image is deleted once the virtual machine creation is complete.

Portal - verify image created in storage account

If you have a look at the Disks tab of the Virtual Machines section in the portal, you’ll see the OS Disk (vhd) once your virtual machine has been created.

Portal - verify vhd created in storage account

And finally, the virtual machine is running. This can be confirmed via the portal.

Portal - verify VM running

Create an http endpoint

Typing the following at the command line shows that our ssh public endpoint has been created as per the --ssh switch when we created our virtual machine.

azure vm endpoint list pbsolr4

Solr is web based, so we’ll need to add an http public endpoint too. Typing the following at the command line will result in a public endpoint being created on port 80, routed internally to port 80, and named http.

azure vm endpoint create pbsolr4 80 80 --endpoint-name "http"

Azure CLI create http endpoint

The creation of this new endpoint can be confirmed in the portal by browsing to the Endpoints tab of our virtual machine.

Portal - verify endpoints

It’s running!

Browsing to pbsolr4.cloudapp.net/solr returns the Solr dashboard. Solr is up and running and our public endpoint on port 80 (http) is working.

Solr 4 running

I can also SSH to pbsolr4.cloudapp.net on port 22 (ssh).

SSH into pbsolr4

That was too easy! There are loads of images currently available and I expect lots more to follow. VM Depot and their partners are making it really easy to get up and running quickly with these pre-configured environments. I’m impressed!