Category Archives: Windows Azure How To

Stupid Azure Trick #4 – C#, Node.js, and Python side-by-side – Three Simple Command Line Tools to Copy Files up to Windows Azure Blob Storage

Windows Azure has a cloud file storage service known as Blob Storage.

[Note: Windows Azure Storage is broader than just Blob Storage, but in this post I will ignore its sister services Table Storage (a NoSQL key/value store) and Queues (a reliable queuing service).]

Before we get into the trick, it is useful to know a bit about Blob Storage.

The code below is very simple – it uploads a couple of files to Blob Storage. The files being uploaded are JSON, so the code sets the HTTP Content-Type appropriately and configures caching. Then it lists the files in that particular Blob Storage container (where a container is like a folder or subdirectory in a regular file system).

The code listed below will work nicely on a Windows Azure Dev-Test VM, or on your own desktop. Of course you need a Windows Azure Storage Account first, and the storage credentials. (New to Azure? Click here to access a free trial.) But once you have those, the coding is straightforward.

  • For C#: create a Windows Console application and add the NuGet package named “Windows Azure Storage”
  • For Node.js: run “npm install azure” (or “npm install azure --global”)
  • For Python: run “pip install azure” to get the SDK
  • We don’t cover it here, but you could also use PowerShell, the cross-platform CLI, or the REST API directly.

Note: these are command line tools, so there isn’t a web project with config values for the storage keys. In lieu of that, I read the storage key from a text file on the file system. Storage credentials should be stored safely, regardless of which computer they are used on; be aware that my demonstration uses only public data, so a leaked key in this case would be less damaging than it might be for your application.

Here’s the code. Enjoy!

using System;
using System.Diagnostics;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

internal class Program
{
    private static void Main(string[] args)
    {
        var storageAccountName = "azuremap";
        // storage key in file in parent directory called <storage_account_name>.storagekey
        var storageAccountKey = File.ReadAllText(String.Format("d:/dev/github/{0}.storagekey", storageAccountName));
        //Console.WriteLine(storageAccountKey);
        var storageContainerName = "maps";

        var creds = new StorageCredentials(storageAccountName, storageAccountKey);
        var storageAccount = new CloudStorageAccount(creds, useHttps: true);
        var blobClient = storageAccount.CreateCloudBlobClient();
        var container = blobClient.GetContainerReference(storageContainerName);

        string[] files = {"azuremap.geojson", "azuremap.topojson"};
        foreach (var file in files)
        {
            CloudBlockBlob blockBlob = container.GetBlockBlobReference(file);
            var filepath = @"D:\dev\github\azuremap\upload\" + file;
            blockBlob.UploadFromFile(filepath, FileMode.Open);
        }

        Console.WriteLine("Directory listing of all blobs in container {0}", storageContainerName);
        foreach (IListBlobItem blob in container.ListBlobs())
        {
            Console.WriteLine(blob.Uri);
        }

        if (Debugger.IsAttached) Console.ReadKey();
    }
}
(upload.cs)
var azure = require('azure');
var fs = require('fs');

var storageAccountName = 'azuremap'; // storage key in file in parent directory called <storage_account_name>.storagekey
var storageAccountKey = fs.readFileSync('../../%s.storagekey'.replace('%s', storageAccountName), 'utf8');
//console.log(storageAccountKey);
var storageContainerName = 'maps';

var blobService = azure.createBlobService(storageAccountName, storageAccountKey, storageAccountName + '.blob.core.windows.net');

var fileNameList = [ 'azuremap.geojson', 'azuremap.topojson' ];
for (var i = 0; i < fileNameList.length; i++) {
    var fileName = fileNameList[i];
    console.log('=> ' + fileName);
    blobService.createBlockBlobFromFile(storageContainerName, fileName, fileName,
        { contentType: 'application/json', cacheControl: 'public, max-age=3600' }, // max-age units is seconds, so 31556926 is 1 year
        function(error) {
            if (error) {
                console.error(error);
            }
        });
}

blobService.listBlobs(storageContainerName,
    function(error, blobs) {
        if (error) {
            console.error(error);
        }
        else {
            console.log('Directory listing of all blobs in container ' + storageContainerName);
            for (var i in blobs) {
                console.log(blobs[i].name);
            }
        }
    });
(upload.js)
from azure.storage import *

storage_account_name = 'azuremap' # storage key in file in parent directory called <storage_account_name>.storagekey
storage_account_key = open(r'../../%s.storagekey' % storage_account_name, 'r').read()
#print(storage_account_key)

blob_service = BlobService(account_name=storage_account_name, account_key=storage_account_key)

storage_container_name = 'maps'
blob_service.create_container(storage_container_name)
blob_service.set_container_acl(storage_container_name, x_ms_blob_public_access='container')

for file_name in [r'azuremap.geojson', r'azuremap.topojson']:
    myblob = open(file_name, 'r').read()
    blob_name = file_name
    blob_service.put_blob(storage_container_name, blob_name, myblob, x_ms_blob_type='BlockBlob')
    blob_service.set_blob_properties(storage_container_name, blob_name, x_ms_blob_content_type='application/json', x_ms_blob_cache_control='public, max-age=3600')

# Show a blob listing which now includes the blobs just uploaded
blobs = blob_service.list_blobs(storage_container_name)
print("Directory listing of all blobs in container '%s'" % storage_container_name)
for blob in blobs:
    print(blob.url)
# format for blobs is: <account>.blob.core.windows.net/<container>/<file>
# example blob for us: pytool.blob.core.windows.net/pyfiles/clouds.jpeg
(upload.py)

Useful Links

Python

http://research.microsoft.com/en-us/projects/azure/an-intro-to-using-python-with-windows-azure.pdf

http://research.microsoft.com/en-us/projects/azure/windows-azure-for-linux-and-mac-users.pdf

http://www.windowsazure.com/en-us/develop/python/

SDK Source for Python: https://github.com/WindowsAzure/azure-sdk-for-python

Node.js

http://www.windowsazure.com/en-us/develop/nodejs/

SDK Source for Node.js: https://github.com/WindowsAzure/azure-sdk-for-node

http://www.windowsazure.com/en-us/documentation/articles/storage-nodejs-how-to-use-blob-storage/

C#/.NET

http://www.windowsazure.com/en-us/develop/net/

Storage SDK Source for .NET: https://github.com/WindowsAzure/azure-storage-net

Storage Client Library 3: http://msdn.microsoft.com/en-us/library/dn495001%28v=azure.10%29.aspx

[This is part of a series of posts on #StupidAzureTricks, explained here.]


Stupid Azure Trick #3 – Create a Dev Virtual Machine in Windows Azure

“Everyone” knows about using cloud services for running web applications and databases. For example, Windows Azure offers a bevy of integrated compute, storage, messaging, monitoring, networking, identity, and ALM services across its world-wide data centers.

But what about the idea of leveraging the cloud for software development and testing? Of course there is great productivity in using hosted services for a lot of the ancillary tasks in software development – source control, issue tracking, and so on. Example cloud solutions for source control would include two that I use regularly, GitHub and Team Foundation Service (TFS). But what about for hands-on software development – creating, running, testing, and iterating on code?

There are really two significant ways you can go here. One way – that I will not be drilling into – is to use a cloud-hosted, web browser-based development environment. This is what’s going on with Monaco, which is a cloud-hosted version of Visual Studio that runs entirely in a web browser – but (very awesomely) integrates with Windows Azure. There are also third parties playing in this space, such as Cloud9.

The other way – the one I am going to drill into – is using a Windows Azure Virtual Machine for certain development duties.

[Making a case for when and why one might create a dev-test environment in the cloud will be left for another time…]

With great power comes great responsibility

Spiderman knows this, and you need to know it as well.

Virtual Machines in the cloud cost money while they are deployed. It is your great responsibility to turn them off when you don’t need them.

The pricing for “normal” virtual machines (as opposed to MSDN Pricing which is described below) is listed at http://www.windowsazure.com/en-us/pricing/details/virtual-machines/. For example, at the time of this writing, the price for a Windows Server VM ranges from $0.02 (two cents) to $1.60 per hour, while the price for a Windows Server VM with SQL Server ranges from $2.92 to $7.40 per hour. The $7.40/hour VM is an instance running on a VM with 8 cores and 56 GB of RAM.

NOTE: just before publication time, Windows Azure announced some even larger “compute-intensive” VMs, A8 and A9 sizes. The A9 costs $4.90 per hour and sports 16 cores, 112 GB of memory, and runs on a “40 Gbit/s InfiniBand network that includes remote direct memory access (RDMA) technology for maximum efficiency of parallel Message Passing Interface (MPI) applications. […] Compute-intensive instances are optimal for running compute and network-intensive applications such as high-performance cluster applications, applications using modeling, simulation and analysis, and video encoding.” Nice! These are available for VMs in Cloud Services, and I would expect them to become available for all VMs in due course.

Some VMs cost more per hour (I’m looking at you, BizTalk Server) and some costs are as yet unknown (such as for Oracle databases, which are in preview and for which production pricing has yet to be revealed).

VM prices vary for two reasons: (a) resources allocated (e.g., # of cores, how much RAM) and (b) licensing. For the same sized VM, one running SQL Server will cost more than one running Windows Server only. This is a feature – for example, you can rent a SQL Server license for 45 minutes if you like.

Of course, while inexpensive, and nearly inconsequential in small quantities, these prices can add up if you use a lot of VM hours. The good news is, you can release VM resources when you are not using them. You don’t incur compute charges while the VM is not deployed, though there is a small storage cost for its disks that starts at $0.07 (seven cents) per GB per month.

Just don’t forget to free your resources before leaving for vacation.

Fortunately, VMs can easily be stopped in the portal, by using the Remove-AzureVM PowerShell cmdlet, by using the azure vm shutdown command from the cross-platform CLI, through management REST APIs, or using one of the language SDKs.

Example prices were expressed in terms of “per hour” but the pricing granularity is actually by the minute. In some clouds, usage granularity is hourly, or possibly “any part of the hour,” meaning a VM deployed from, say, 7:50 to 8:10 would incur 120 minutes of billing (two hours), even though the actual time was 20 minutes. In Azure, you would be billed for 20 minutes. The billing granularity matters more when using VMs for focused tasks, as developers and testers tend to do.
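To make that arithmetic concrete, here is a tiny C# sketch contrasting the two billing models. The 7:50-to-8:10 window is the example above; the $0.06/hour rate is just an illustrative small-VM figure (not an official price), and the class name is made up.

using System;

class BillingGranularityExample
{
    static void Main()
    {
        const decimal hourlyRate = 0.06m;                    // illustrative small-VM rate
        var start = new DateTime(2014, 1, 20, 7, 50, 0);     // VM deployed at 7:50
        var end   = new DateTime(2014, 1, 20, 8, 10, 0);     // VM released at 8:10

        // Per-minute granularity (Azure): bill only the 20 minutes actually used.
        decimal minutesUsed = (decimal)(end - start).TotalMinutes;
        decimal perMinuteBill = hourlyRate * minutesUsed / 60m;

        // "Any part of the hour" granularity: every clock hour touched is billed in full.
        // 7:50-8:10 touches both the 7 o'clock and 8 o'clock hours, so 2 hours are billed.
        int hoursTouched = (int)((new DateTime(end.Year, end.Month, end.Day, end.Hour, 0, 0)
                                - new DateTime(start.Year, start.Month, start.Day, start.Hour, 0, 0)).TotalHours) + 1;
        decimal perHourBill = hourlyRate * hoursTouched;

        Console.WriteLine("Billed per minute:       ${0:F3}", perMinuteBill); // $0.020
        Console.WriteLine("Billed per touched hour: ${0:F3}", perHourBill);   // $0.120
    }
}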

Further, there’s a data transfer price for data leaving the data center.

You may be interested in Windows Azure Billing Alerts.

MSDN Pricing – A Big Cloudy Discount

If you have an MSDN account (not just for big companies – startups have them too) – as long as you claim your Azure benefits – magically, you are eligible for special MSDN Pricing. Check for the current MSDN discounted pricing, but as of this writing MSDN includes either $50, $100, or $150 of Azure credits per month, depending on your level of MSDN. Anyone on your team with an MSDN account will have their own Azure credits.

This means that your monthly bill will draw from this balance before you incur actual costs. You can also choose to configure the account to not allow overages, such that when your monthly allotment is exhausted, consumption stops. This way you know your credit card will not be charged. You can selectively re-enable it for the rest of the month. This is not a bad default setting to avoid runaway dev-test costs due to forgetting to turn off resources when you didn’t need them.

Beyond this, you get a huge discount on other VMs – no matter what the VM is, you never pay more than $0.06 per hour per small-VM unit. (At that rate, a $150 monthly credit covers 2,500 small-VM hours.)

MSDN pricing only applies to resources used for Dev-Test – it is not licensed for production use, nor does it come with an SLA.

But that’s such a good deal that anyone using Windows Azure for Dev-Test should take a hard look at this option if they don’t already have an MSDN account. But this post is all about creating a Dev-Test VM, so let’s get on with it.

Creating a Dev-Test Virtual Machine in Windows Azure

Let’s set up for C#, Python, and Node.js development.

First, log into your Windows Azure account at https://manage.windowsazure.com.

[Portal screenshots: creating a new Virtual Machine]

If the MSDN checkbox is disabled, you have logged into a Windows Azure account that is not associated with your MSDN account. Change to the correct account to proceed.

Select the MSDN checkbox to filter out any VM image not specific to MSDN subscribers, and see the list of available VM images change to the following:

image

Note the descriptive text on the right-hand side, which I’ve included here since it provides some useful information.

“The Visual Studio Professional 2013 developer desktop is an offering exclusive to MSDN subscribers. The image includes Visual Studio Professional 2013, SharePoint 2013 Trial, SQL Server 2012 Developer edition, Windows Azure SDK for .NET 2.2 and configuration scripts to quickly create a development environment for Web, SQL and SharePoint 2013 development.

To learn how to configure any development environment you can follow the links on the desktop.

We recommend a Large VM size for SQL and Web development and ExtraLarge VM size for SharePoint development.

Please see http://go.microsoft.com/fwlink/?LinkID=329862 for a detailed description of the image. Privacy note: This image has been preconfigured for Windows Azure, including enabling the Visual Studio Experience Improvement Program for Visual Studio, which can be disabled.”

Choose one of the Visual Studio images (I will choose Visual Studio Professional 2013) and go to the next page by clicking the arrow at the bottom-right.

image

Fill in the fields. The username and password will be needed later to RDP into the box. Click the arrow to go to the next page.

image

I kept most of the defaults, only changing the REGION to be “East US” to minimize latency to my current location. Click arrow to go to next page.

If I planned to use this for giving a talk in another geographic location, I might choose a different region. For example, I might choose “North Europe” (Dublin) if I was speaking in Ireland (which would be wonderful and I hope happens some day :-)).

image

No changes on this page, so click check-mark to finish.

image

The portal will “think” for a short time, and then your new virtual machine will appear – listed under the name you gave it (“vspro-demo” for me), along with the corresponding cloud service that was created (“vspro-demo.cloudapp.net” for me), which also serves as the DNS name you’ll use to access it via RDP.

image

Once it finishes, you can select it and hit CONNECT. This will download a file that launches the RDP client, which will allow you to log in.

image

I usually check off “Don’t ask me again…” because I know this connection is fine.

image

Note that here you will want to click “Use another account” so you can specify your VM-specific credentials.

image

Click OK then…

image

I usually check off “Don’t ask me again…” because I know this connection is fine.

Now I’m in!

image

Configuring your Dev-Test Machine on Windows Azure

When configuring a new machine, there are many tools you may want to install. For this exercise, I will keep it simple. (The following uses my handy “which” function in PowerShell to find the locations of commands in the path. If you add “which” to your environment, be sure to close your PowerShell shell and open a new one so that the new $PROFILE is processed. If you choose not to install “which”, issue the same commands anyway and you should just get errors instead.)

With a PowerShell shell, let’s investigate what we have on a new machine.

image

We can see, in turn, that:

  • While PowerShell is installed (we are running in a PowerShell shell), there are no PowerShell cmdlets with “Azure” in the name.
  • Node.js is not found (no Node Package Manager (npm) and no Node runtime (node)).
  • The cross-platform (xplat) Command Line Interface (CLI) is not installed. This has Node.js as a dependency.
  • No Python interpreter is installed.
  • The Web Platform Installer actually is installed, so let’s use that to add the other pieces to our development environment.

image

After filtering, in succession (using the search box at the top-right)…

.. on PowerShell:

image

Click the “Add” button to add the latest “Windows Azure PowerShell” release.

.. on Cross-platform:

image

Click the “Add” button to add the latest “Windows Azure Cross-platform Command Line Tools” release.

and .. on Python:

image

Click the “Add” button to add the latest “Windows Azure SDK for Python” release.

image

Click the “Add” button to add the latest “Python Tools 2.0 for Visual Studio 2013” release. This includes some really cool python tooling for Visual Studio, though we won’t discuss it further in this post.

Now click the “Install” button to start the installation.

image

You can accept all the licensing with one click.

The installation will download and install the items you selected, including any dependencies.

[Installation progress screenshots, including the Python distribution compiling as part of the installation…]

Installation is complete.

Verifying the Installation

Open a new PowerShell Window to explore once again.

image

Note that we ran the “get-help azure” command through a filter (the Measure-Object cmdlet, which was used to count lines) since output would otherwise not have fit on one screen (there are a couple of hundred Azure cmdlets in the list). Of npm, node, azure, and python, only azure (via azure.cmd, the entry point to the CLI) shows up in our path. This is okay, since we can now run azure at the command line and it knows where to find Node.js.

image

As for Python, it is now installed at c:\python27\python.exe. We can either add c:\python27 to our path, or invoke it explicitly using the full path. For our simple example, we’ll just invoke it explicitly. To see that the Windows Azure SDK for Python is installed, we can use pip, a Python package manager, to list the installed packages.

image

We can see that “azure (0.7.1)” is installed.

Done. Now go write some Python, Node, or C# code!

Useful Links

[This is part of a series of posts on #StupidAzureTricks, explained here.]

Start Windows Azure Storage Emulator from a Shortcut

When building applications to run on Windows Azure you can get a lot of development and testing done without ever leaving your developer desktop. Much of this is due to the convenient fact that much code “just works” on Windows Azure. How can that be, you might wonder? Running on Windows Azure in many cases amounts to nothing different from running on Windows Server 2012 (or Linux, should you choose). In other words, most generic PHP, C#, C++, Java, Python, and <your favorite language here> code just works.

Once your code starts accessing specific cloud features, you face a choice: access those services in the cloud, or use the local development emulator. You can access most cloud services directly from code running on your developer desktop – it usually just amounts to a REST call under the hood (with some added latency from desktop to cloud and back) – it is an efficient and effective way to debug. But the development emulator gives you another option for certain Windows Azure cloud services.

A common use case for the local development emulator is web applications – built with ASP.NET, ASP.NET MVC, or Web API – that run either in a Cloud Service or just in a Web Site. The distinction matters because when you debug a Cloud Service, Visual Studio will start the Storage Emulator automatically, but this will not happen if you are debugging web code that does not run from a Cloud Service. So if your web code is accessing Blob Storage, for example, when you run it locally you will get a timeout when it attempts to access Storage – unless you ensure that the Storage Emulator has been started. Here’s an easy way to do this. (Normally, you only need to do this once per login, since the emulator keeps running until you stop it.)
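For reference, if your code uses the .NET storage client library, targeting the emulator rather than a real storage account is just a matter of the connection string. A minimal console-style sketch (the class and container names here are made up) that will fail or time out unless the Storage Emulator is running:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class EmulatorSmokeTest
{
    static void Main()
    {
        // "UseDevelopmentStorage=true" targets the local Storage Emulator
        // (equivalent to CloudStorageAccount.DevelopmentStorageAccount).
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");

        var blobClient = account.CreateCloudBlobClient();
        var container = blobClient.GetContainerReference("testcontainer"); // hypothetical container name
        container.CreateIfNotExists();

        Console.WriteLine("Emulator reachable; container URI: {0}", container.Uri);
    }
}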

In my case, it was very convenient to have a shortcut that I could click to start the Storage Emulator on occasion. Here’s how to set it up. I’ll explain it as a shortcut (such as on a Windows 8 desktop), but the key step is very simple and easily used elsewhere.

Creating the Desktop Shortcut

  1. Right-click on a desktop
  2. From pop-up menu, choose New –> Shortcut      image
  3. You get a dialog box asking about what you’d like to create a shortcut for:image
  4. HERE’S IMPORTANT PART 1/2: click the Browse button and navigate to wherever your Windows Azure SDK is installed, then drill in until you find csrun.exe: image
  5. In my case this places the path "C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe" into the text field.
  6. HERE’S IMPORTANT PART 2/2: Now after the end of the path (after the second double quote) add the parameter /devstore:start which indicates to start up the Storage Emulator.
  7. Click Next to reach the last step – naming the shortcut: image
  8. Perhaps change the name of the shortcut from the default (csrun.exe) to something like Start Storage Emulator: image
  9. Done! Now you can double-click this shortcut to fire up the Windows Azure Storage Emulator: image 

On my dev computer, the path to start the Windows Azure Storage Emulator was: "C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe" /devstore:start

Now starting the Storage Emulator without having to use a Cloud Service from Visual Studio is only a double-click away.
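If you would rather not click at all – say, at the start of an integration-test run – the same command can be launched from code. Here is a minimal C# sketch using the csrun.exe path from my machine shown above (the class name is made up; adjust the path to wherever your SDK is installed):

using System.Diagnostics;

class StartStorageEmulator
{
    static void Main()
    {
        // Same command as the desktop shortcut: csrun.exe /devstore:start
        var psi = new ProcessStartInfo(
            @"C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe",
            "/devstore:start");

        var process = Process.Start(psi);
        process.WaitForExit(); // returns quickly; the emulator keeps running in the background
    }
}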

RELATED

Azure FAQ: How to Use .NET 4.5 with Windows Azure Cloud Services?

Microsoft released version 4.5 of its popular .NET Framework in August 2012. This framework can be installed independently on any compatible machine (check out the .NET Framework Deployment Guide for Administrators) and (for developers) comes along with Visual Studio 2012.

Windows Azure Web Sites also support .NET 4.5, but what is the easiest way to deploy a .NET 4.5 application to Windows Azure as a Cloud Service? This post shows how easy this is.

Assumption

This post assumes you have updated to the most recent Windows Azure Tools for Visual Studio and the latest SDK for .NET.

For any update to a new operating system or new SDK, consult the Windows Azure Guest OS Releases and SDK Compatibility Matrix to understand which versions of operating systems and Azure SDKs are intended to work together.

You can do this with the Web Platform Installer by installing Windows Azure SDK for .NET (VS 2012) – Latest (best option) – or directly here (2nd option since this link will become out-of-date eventually).

Also pay close attention to the release notes, and don’t forget to Right-Click on your Cloud Service, hit Properties, and take advantage of some of the tooling support for the upgrade:

[Screenshot: UpgradeFall2012]

Creating New ASP.NET Web Role for .NET 4.5

Assuming you have up-to-date bits, a File | New from Visual Studio 2012 will look something like this:

image

Select a Cloud project template, and (the only current choice) a Windows Azure Cloud Service, and be sure to specify .NET Framework 4.5. Then proceed as normal.

Updating Existing ASP.NET Web Role for .NET 4.5

If you wish to update an existing Web Role (or Worker Role), you need to make a couple of changes in your project.

First, update the Windows Azure Operating System version to use Windows Server 2012. This is done by opening your Cloud project (pageofphotos in the screen shot) and opening ServiceConfiguration.Cloud.cscfg.

image

Change the osFamily setting to be “3” to indicate Windows Server 2012.

   osFamily="3"

As of this writing, the other allowed values for osFamily are "1" and "2", indicating Windows Server 2008 SP2 and Windows Server 2008 R2 (or R2 SP1) respectively. The up-to-date settings are here.

Now you are set for your operating system to include .NET 4.5, but none of your Visual Studio projects have yet been updated to take advantage of this. For each project that you intend to update to use .NET 4.5, you need to update the project settings accordingly.

image

First, select the project in the Solution Explorer, right-click on it, and choose Properties from the pop-up menu. That will display the screen shown. Now simply select .NET Framework 4.5 from the available list of Target framework options.

If you open an older solution with the newer Azure tools for Visual Studio, you might see a message something like the following. If that happens, just follow the instructions.

[Screenshot: WindowAzureTools-dialog-NeedOct2012ToolsForDotNet45]

That’s it!

Now when you deploy your Cloud Service to Windows Azure, your code can take advantage of .NET 4.5 features.

Troubleshooting

Be sure you get all the dependencies correct across projects. In one project I migrated, the following warning came up because I had a mix of projects: some needed to stay on .NET 4.0, while those parts deployed to the Windows Azure cloud could move to 4.5. If you don’t get this quite right, you may get a compiler warning like the following:

Warning  The referenced project ‘CapsConfig’ is targeting a higher framework version (4.5) than this project’s current target framework version (4.0). This may lead to build failures if types from assemblies outside this project’s target framework are used by any project in the dependency chain.    SomeOtherProjectThatReferencesThisProject

The warning text is self-explanatory: the solution is to not migrate that particular project to .NET 4.5 from .NET 4.0. In my case, I was trying to take advantage of the new WIF features, and this project did not have anything to do with Identity, so there was no problem.

Two Roles and a Queue – Creating an Azure Service with Web and Worker Roles Communicating through a Queue

Two Roles and a Queue Lab from Boston Azure Firestarter

At the Firestarter event on May 8, 2010, I spoke about Roles and Queues and worked through a coding lab on same. The final code is available in a zip file. The Boston Azure Firestarter – Bill Wilder – Roles and Queues deck can be downloaded – though since there were so many questions we didn’t get to cover many of the slides! – this was a hot topic!

The remainder of this post contains the narrative for the LAB we did as a group at the Firestarter. It probably will not stand alone super well, but may be of interest to some folks, so I’ve posted it.

The following procedure assumes Microsoft Visual Web Developer 2010 Express on Windows 7. The same general steps apply to Visual Studio 2008, Visual Studio 2010, and Web Developer 2008 Express versions, though details will vary.

0. Open Microsoft Visual Web Developer 2010 Express and select File | New Project

1. Select Windows Azure Service and click Okay:

image

If you have trouble finding the Windows Azure Service template, you can type “Azure” into the search box in the top-right to narrow the options. Also, if you don’t have the Windows Azure SDK installed, you will need to install that before proceeding – but there will be a link provided by Visual Web Developer 2010 Express that will direct you to the right page. Install it if you need to and try again up to this point.

2. You will see a special dialog box for New Cloud Service Project from which you will add both a Web Role

image

and a Worker Role

image

Verify that both WebRole1 and WorkerRole1 are in the list on the right side, then click OK.

3. Before you begin making code changes, you can run your new application. You can run it in the debugger by pressing the F5 key.

You will probably get the following error message:

image

The error message is telling you that you need to close Visual Web Developer 2010 Express and restart it with elevated privileges.

4. To start any Windows program with elevated privileges, right-click on the application then choose Run as administrator from the pop-up menu:

image

Before it obeys your request to run as administrator, Windows 7 will double-check by popping up a security dialog.

Now you can reload your project and try running it again. The app should run and you should see a blank web browser page.

5. Once you’ve proven your application runs, it is time to make some changes.

Make the code changes indicated for the Two Roles and A Queue Lab in CODING STEP 1.

Note: the “coding step 1” and future coding steps were handouts (paper!) at the Boston Azure Firestarter on Sat May 8, 2010. In lieu of reproducing them here, I will post the final solution.

This lab will establish some WebRole basics.

6. When done applying CODING STEP 1, run the application again.

7. After demonstrating your application runs, Deploy it to Azure.

This is a simple application so it helps us get through the initial deployment with minimal challenges.

8. Apply CODING STEP 2 – Add Queue (in local dev fabric storage)

9. CODING STEP 3 – Add “DumpQueue” method and “FirestarterWebRoleHelpers.cs”

image

You will get the following dialog box – type “code file” into the search area on the top-right, select Visual C# Code File, and type in the filename “FirestarterWebRoleHelpers.cs” as shown and click Add:

image

The new file “FirestarterWebRoleHelpers.cs” will open in the editor. It should be empty to begin with. Cut and Paste in the contents from http://bostonazure.org/files/FirestarterWebRoleHelpers.cs.txt.

Why? The content of this file has little to do with Windows Azure, so we don’t want to focus on it. But we want to use some utility routines from it so that we can focus on Azure concepts.

10. CODING STEP 4 – Adding Cloud-based Queue

First we need to configure the cloud.

Go to http://windows.azure.com and log in. You may wish to consult instructions on redeeming a token at https://blog.codingoutloud.com/2010/05/06/redeeming-an-azure-token/ or http://bit.ly/dgCuMn

image

Your storage account has a subdomain, as circled above. This – and the Access Key – need to be added to your Web Role and Worker Role so that they can access cloud-hosted storage (and share the same queue within it).

Right-click in Visual Studio on the WebRole1, select Properties, and select the Settings tab on the left. It will appear something like this:

image

Now click on Add Setting and give the new item the name “DataConnectionString”, the Type “Connection String”, and click on the “…”

image

This will bring up the Storage Connection String editor – fill in the fields – where your “Account name” is the same as the subdomain shown on the Storage Service (see above – in that screen shot it is “bostonazurequeue”) and the Key can be either Primary or Secondary Access Key (from same area in the Azure Portal):

image

You are NOT DONE with this screen yet. Also add a Setting named “StatusUpdateQueueName” – of Type “String” – with Value “updatemessagequeue1” as follows:

image

Click OK.

11. Now REPEAT BOTH STEPS for WorkerRole1.

Yes, add both Settings also to WorkerRole1 – they both will end up with the same settings. You can “cheat” with cut and paste in the .cscfg and .csdef files.

12. Enable Cloud-hosted Queue from Web Role

Now you are ready to go on and make the code changes to use this new configuration item.

Apply CODING STEP 4: Enabling the Cloud-hosted Queue from the Web Role
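The coding-step handouts are not reproduced here, but the essence of the Web Role side looks something like the following sketch. Note this is written against the current storage client library rather than the 2010-era StorageClient used at the Firestarter, so method names differ slightly; the class and method names are illustrative, not the lab’s actual code.

using Microsoft.WindowsAzure;                    // CloudConfigurationManager
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public class StatusUpdateSender
{
    // Called from the Web Role (e.g., a button-click handler) to push a message
    // onto the shared queue for the Worker Role to process.
    public static void Send(string statusText)
    {
        // Both settings were added to the role configuration in the previous step.
        var account = CloudStorageAccount.Parse(
            CloudConfigurationManager.GetSetting("DataConnectionString"));
        var queueName = CloudConfigurationManager.GetSetting("StatusUpdateQueueName"); // "updatemessagequeue1"

        CloudQueueClient queueClient = account.CreateCloudQueueClient();
        CloudQueue queue = queueClient.GetQueueReference(queueName);
        queue.CreateIfNotExists();                 // safe to call repeatedly

        queue.AddMessage(new CloudQueueMessage(statusText));
    }
}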

Now run your application using cloud storage for the queue:

image

Note that you can also examine the contents of the queue online by visiting http://myAzureStorage.com and providing the same credentials you used when setting up the DataConnectionString above for both the Web and Worker roles.

13. Enable Cloud-hosted Queue from Worker Role

Now you are ALMOST ready to go on and make the code changes to use this new configuration item.

Before applying the coding step, we need to add a project reference (otherwise you won’t be able to resolve the networking classes used in FirestarterWorkerRoleHelpers). In Visual Studio on the right side, under the Solution Explorer, right-click on the References element underneath WorkerRole1 and select Add Reference, then from the .NET tab, select System.Web and click OK:

image

Also, similar to step 9 above, add a new Code File called “FirestarterWorkerRoleHelpers.cs” to hold some additional needed (but not core to Azure) code.

The new file “FirestarterWorkerRoleHelpers.cs” will open in the editor. It should be empty to begin with. Cut and Paste in the contents from http://bostonazure.org/files/FirestarterWorkerRoleHelper.cs.txt.

Now you can apply CODING STEP 5: Enabling the Cloud-hosted Queue from the Worker Role.
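Again, the handout itself is not reproduced here, but the Worker Role side boils down to a polling loop along these lines. Same caveat as before: this sketch uses the current storage client library and illustrative names, not the lab’s actual code.

using System;
using System.Threading;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public class WorkerRole : RoleEntryPoint
{
    public override void Run()
    {
        var account = CloudStorageAccount.Parse(
            CloudConfigurationManager.GetSetting("DataConnectionString"));
        CloudQueue queue = account.CreateCloudQueueClient()
            .GetQueueReference(CloudConfigurationManager.GetSetting("StatusUpdateQueueName"));
        queue.CreateIfNotExists();

        while (true)
        {
            // GetMessage returns null when the queue is empty.
            CloudQueueMessage message = queue.GetMessage();
            if (message == null)
            {
                Thread.Sleep(TimeSpan.FromSeconds(5));   // back off briefly before polling again
                continue;
            }

            Console.WriteLine("Processing: " + message.AsString);
            // ... do the real work here (the Firestarter version posted status updates) ...

            queue.DeleteMessage(message);                // remove it only after successful processing
        }
    }
}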

14. Deploy to the Staging Area in the Cloud

15. Cutover from Staging to Production

16. Add in secret Twitter posting code from your Worker Role…

Yes, this can be done by including a hash character (#) as part of the message you type into your web application.

Redeeming an Azure Token

At some select events (like Boston Azure Firestarter, Boston Azure User Group hands-on meeting, or even Protein Folding with Azure @home), Microsoft sometimes provides tokens for participants who wish to try out Windows Azure for real – by deploying real bits into the cloud – deploying multiple instances of Web Roles and Worker Roles, using Queue for scaling, storing data and blobs in Azure Storage and exercising SQL Azure… Some of the tokens are good for up to 4 weeks – which is awesomely convenient for really kicking the tires on Azure if you are a developer. Which I am… Here is a little guidance on getting your account set up once you have a token in hand.

Note that you will be interacting with the Windows Azure Developer Portal (or Dev Portal for short) to redeem your token and establish your temporary account. The Dev Portal is useful to learn about and get to know.

1. First visit http://windows.azure.com and log in with the provided credentials. Use the provided email address for your Windows Live ID.

(NOTE: If any of the images in this post are too small to read, click on them to see a larger version.)

image

2. You will see a screen like the following. Note the row with the light blue background; this background color only appears when your mouse is hovering there. Click on the Project Name that matches your token account name.

image

(Notice that the account owner is “waaccts@microsoft.com” – this is because you are using a Token. Azure supports having an overall account that pays the bills, then sub-accounts for developers. This is an example.)

3.  Now you are in! You can proceed to review some of the help resources listed, or click around on any of the tabs to the left. But to create a new application that you can host on the Azure cloud, you can click on the “New Service” link next to the green “+” sign.

image

4. After you choose “New Service” you will see the following. Note the two main options in the middle for Storage Account and Hosted Services.

image 

Select Hosted Services to begin. Be sure to click on the words “Hosted Services” as opposed to the “Learn More” link, as they are different.

5. The next page will ask you for a name – this name will only be used to help you identify this service from a list in the developer portal, so don’t spend too much time coming up with the perfect name. You don’t need to provide anything for the Description.

image

After providing a name, click Next.

6.  Now you are faced with a form where the choices you make actually do matter. Here’s what you’ll need to do:

image

Type in a “Public Service Name” – this will be the Internet-visible sub-domain from which your deployed application will be visible. For example, if you choose “foo” then your Azure Service will live at http://foo.cloudapp.net after you publish it.

After you settle on a Public Service Name (using the Check Availability button as needed), you also need to select a Region. Pick the “anywhere” region in your continent (or closest to your continent) such as Anywhere US and click Create.

Here’s what mine looked like before I clicked Create:

image

Now your Azure Service has been created.

7. You will see a screen inviting you to Deploy a Hosted Service Package. We won’t do that now (though you could if you had an application ready). Instead, we will create an Azure Storage Account. From here:

image

Click on the “New Service” link which is near the top-left – below the large Windows Azure logo – and you will see the same screen you saw in step 4:

 image

This time select Storage Account and you will see the following:

image

Give it a name, as I did in the screenshot, and click Next.

8. As in step 6, this is also an important choice, though not visible to humans visiting your site. You will need to know this address to program against it. Of course you can look it up in the Dev Portal at any time, but why not choose a logical name. Fill in the fields similar to step 6 – be sure to choose the same Region you chose in step 6 – and click Create.

image

9. You are now ready to build and deploy Azure applications that use Web Roles, Worker Roles, and various kinds of storage.

You will need the keys shown to programmatically access your storage.

image

You can always come back and look up the values of these keys, of course. Also, if a key is compromised, you can regenerate it easily, invalidating the prior one. There are two separate keys that can be used/invalidated independently. These keys are specific to this Storage Service you created; you can create more Storage Services with different keys and even use multiple of them together.
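For completeness, here is a minimal sketch of where the account name and one of those keys end up in code. This uses the current .NET storage client library (newer than the one available when this walkthrough was written), and the account name and key values are placeholders:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

class StorageKeyExample
{
    static void Main()
    {
        // Account name = the subdomain you chose for the Storage Service;
        // key = either the Primary or Secondary Access Key from the Dev Portal.
        var creds = new StorageCredentials("yourstorageaccount", "<primary-or-secondary-access-key>");
        var account = new CloudStorageAccount(creds, useHttps: true);

        var blobClient = account.CreateCloudBlobClient();
        foreach (CloudBlobContainer container in blobClient.ListContainers())
        {
            Console.WriteLine(container.Uri);
        }
    }
}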