Tag Archives: #StupidAzureTricks

Stupid Azure Trick #10 – Use SSL on MSDN Visual Studio Azure VMs

If you are trying to Embrace SSL During Development when authenticating with Azure Active Directory, you may run into a little glitch if you do so on one of those handy MSDN Dev/Test VMs in Azure.

The glitch is that when running SSL on the MSDN VM, the digital certificate backing the SSL endpoint isn’t quite right. Here is a description of what you might see, followed by a workaround (until it is fixed at the source in the VM image).

The Problem

Visual Studio 2013 uses IIS Express by default and offers a very simple experience for HTTPS locally:

  • Create a web application
  • Look at the properties for ‘WebApplication1’ and you’ll see an option SSL Enabled — by default it is false, but change it to true
  • By setting SSL Enabled to true, you will now have a value for SSL URL which is something like https://localhost:44300 or above (ports 44300-44399 are reserved for this I think, and the next new project gets the next available port – check out C:\Users\YOURACCOUNT\Documents\IISExpress\config\applicationhost.config to see the bindings that were set up; a sample binding appears just after this list)
  • Hit F5 to run, then navigate to the HTTPS URL. You get the “hey, this cert isn’t trusted!” warning, but otherwise it works fine — at least on the desktop. The behavior is different in the MSDN Visual Studio Azure VMs (NOTE: these are very specific VMs, described here – for those of you interested in taking advantage of those specially licensed VM resources associated with MSDN accounts).
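For reference, the HTTPS binding that gets added to applicationhost.config looks something like this (the site name and port numbers here are placeholders – yours will differ):

<site name="WebApplication1" id="2">
  <bindings>
    <binding protocol="http" bindingInformation="*:54321:localhost" />
    <binding protocol="https" bindingInformation="*:44300:localhost" />
  </bindings>
</site>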

Using MSDN Visual Studio Azure VMs, this developer experience does not quite work out of the box. SSL Enabled is set to true automatically when creating an ASP.NET app that uses Azure Active Directory for organizational authentication: create a new web app, simply click Change Authentication, select Organizational Accounts, set one up, proceed as normal, and hit F5. When your app runs, it will try to authenticate over HTTPS, and on one of these MSDN Visual Studio Azure VMs it fails with the untrusted-certificate problem described above.

The Solution

Follow these steps:

  1. RDP into your MSDN Visual Studio Azure VM
  2. Paste the following into a PowerShell window and run it:

     # find the self-signed 'CN=localhost' certificate that shipped on the VM
     $thumb = (dir Cert:\LocalMachine\my | Where-Object Subject -eq 'CN=localhost' | Select-Object Thumbprint -First 1).Thumbprint
     # delete it so that repairing IIS Express will regenerate a correct one
     if ($thumb -ne $null) { del Cert:\LocalMachine\my\${thumb} }
     # open Programs and Features, needed for the next step
     control /name Microsoft.ProgramsAndFeatures

     The above code will work in the default state of these VMs at this time, which assumes only a single certificate with Subject of ‘CN=localhost’ is present in the certificate store.

  3. In Programs and Features, right-click on IIS Express and select Repair.
  4. Celebrate your now functioning local F5-ready SSL experience.

 

[This is part of a series of posts on #StupidAzureTricks, explained here.]

Stupid Azure Trick #9 – Embrace SSL During Development when authenticating with Azure Active Directory

If you are developing applications that authenticate users or handle sensitive personal or business data, you should be using SSL for your whole site. That’s the most secure approach. Plain old HTTP is not gonna cut it, and flipping between HTTP and HTTPS exposes undesirable vulnerabilities.

So let’s suppose you are building a Windows Azure Web Site using ASP.NET MVC and you want to take advantage of Azure Active Directory for authentication. Maybe you create an Azure Active Directory account and add some users; now you are ready to use it for authentication within your application.

Using SSL during development will help you smoke out issues – cross-protocol warnings, for one – while also keeping your credentials secure on the wire (if you develop locally using AAD, logins still travel over the public internet). It’s just good hygiene. But there is a nuisance factor: by default, using SSL locally (in the latest tool stack for ASP.NET development) uses the SSL certificate that ships with IIS Express, and that’s not trusted by your web browser, so you get a warning every time. This tip will show you how to easily fix that. (To skip all the context and get right to the main point, search for the word ‘core’ below.)

Certificate Store on Windows

The Certificate Store on Windows (desktop and server) is a trusted location for storing digital certificates for all kinds of purposes, including those used by web browsers to decide whether to trust an SSL connection to a web site or to give a warning.

Only certificates that live in a special location in your local Windows Certificate Store – or digital certificates signed by those certificates (or in a signing chain) – are allowed to be used without a warning. This special location is called Trusted Root Certification Authorities. If your certificate is not in there, or itself was not signed by a certificate in there, and so on, then the browsers will show the users a stern warning.

You can view the certificates in your Trusted Root Certification Authorities store by running certmgr.msc from a command prompt. Here’s what it looks like on my machine.

image

We’ll come back to this tool later.

Create a Simple ASP.NET MVC app that authenticates with Azure Active Directory

You can skip this section if you already know how to do this. This is a quick walkthrough showing how to use Visual Studio 2013 to create simple ASP.NET MVC application and connect it to an existing Azure Active Directory. (You can easily create an AAD either from the Windows Azure portal, or outside it. You can also substitute an Office 365 directory since that automatically uses AAD.)

File | New Project, choose as below:

image

Click OK.

image

Click Change Authentication.

image

Select Organizational Accounts from the radio buttons on the left, and type in your AAD domain (could also be Office 365). Choose Single Sign On for Access Level for simple authentication, or choose Single Sign On, Read directory data if you also plan to use AAD for authorization (such as RBAC). Click OK.

After authenticating as a Global Administrator user on the specified domain, you will be back to your New ASP.NET Project dialog, though with a new value for Authentication setting.

image

Click OK. Now your project will be generated. If you display the Project Properties window for your project, as shown below, notice the configuration options for SSL. You also have both an SSL endpoint and a regular HTTP endpoint.

image

Simply hit F5 now to debug. The default configuration here will bring up the SSL endpoint. Let’s explore what happens below.

Web Browser, Please Protect Me!

Once you’ve started to debug, you won’t see your app directly, but rather you’ll see something like the following:

image

This is because of this entry in Web.config:

<system.web>
  <authorization>
    <deny users="?" />
  </authorization>
</system.web>

This says, in a nutshell, only allow authenticated users access to my site, and if they are not authenticated already, send them to the configured AAD login screen.

(It is possible to selectively disable this for certain pages or areas, but we won’t cover that here. You can see an example in your web.config that uses the location element; a sketch of the idea follows.)
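For instance, a location element along these lines (the path here is hypothetical) opens one path to anonymous users while the rest of the site stays locked down:

<location path="Public">
  <system.web>
    <authorization>
      <allow users="*" />
    </authorization>
  </system.web>
</location>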

Also note that the login screen is using SSL. After logging in, we stay on SSL, and get the following warning:

image

Click the “Continue to this website (not recommended).” link and you get your application page, but without the trusty padlock:

image

What does this mean – SSL without the padlock? It means your data is cryptographically secure on the wire (safe from snooping, because the channel is encrypted), but you are sending your data to a web site whose identity has not been independently verified.

The experience with Chrome and Firefox is similar:

Warning from Chrome – “The site’s security certificate is not trusted!”

Hit F5 from Visual Studio if Chrome is your default browser (or type the appropriate URL into Chrome while debugging from Visual Studio).

image

Warning from Firefox – “This Connection is Untrusted”

Hit F5 from Visual Studio if Firefox is your default browser (or type the appropriate URL into Firefox while debugging from Visual Studio).

image

Why Getting Rid of SSL Warnings is OKAYish Here

Before we get rid of the warning, let’s cover a couple of basics.

We get rid of the SSL warnings by telling Windows to trust the IIS Express certificate. In general, this is a Bad Idea, but in this narrow case it ought to be fine. Here’s the logic:

  1. Your IIS Express certificate is unique to your machine
  2. It only honors ports starting at 44300 (up to, I think, 44399)
  3. You can undo this
  4. You are a developer and Know What You Are Doing
  5. You would NEVER do this on an internet-facing production machine

We’ll use Internet Explorer to make the fix. Since IE and Chrome both rely on the same underlying Certificate Store on Windows, you only need to do this ONCE (in IE in our case) and Chrome will automatically trust the certificate for SSL as well. (Firefox maintains its own certificate store, so you may still need to add an exception there separately.)

Getting Rid of SSL Warnings for *all* Browsers, Courtesy of IE

Here’s the core of the tip in this article. It starts at the point after you’ve hit F5 in Visual Studio, and assumes IE is configured as the default browser (if not, simply load the page into IE before proceeding).

image

Simply click on Certificate error and you’ll see this popup:

image

Click on View certificates.

image

Click on Install Certificate.

image

Click Next (Current User is the desired location).

image

Click Place all certificates in the following store and click Browse:

image

Choose Trusted Root Certification Authorities. Click OK.

image

Click Next.

image

Click Finish.

There will be a Security Warning:

image

Now read it. If you are cool with it, click Yes.

Now if you run certmgr.msc again, you can see the new entry:

image

Undoing the Fix

To remove it again, simply select it, as shown above, and hit the DELETE key. You’ll get a couple of warnings:

image

Click Yes.

Back to normal.

[This is part of a series of posts on #StupidAzureTricks, explained here.]

Stupid Azure Trick #8 – Take control of Management Certificate names

Examine your Windows Azure MANAGEMENT CERTIFICATES in the Windows Azure Portal (under “SETTINGS” in the left nav, then “MANAGEMENT CERTIFICATES” in the top nav). These are the certificates that control which people or which machines can programmatically manipulate your Windows Azure resources through the Service Management API.

Every time you initiate a Publish Profile file download (whether through the portal, with PowerShell, or through the CLI), a new certificate is generated and added to your list of management certificates. You cannot control these names – they are generated.

Upon examination, you may find that some certificates – like #1 shown below – have generated names. And also look at the several certificates immediately below #1 – they have similar names – also generated. These are hard to distinguish from each other.

SNAGHTML2f43f75e

But this is okay some of the time – it is convenient to let tools create these certificates for you since it saves time. It may be perfectly adequate on low security accounts – perhaps a developer’s individual dev-test account from MSDN, or an account only used to give demos with. But for a team account running production, you probably don’t want it to have 17 untraceable, indistinguishable certificates hanging off it.

Now look at the names for #2 and 3 shown above. They are custom names.

Managing Your Management Certificates Starts with Meaningful Names

While we can debate whether the custom names shown above are truly meaningful (this is a demo account), you can probably appreciate that seeing a certificate name like “BUILD SERVER” or “Person/Machine” (e.g., “Maura/DRAGNIPUR”) or “Foobar Contractor Agency” might be more useful than “Azdem123EIEIO” to a human.

Controlling Certificate Names

The Windows Azure Management Portal has some heuristics for deciding what to display for a certificate’s name, but the first one it considers is the Common Name, and will display its value if present. So the short answer: take control of the Common Name.

Here we show creating a Service Management certificate manually in two steps – first the PEM (for use locally) and second deriving a CER (for uploading to the portal).

openssl req -x509 -nodes -days 365 -newkey rsa:1024 -keyout mycert.pem -out mycert.pem -subj "/CN=This Name Shows in the Portal"
openssl x509 -inform pem -in mycert.pem -outform der -out mycert.cer

Note the use of -subj "/CN=This Name Shows in the Portal" when generating a PEM in the first command. The specified text will appear as the description for this certificate within the Windows Azure Portal. OpenSSL is available on Linux and Mac systems by default. For Windows, you can install it directly, or – if you happen to use GitHub for Windows – it gets installed along with it.

For a pure Windows solution, use makecert to create a Management Certificate for Windows Azure.
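The makecert equivalent is documented in the resource linked below; it looks something along these lines (the Common Name text is again what shows in the portal):

makecert -sky exchange -r -n "CN=This Name Shows in the Portal" -pe -a sha1 -len 2048 -ss My "mycert.cer"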

Considerations

Once you assume responsibility for naming your own certificates, you are simultaneously taking on generating them, deploying the certificates containing the private keys to the machines from which your Windows Azure resources will be managed using the Service Management API, and uploading the CER public keys to the portal. To make some parts of this easier – especially if you are distributing to a team – consider building your own publish settings file. Also, realize the same certificate can be used by more than one client, and it can also be applied to more than one subscription on Windows Azure; it’s a many-to-many relationship that’s allowed.
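For orientation, a publish settings file is just XML, shaped roughly like this (all values below are placeholders):

<?xml version="1.0" encoding="utf-8"?>
<PublishData>
  <PublishProfile
    PublishMethod="AzureServiceManagementAPI"
    Url="https://management.core.windows.net/"
    ManagementCertificate="BASE64-ENCODED-PFX-BYTES">
    <Subscription Id="SUBSCRIPTION-GUID" Name="My Subscription" />
  </PublishProfile>
</PublishData>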

Resources

Create and Upload a Management Certificate for Windows Azure

X.509 Certificates

Build your own Publish Settings File

[This is part of a series of posts on #StupidAzureTricks, explained here.]

Stupid Azure Trick #7 – Use Windows Azure’s Local Storage Emulator with Web Sites & VMs

[Ugh – editing 2nd week in a row after accidental early publishing.]

The original programming model for Windows Azure applications was to use Cloud Services (originally known as Hosted Services, but still the same thing). Of particular note, Cloud Services run on VMs with disks that are not persistent – you can write data locally (some pointers here), but any locally stored data is not guaranteed to stick around. This is a powerful model for some scenarios, especially highly scalable applications. Another feature of Cloud Services has always been that it comes with an emulator you can run locally – “on your laptop at 30,000 feet” was a common way to hammer home the point. (Remember, Cloud Services were announced in 2008 – a long time before we had wifi on airplanes!) There are actually two emulators: Compute – which emulates the Cloud Service model by supporting Web Role and Worker Role abstractions – and Storage – which emulates the Blob, Table, and [Storage] Queue Services. The rest of this post will focus specifically on the Storage Emulator.

waws-wavm-wacs-venn

Since their announcement in 2012, Windows Azure Web Sites and Virtual Machines have been taking on many of the common workloads that used to require Cloud Services. This diagram should at least conceptually capture the sense that the when-to-use-which-model decision has become blurred over time. This is good – with more choice comes the freedom to get started more simply – often a Virtual Machine is an easier onramp for existing apps, and a Web Site can be a great onramp for a website that adheres to one of the well-known programming stacks running on PHP, ASP.NET, Python, or Node.js. If you are a big success, consider upgrading to Cloud Services.

Notably absent from the diagram is the Storage Emulator. It should be in the middle of the diagram because while the local storage emulator is still useful for Cloud Services, you can also use it locally when developing applications targeting Windows Azure Web Sites or Virtual Machines.

This is awesome – of course, it will be popular to create applications destined for Windows Azure Web Sites or Virtual Machines that take advantage of the various Storage Services.

So that’s the trick – be sure to take advantage of the Storage Emulator, even when you are not targeting a Cloud Service. You need to know two things: how to turn it on, and how to address it.

Turning on the Storage Emulator

If you create a regular old Web Site and run it in Visual Studio, the Storage Emulator is not turned on. Visual Studio only turns on the Storage Emulator for you when you debug a Cloud Service project, which doesn’t help here.

It is easy to turn on. I have a whole post that explains how to start the storage emulator from a shortcut, but the keys are:

  1. Find csrun.exe — In my case: “C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe” 
  2. Run csrun.exe with the parameter /devstore:start which indicates to start up the Storage Emulator.
  3. Done. Of course you might want this in a bat file or as a PowerShell function.

Here’s a PowerShell script that will turn it on – a minimal sketch; adjust the csrun.exe path if your SDK version installed it elsewhere:
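# Start the Windows Azure Storage Emulator.
$csrun = "C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe"
if (Test-Path $csrun) {
    & $csrun /devstore:start
}
else {
    Write-Warning "csrun.exe not found - is the Windows Azure SDK installed?"
}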

Addressing the Storage Emulator

The other part is knowing how to set up your Storage Connection String so that it accesses the local storage emulator instead of the cloud.

Here are the values to use to make it look like any other Storage Account, while still addressing local emulated storage rather than in the cloud:

Emulator Storage Account Name: devstoreaccount1
Emulator Storage Account Key: Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==
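In a connection string, the same values can be expressed either with the built-in emulator shortcut or spelled out against the emulator’s well-known local endpoints:

UseDevelopmentStorage=true

DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;TableEndpoint=http://127.0.0.1:10002/devstoreaccount1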

Resources

The latest version of the Windows Azure Storage Emulator (v2.2.1) is in Preview. This release has support for “2013-08-15” version of Storage which adds CORS and JSON and still has all those features from years gone by…

A comparison of emulated and cloud storage services is also available. There are some differences.

[This is part of a series of posts on #StupidAzureTricks, explained here.]

Stupid Azure Trick #6 – A CORS Toggler Command-line Tool for Windows Azure Blobs

[Edit: I originally accidentally published an old draft. The draft went out to all email subscribers and was public for around 90 minutes. Fixed now.]

In the most recent Stupid Azure Trick installment, I explained how one could host a 1000 visitor-per-day web site for one penny per month. Since then I also explained my choice to use CORS in that same application. Here I will dig into specifically using CORS with Windows Azure.

I also show how the curl command line tool can be helpful to examine CORS properties in HTTP headers for a blob service.

I also will briefly describe a simple tool I built that could quickly turn CORS on or off for a specified Blob service – the CORS Toggler. The CORS Toggler (in its current simple form) was useful to me because of two constraints that were true for my scenario:

  • I was only reading files from the Windows Azure Blob Service, and the pre-flight request doesn’t matter when you are just reading. Simplification #1.
  • I didn’t care that opening up would make the blob resource publicly available, rather than just available to my application. So the CORS policy was open to any caller (‘*’). Simplification #2.

These two simplifications mean that the toggler knew exactly what it meant to enable CORS (open up for reading to all comers) and to disable it. (Though it is worth noting that opening up CORS to any caller is probably a common scenario. Also worth noting that the tool could easily be extended to support a whitelist of allowed domains or other features.)

First, here’s the code for the toggler – there are three files here:

  1. Driver program (Console app in C#) – handles command line params and such and then calls into the …
  2. Code to perform simple CORS manipulation (C# class)
  3. The above two are driven (in my toggler) by the third file – a command line batch file – which passes in the storage key and storage account name for the service I was working with

One simple point to highlight – CORS properties are simply available on the Blob service object (and would be the same for the Table or Queue service within Storage):

image
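In case the screenshot above isn’t legible, here’s roughly what that wide-open read rule looks like with the .NET storage client – a sketch, not the toggler’s exact code, with placeholder credentials:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

class CorsTogglerSketch
{
    static void Main()
    {
        // Placeholder connection string - substitute your own account name and key.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=YOURACCOUNT;AccountKey=YOURKEY");
        CloudBlobClient blobClient = account.CreateCloudBlobClient();

        // CORS settings hang off the service-level properties object.
        ServiceProperties properties = blobClient.GetServiceProperties();

        // Enable: a single wide-open, read-only rule (Simplifications #1 and #2).
        properties.Cors.CorsRules.Clear();
        properties.Cors.CorsRules.Add(new CorsRule
        {
            AllowedOrigins = new[] { "*" },
            AllowedMethods = CorsHttpMethods.Get,
            AllowedHeaders = new[] { "*" },
            ExposedHeaders = new[] { "*" },
            MaxAgeInSeconds = 36000
        });
        blobClient.SetServiceProperties(properties);

        // Disabling is the same round-trip, with CorsRules cleared instead of added.
    }
}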

Yes, this is a very simple API.

Showing the Service Object Contents

For those interested in the contents of these objects, here are a few views of the properties (in code) before turning on CORS and after. (The object views are created using the technique I described in my post on using JSON.NET as an object dumper that’s Good Enough™.)

DUMPING OBJECT BEFORE CORS ENABLED (just CORS properties):

{"Logging":{"Version":"1.0","LoggingOperations":0,"RetentionDays":null},"Metrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"HourMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"Cors":{"CorsRules":[]},"MinuteMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"DefaultServiceVersion":null}

DUMPING OBJECT AFTER CORS ENABLED:

{"Logging":{"Version":"1.0","LoggingOperations":0,"RetentionDays":null},"Metrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"HourMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"Cors":{"CorsRules":[{"AllowedOrigins":["*"],"ExposedHeaders":["*"],"AllowedHeaders":["*"],"AllowedMethods":1,"MaxAgeInSeconds":36000}]},"MinuteMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"DefaultServiceVersion":null}

image

DUMPING OBJECT BEFORE CORS ENABLED (but including ALL properties):

Current Properties:
{"Logging":{"Version":"1.0","LoggingOperations":0,"RetentionDays":null},"Metrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"HourMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"Cors":{"CorsRules":[]},"MinuteMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"DefaultServiceVersion":null}

DUMPING OBJECT AFTER CORS ENABLED (but including ALL properties):

Current Properties:
{"Logging":{"Version":"1.0","LoggingOperations":0,"RetentionDays":null},"Metrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"HourMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"Cors":{"CorsRules":[{"AllowedOrigins":["*"],"ExposedHeaders":["*"],"AllowedHeaders":["*"],"AllowedMethods":1,"MaxAgeInSeconds":36000}]},"MinuteMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"DefaultServiceVersion":null}

image

Using ‘curl’ To Examine CORS Data:

image

CURL OUTPUT BEFORE CORS ENABLED:

D:\dev\github>curl -H "Origin: http://example.com" -H "Access-Control-Request-Method: GET" -H "Access-Control-Request-Headers: X-Requested-With" -X OPTIONS --verbose http://azuremap.blob.core.windows.net/maps/azuremap.geojson

* Adding handle: conn: 0x805fa8
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* - Conn 0 (0x805fa8) send_pipe: 1, recv_pipe: 0
* About to connect() to azuremap.blob.core.windows.net port 80 (#0)
*   Trying 168.62.32.206…
* Connected to azuremap.blob.core.windows.net (168.62.32.206) port 80 (#0)
> OPTIONS /maps/azuremap.geojson HTTP/1.1
> User-Agent: curl/7.31.0
> Host: azuremap.blob.core.windows.net
> Accept: */*
> Origin: http://example.com
> Access-Control-Request-Method: GET
> Access-Control-Request-Headers: X-Requested-With
>
< HTTP/1.1 403 CORS not enabled or no matching rule found for this request.
< Content-Length: 316
< Content-Type: application/xml
* Server Blob Service Version 1.0 Microsoft-HTTPAPI/2.0 is not blacklisted
< Server: Blob Service Version 1.0 Microsoft-HTTPAPI/2.0
< x-ms-request-id: 04402242-d4a7-4d0c-bedc-ff553a1bc982
< Date: Sun, 26 Jan 2014 15:08:11 GMT
<
<?xml version="1.0" encoding="utf-8"?><Error><Code>CorsPreflightFailure</Code><Message>CORS not enabled or no matching rule found for this request.
RequestId:04402242-d4a7-4d0c-bedc-ff553a1bc982
Time:2014-01-26T15:08:12.0193649Z</Message><MessageDetails>No CORS rules matches this request</MessageDetails></Error>
* Connection #0 to host azuremap.blob.core.windows.net left intact

CURL OUTPUT AFTER CORS ENABLED:

D:\dev\github>curl -H "Origin: http://example.com" -H "Access-Control-Request-Method: GET" -H "Access-Control-Request-Headers: X-Requested-With" -X OPTIONS --verbose http://azuremap.blob.core.windows.net/maps/azuremap.geojson
* Adding handle: conn: 0x1f55fa8
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* - Conn 0 (0x1f55fa8) send_pipe: 1, recv_pipe: 0
* About to connect() to azuremap.blob.core.windows.net port 80 (#0)
*   Trying 168.62.32.206…
* Connected to azuremap.blob.core.windows.net (168.62.32.206) port 80 (#0)
> OPTIONS /maps/azuremap.geojson HTTP/1.1
> User-Agent: curl/7.31.0
> Host: azuremap.blob.core.windows.net
> Accept: */*
> Origin: http://example.com
> Access-Control-Request-Method: GET
> Access-Control-Request-Headers: X-Requested-With
>
< HTTP/1.1 200 OK
< Transfer-Encoding: chunked
* Server Blob Service Version 1.0 Microsoft-HTTPAPI/2.0 is not blacklisted
< Server: Blob Service Version 1.0 Microsoft-HTTPAPI/2.0
< x-ms-request-id: d4df8953-f8ae-441b-89fe-b69232579aa4
< Access-Control-Allow-Origin: http://example.com
< Access-Control-Allow-Methods: GET
< Access-Control-Allow-Headers: X-Requested-With
< Access-Control-Max-Age: 36000
< Access-Control-Allow-Credentials: true
< Date: Sun, 26 Jan 2014 16:02:25 GMT
<
* Connection #0 to host azuremap.blob.core.windows.net left intact

Resources

A new version of the Windows Azure Storage Emulator (v2.2.1) is now in Preview. This release has support for “2013-08-15” version of Storage which includes CORS (and JSON and other) support.

Overall description of Azure Storage’s CORS Support:

http://msdn.microsoft.com/en-us/library/windowsazure/dn535601.aspx

REST API doc (usually the canonical doc for any feature, though in code it is easily accessed with the Windows Azure SDK for .NET)

http://msdn.microsoft.com/en-us/library/windowsazure/hh452235.aspx

A couple of excellent posts from the community on CORS support in Windows Azure Storage:

[This is part of a series of posts on #StupidAzureTricks, explained here.]

Stupid Azure Trick #5 – Got a Penny? Run a Simple Web Site 100% on Blob Storage for a Month – Cost Analysis Provided

Suppose you have a simple static web site you want to publish, but your budget is small. You could do this with Windows Azure Storage as a set of blobs. The “simple static” qualifier rules out ASP.NET and PHP and Node.js – and anything that does server-side processing before serving up a page. But that still leaves a lot of scenarios – and does not preclude the site from being interactive or loading external data using AJAX and behaving like it is dynamic. This one does.

Check out the web site at http://azuremap.blob.core.windows.net/apps/bingmap-geojson-display.html.

image

You may recognize the map from an earlier post that showed how one could visualize Windows Azure Data Center Regions on a map. It should look familiar because this web site uses the exact same underlying GeoJSON data used earlier, except this time the map implementation is completely different. This version has JavaScript code that loads and parses the raw GeoJSON data and renders it dynamically by populating a Bing Maps viewer control (which is also in JavaScript).

But the neat part is there’s only JavaScript behind the scenes. All of the site’s assets are loaded directly from Windows Azure Blob Storage (plus Bing Maps control from an external location).

Here’s the simple breakdown. There is the main HTML page (the URL specifies that directly), and that in turn loads the following four JavaScript files:

  1. http://ecn.dev.virtualearth.net/mapcontrol/mapcontrol.ashx?v=7.0 – version 7.0 of the Bing Map control
  2. httpGetString.js – general-purpose data fetcher (used to pull in the GeoJSON data)
  3. geojson-parse.js – application-specific to parse the GeoJSON data
  4. bingmap-geojson-display.js – application-specific logic to put elements from the GeoJSON file onto the Bing Map
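Presumably the main HTML page pulls these in with ordinary script tags, something like the following (a sketch – the actual page may differ):

<script src="http://ecn.dev.virtualearth.net/mapcontrol/mapcontrol.ashx?v=7.0"></script>
<script src="httpGetString.js"></script>
<script src="geojson-parse.js"></script>
<script src="bingmap-geojson-display.js"></script>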

I have not tried this to prove the point, but I think that to render on, say, Google Maps, the only JavaScript that would need to change would be bingmap-geojson-display.js (presumably replaced by googlemap-geojson-display.js).

Notice that the GeoJSON data lives in a different Blob Storage Container here: http://azuremap.blob.core.windows.net/maps/azuremap.geojson. We’ll get into the details in another post, but in order for this to work – in order for …/apps/bingmap-geojson-display.html to directly load a JSON data file from …/maps/azuremap.geojson – we enabled CORS for the Blob Service within the host Windows Azure Storage account.

Cost Analysis

Hosting a very low-cost (and low-complexity) web site as a few blobs is really handy. It is very scalable and robust. Blob Storage costs come from three sources:

  1. cost of data at rest – for this scenario, probably Block Blobs and Locally Redundant Storage would be appropriate, and the cost there is $0.068 per GB / month (details)
  2. storage transactions – $0.005 per 100,000 transactions (details – same as above link, but look lower on the page) – where a storage transaction is (loosely speaking) a file read or write operation
  3. outbound data transfers (data leaving the data center) – first 5 GB / month is free, then there’s a per GB cost (details)

The azuremap web site shown earlier weighs in at under 18 KB and is spread across 5 files (1 .html, 3 .js, 1 .geojson). If we assume a healthy 1000 hits a day on our site, here’s the math.

  • We have around 1000 x 31 = 31,000 visits per month.
  • Cost of data at rest would be 18 KB x $0.068 / GB = effectively $0. Since storage starts at less than 7 cents per GB and our data is 5 orders of magnitude smaller, the cost is too small to meaningfully measure.
  • Storage transactions would be 31,000 x 5 (one per file in our case) x $0.005 / 100,000 = $0.00775, or a little more than 3/4 of a penny in US currency per month, around 9 cents per year, or $1 every 11 years.
  • Outbound data transfer total would be 31,000 x 18 KB = 560 MB, which is around 1/10th of the amount allowed for free, so there’d be no charge for that.

So our monthly bill would be less than 1 penny (less than US$0.01).

This is also a good (though very simple) example of the sort of cost analysis you will need to do when understanding what it takes to create cloud applications or migrate from on-premises to the cloud. The Windows Azure Calculator and information on lower-cost commitment plans may also prove handy.

Alternative Approaches

Of course in this day and age, for a low-cost simple site it is hard to beat Windows Azure Web Sites. There’s an entirely free tier there (details) – allowing you to save yourself nearly a penny every month. That’s pretty good, since Benjamin Franklin, one of America’s founding fathers, famously quipped “A penny saved is a penny earned!”

Windows Azure Web Sites also has other features – your site can be in PHP or ASP.NET or Node.js or Python. And you can get continuous deployment from GitHub or Bitbucket or TFS or Dropbox or others. And you get monitoring and other features from the portal. And more.

But at least you know you can host in blob storage if you like.

[This is part of a series of posts on #StupidAzureTricks, explained here.]

Stupid Azure Trick #4 – C#, Node.js, and Python side-by-side – Three Simple Command Line Tools to Copy Files up to Windows Azure Blob Storage

Windows Azure has a cloud file storage service known as Blob Storage.

[Note: Windows Azure Storage is broader than just Blob Storage, but in this post I will ignore its sister services Table Storage (a NoSQL key/value store) and Queues (a reliable queuing service).]

Before we get into the tricks, it is useful to know a bit about Blob Storage.

The code below is very simple – it uploads a couple of files to Blob Storage. The files being uploaded are JSON, so it includes proper setting of the HTTP content-type and sets up caching. Then it lists a directory of the files up in that particular Blob Storage container (where a container is like a folder or subdirectory in a regular file system).

The code listed below will work nicely on a Windows Azure Dev-Test VM, or on your own desktop. Of course you need a Windows Azure Storage Account first, and the storage credentials. (New to Azure? Click here to access a free trial.) But once you do, the coding is straight-forward.

  • For C#: create a Windows Console application and add the NuGet packaged named “Windows Azure Storage”
  • For Node.js: run “npm install azure” (or “npm install azure --global”)
  • For Python: run “pip install azure” to get the SDK
  • We don’t cover it here, but you could also use PowerShell or the CLI or the REST API directly.

Note: these are command line tools, so there isn’t a web project with config values for the storage keys. In lieu of that I used a text file on the file system. Storage credentials should be stored safely, regardless of which computer they are used on; be aware that my demonstration only uses public data, so my storage credentials in this case may not be as damaging, if lost, as some others would be.

Here’s the code – or at least a minimal C# sketch of the upload-and-list flow (the Node.js and Python versions follow the same shape; the credentials file path and its one-line connection-string format are assumptions for illustration). Enjoy!
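using System;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class UploadJsonBlobs
{
    static void Main(string[] args)
    {
        // Read a storage connection string from a text file on disk
        // (a stand-in for web.config-style settings; the path is hypothetical).
        string connectionString = File.ReadAllText(@"C:\secrets\storage-connection-string.txt").Trim();

        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("maps");
        container.CreateIfNotExists();

        // Each command line argument is a local JSON file to upload.
        foreach (string path in args)
        {
            CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(path));
            using (FileStream stream = File.OpenRead(path))
            {
                blob.UploadFromStream(stream);
            }

            // Proper content type for JSON, plus an hour of caching (illustrative policy).
            blob.Properties.ContentType = "application/json";
            blob.Properties.CacheControl = "public, max-age=3600";
            blob.SetProperties();
        }

        // "Directory listing" of the files now in the container.
        foreach (IListBlobItem item in container.ListBlobs())
        {
            Console.WriteLine(item.Uri);
        }
    }
}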

Useful Links

Python

http://research.microsoft.com/en-us/projects/azure/an-intro-to-using-python-with-windows-azure.pdf

http://research.microsoft.com/en-us/projects/azure/windows-azure-for-linux-and-mac-users.pdf

http://www.windowsazure.com/en-us/develop/python/

SDK Source for Python: https://github.com/WindowsAzure/azure-sdk-for-python

Node.js

http://www.windowsazure.com/en-us/develop/nodejs/

SDK Source for Node.js: https://github.com/WindowsAzure/azure-sdk-for-node

http://www.windowsazure.com/en-us/documentation/articles/storage-nodejs-how-to-use-blob-storage/

C#/.NET

http://www.windowsazure.com/en-us/develop/net/

Storage SDK Source for .NET: https://github.com/WindowsAzure/azure-storage-net

Storage Client Library 3: http://msdn.microsoft.com/en-us/library/dn495001%28v=azure.10%29.aspx

[This is part of a series of posts on #StupidAzureTricks, explained here.]