
Stupid Azure Trick #7 – Use Windows Azure’s Local Storage Emulator with Web Sites & VMs

[Ugh – editing 2nd week in a row after accidental early publishing.]

The original programming model for Windows Azure applications was Cloud Services (originally known as Hosted Services, but still the same thing). Of particular note, Cloud Services run on VMs with non-persistent disks – you can write data locally (some pointers here), but any locally stored data is not guaranteed to stick around. This is a powerful model for some scenarios, especially highly scalable applications. Another feature of Cloud Services has always been the emulator you can run locally – "on your laptop at 30,000 feet" was a common way to hammer home the point. (Remember, Cloud Services were announced in 2008 – a long time before we had wifi on airplanes!) There are actually two emulators: Compute – which emulates the Cloud Service model by supporting the Web Role and Worker Role abstractions – and Storage – which emulates the Blob, Table, and [Storage] Queue Services. The rest of this post will focus specifically on the Storage Emulator.

[Figure: Venn diagram of Windows Azure Web Sites, Virtual Machines, and Cloud Services]

Since their announcement in 2012, Windows Azure Web Sites and Virtual Machines have been taking on many of the common workloads that used to require Cloud Services. The diagram should at least convey the sense that the when-to-use-which-model decision has blurred over time. This is good – with more choice comes the freedom to get started more simply: a Virtual Machine is often an easier onramp for existing apps, and a Web Site can be a great onramp for a website that adheres to one of the well-known programming stacks running on PHP, ASP.NET, Python, or Node.js. If you become a big success, consider upgrading to Cloud Services.

Notably absent from the diagram is the Storage Emulator. It should be in the middle of the diagram because while the local storage emulator is still useful for Cloud Services, you can also use it locally when developing applications targeting Windows Azure Web Sites or Virtual Machines.

This is awesome – it will surely be popular to create applications destined for Windows Azure Web Sites or Virtual Machines that take advantage of the various Storage Services.

So that’s the trick – be sure to take advantage of the Storage Emulator, even when you are not targeting a Cloud Service. You need to know two things: how to turn it on, and how to address it.

Turning on the Storage Emulator

If you create a regular old Web Site and run it in Visual Studio, the Storage Emulator is not turned on. Visual Studio only starts the Storage Emulator for you when you debug a Cloud Service, which is not always convenient.

It is easy to turn on. I have a whole post that explains how to start the Storage Emulator from a shortcut, but the key steps are:

  1. Find csrun.exe – in my case: “C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe”
  2. Run csrun.exe with the parameter /devstore:start, which starts the Storage Emulator.
  3. Done. Of course you might want this in a .bat file or as a PowerShell function.

Here’s a PowerShell script that will turn it on:

& 'C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe' /devstore:start
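Or, if you'd rather start the emulator from code – say, at the top of an integration test run – here's a minimal C# sketch of the same idea (same csrun.exe path assumption as above):

using System.Diagnostics;

internal class StartStorageEmulator
{
    private static void Main()
    {
        // Same path assumption as the PowerShell one-liner above - adjust for your SDK install.
        var csrun = @"C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe";
        using (var process = Process.Start(csrun, "/devstore:start")) // start the Storage Emulator
        {
            process.WaitForExit(); // returns once the emulator is up (or was already running)
        }
    }
}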

Addressing the Storage Emulator

The other part is knowing how to set up your storage connection string so that it accesses the local Storage Emulator instead of the cloud.

Here are the values to use to make the emulator look like any other Storage Account, while still addressing local emulated storage rather than the cloud:

Emulator Storage Account Name: devstoreaccount1
Emulator Storage Account Key: Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==
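If you are using the .NET storage client library, you don't even need to paste those values in – the special connection string UseDevelopmentStorage=true (or the equivalent CloudStorageAccount.DevelopmentStorageAccount shortcut) points at the emulator's well-known local endpoints. A minimal sketch:

using System;
using Microsoft.WindowsAzure.Storage;

internal class EmulatorAddressing
{
    private static void Main()
    {
        // Equivalent to CloudStorageAccount.Parse("UseDevelopmentStorage=true"),
        // which is what you'd put in app.config/web.config.
        var account = CloudStorageAccount.DevelopmentStorageAccount;

        // Note the emulator's addressing style: the account name is part of the
        // URL path, not the hostname as it is in the cloud.
        Console.WriteLine(account.BlobEndpoint);  // http://127.0.0.1:10000/devstoreaccount1
        Console.WriteLine(account.QueueEndpoint); // http://127.0.0.1:10001/devstoreaccount1
        Console.WriteLine(account.TableEndpoint); // http://127.0.0.1:10002/devstoreaccount1

        var blobClient = account.CreateCloudBlobClient(); // ready to use against local storage
    }
}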

Resources

The latest version of the Windows Azure Storage Emulator (v2.2.1) is in Preview. This release supports the “2013-08-15” version of Storage – which adds CORS and JSON – and still has all those features from years gone by…

A comparison of emulated and cloud storage services is also available. There are some differences.

[This is part of a series of posts on #StupidAzureTricks, explained here.]


Windows Azure Japan Data Center Regions now in Production, AzureMap updated

Two more Windows Azure data center regions have moved into production – in Japan this time.

There was a press release, but in Japanese. You can either learn Japanese or read the Bing Translator or Google Translate versions.

I updated the JSON metadata in the Azure Map project to move these data center regions into “Production” status, then re-ran my script to regenerate and upload the GeoJSON and TopoJSON maps. All data is in GitHub. For more info, see these two posts:

Stupid Azure Trick #6 – A CORS Toggler Command-line Tool for Windows Azure Blobs

[Edit: I originally accidentally published an old draft. The draft went out to all email subscribers and was public for around 90 minutes. Fixed now.]

In the most recent Stupid Azure Trick installment, I explained how one could host a 1000 visitor-per-day web site for one penny per month. Since then I also explained my choice to use CORS in that same application. Here I will dig into specifically using CORS with Windows Azure.

I'll also show how the curl command-line tool can be helpful for examining CORS properties in the HTTP headers of a Blob service.

I'll also briefly describe a simple tool I built that can quickly turn CORS on or off for a specified Blob service – the CORS Toggler. The CORS Toggler (in its current simple form) was useful to me because of two constraints that held in my scenario:

  • I was only reading files from the Windows Azure Blob Service, and pre-flight requests don’t matter for simple GET reads. Simplification #1.
  • I didn’t care that the blob resource would be publicly available, rather than just available to my application. So the CORS policy could be open to any caller (‘*’). Simplification #2.

These two simplifications mean the toggler knows exactly what it means to enable CORS (open up for reading to all comers) and to disable it. (Though it is worth noting that opening up CORS to any caller is probably a common scenario. Also worth noting that the tool could easily be extended to support a whitelist of allowed domains or other features – see the sketch below.)
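For instance, here's a hedged sketch of what a whitelist variant might look like – the same CorsRule type used in the real code below, with hypothetical origins standing in for your allowed domains:

using Microsoft.WindowsAzure.Storage.Shared.Protocol;

internal static class CorsWhitelistSketch
{
    // Same shape as the any-origin rule in SetBlobCorsGetAnyOrigin (below),
    // but scoped to an explicit list of origins instead of "*".
    public static CorsRule MakeWhitelistRule()
    {
        var rule = new CorsRule();
        rule.AllowedOrigins.Add("http://example.com");      // hypothetical allowed origin
        rule.AllowedOrigins.Add("https://app.example.com"); // hypothetical allowed origin
        rule.AllowedMethods = CorsHttpMethods.Get;          // still GET-only
        rule.MaxAgeInSeconds = 3600;                        // clients may cache the preflight answer for an hour
        return rule;
    }
}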

First, here’s the code for the toggler – there are three files here:

  1. Driver program (console app in C#) – handles command-line params and such and then calls into the…
  2. Code to perform simple CORS manipulation (C# class)
  3. The above two are driven (in my quick toggler workflow) by the third file – a command-line batch file that passes in the storage account name and key for the service I was working with:
@echo off
rem Toggle (or, with -q, just query) CORS on the azuremap storage account,
rem passing through the command-line flag (%1).
D:\dev\path\to\ToggleCors.exe azuremap 123abcYourStorageKeyIsHere987zyx== %1
echo.
echo FOR COMPARISON QUERY THE nocors SERVICE (which never has CORS set)
echo.
D:\dev\path\to\ToggleCors.exe nocors 123abcYourStorageKeyIsHere987zyx== -q
echo.
rem timestamp the run
PowerShell -Command Get-Date
using System;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;
using Newtonsoft.Json;

namespace BlobCorsToggle
{
    public class SimpleAzureBlobCorsSetter
    {
        public CloudBlobClient CloudBlobClient { get; private set; }

        public SimpleAzureBlobCorsSetter(Uri blobServiceUri, StorageCredentials storageCredentials)
        {
            this.CloudBlobClient = new CloudBlobClient(blobServiceUri, storageCredentials);
        }

        public SimpleAzureBlobCorsSetter(CloudBlobClient blobClient)
        {
            this.CloudBlobClient = blobClient;
        }

        /// <summary>
        /// Set Blob Service CORS settings for specified Windows Azure Storage Account.
        /// Either sets to a hard-coded set of values (see below) or clears all CORS settings.
        ///
        /// Does not check for any existing CORS settings, but clobbers with the CORS settings
        /// to allow HTTP GET access from any origin. Non-CORS settings are left intact.
        ///
        /// Most useful for scenarios where a file is published in Blob Storage for read-access
        /// by any client.
        ///
        /// Can also be useful in conjunction with Valet Key Pattern-style limited access, as
        /// might be useful with a mobile application.
        /// </summary>
        /// <param name="clear">if true, clears all CORS settings, else allows GET from any origin</param>
        public void SetBlobCorsGetAnyOrigin(bool clear)
        {
            // http://msdn.microsoft.com/en-us/library/windowsazure/dn535601.aspx
            var corsGetAnyOriginRule = new CorsRule();
            corsGetAnyOriginRule.AllowedOrigins.Add("*"); // allow access to any client
            corsGetAnyOriginRule.AllowedMethods = CorsHttpMethods.Get; // only CORS-enable http GET
            corsGetAnyOriginRule.ExposedHeaders.Add("*"); // let client see any header we've configured
            corsGetAnyOriginRule.AllowedHeaders.Add("*"); // let clients request any header they can think of
            corsGetAnyOriginRule.MaxAgeInSeconds = (int)TimeSpan.FromHours(10).TotalSeconds; // clients are safe to cache CORS config for up to this long

            var blobServiceProperties = this.CloudBlobClient.GetServiceProperties();
            if (clear)
            {
                blobServiceProperties.Cors.CorsRules.Clear();
            }
            else
            {
                blobServiceProperties.Cors.CorsRules.Clear(); // replace current property set
                blobServiceProperties.Cors.CorsRules.Add(corsGetAnyOriginRule);
            }
            this.CloudBlobClient.SetServiceProperties(blobServiceProperties);
        }

        public void DumpCurrentProperties()
        {
            var blobServiceProperties = this.CloudBlobClient.GetServiceProperties();
            var blobPropertiesStringified = StringifyProperties(blobServiceProperties);
            Console.WriteLine("Current Properties:\n{0}", blobPropertiesStringified);
        }

        internal string StringifyProperties(ServiceProperties serviceProperties)
        {
            // JsonConvert.SerializeObject(serviceProperties) for whole object graph or
            // JsonConvert.SerializeObject(serviceProperties.Cors) for just CORS
            return Newtonsoft.Json.JsonConvert.SerializeObject(serviceProperties, Formatting.Indented);
        }
    }
}
using System;
using System.Diagnostics;
using System.IO;
using Microsoft.WindowsAzure.Storage.Auth;

namespace BlobCorsToggle
{
    class Program
    {
        static string clearFlag = "-clear";
        static string queryFlag = "-q";

        static void Main(string[] args)
        {
            if (args.Length == 0 || !ValidCommandArguments(args))
            {
                #region Show Correct Usage
                if (args.Length != 0 && !ValidCommandArguments(args))
                {
                    var saveForegroundColor = Console.ForegroundColor;
                    Console.ForegroundColor = ConsoleColor.Red;
                    Console.Write("\nINVALID COMMAND LINE OR UNKNOWN FLAGS: ");
                    foreach (var arg in args) Console.Write("{0} ", arg);
                    Console.WriteLine();
                    Console.ForegroundColor = saveForegroundColor;
                }
                Console.WriteLine("usage:\n{0} <storage acct name> <storage acct key> [{1}]",
                    Path.GetFileNameWithoutExtension(Environment.GetCommandLineArgs()[0]),
                    clearFlag);
                Console.WriteLine();
                Console.WriteLine("example setting:\n{0} {1} {2}",
                    Path.GetFileNameWithoutExtension(Environment.GetCommandLineArgs()[0]),
                    "mystorageacct",
                    "lala+aou812SomERAndOmStrINglOlLolFroMmYStORaG3akounT314159265358979323ilIkEpi==");
                Console.WriteLine();
                Console.WriteLine("example clearing:\n{0} {1} {2} {3}",
                    Path.GetFileNameWithoutExtension(Environment.GetCommandLineArgs()[0]),
                    "mystorageacct",
                    "lala+aou812SomERAndOmStrINglOlLolFroMmYStORaG3akounT314159265358979323ilIkEpi==",
                    clearFlag);
                if (Debugger.IsAttached)
                {
                    Console.WriteLine("\nPress any key to exit.");
                    Console.ReadKey();
                }
                #endregion
            }
            else
            {
                var storageAccountName = args[0];
                var storageKey = args[1];
                var blobServiceUri = new Uri(String.Format("https://{0}.blob.core.windows.net", storageAccountName));
                var storageCredentials = new StorageCredentials(storageAccountName, storageKey);
                var blobConfig = new SimpleAzureBlobCorsSetter(blobServiceUri, storageCredentials);

                bool query = (args.Length == 3 && args[2] == queryFlag);
                if (query)
                {
                    blobConfig.DumpCurrentProperties();
                    return;
                }

                blobConfig.DumpCurrentProperties();
                Console.WriteLine();
                bool clear = (args.Length == 3 && args[2] == clearFlag);
                blobConfig.SetBlobCorsGetAnyOrigin(clear);
                Console.WriteLine("CORS Blob Properties for Storage Account {0} have been {1}.",
                    storageAccountName, clear ? "cleared" : "set");
                Console.WriteLine();
                blobConfig.DumpCurrentProperties();
            }
        }

        private static bool ValidCommandArguments(string[] args)
        {
            if (args.Length == 2) return true;
            if (args.Length == 3 && (args[2] == clearFlag || args[2] == queryFlag)) return true;
            return false;
        }
    }
}

One simple point to highlight – CORS properties are simply available on the Blob service object (and the same is true for the Table and Queue services within Storage). Yes, this is a very simple API.
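To make that concrete, here's a minimal sketch (assuming Storage Client Library 3.x, which added the 2013-08-15 service version) showing the same Cors property hanging off all three service clients:

using System;
using Microsoft.WindowsAzure.Storage;

internal class CorsPropertyTour
{
    private static void Main()
    {
        var account = CloudStorageAccount.DevelopmentStorageAccount; // or your cloud account

        // The same ServiceProperties.Cors collection is exposed by each service.
        var blobCors  = account.CreateCloudBlobClient().GetServiceProperties().Cors;
        var tableCors = account.CreateCloudTableClient().GetServiceProperties().Cors;
        var queueCors = account.CreateCloudQueueClient().GetServiceProperties().Cors;

        Console.WriteLine("Blob CORS rules:  {0}", blobCors.CorsRules.Count);
        Console.WriteLine("Table CORS rules: {0}", tableCors.CorsRules.Count);
        Console.WriteLine("Queue CORS rules: {0}", queueCors.CorsRules.Count);
    }
}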

Showing the Service Object Contents

For those interested in the contents of these objects, here are a few ways to show the property contents (in code) before turning on CORS and after. (The object views are created using the technique I described in my post on using JSON.NET as an object dumper that’s Good Enough™.)

DUMPING OBJECT BEFORE CORS ENABLED (just CORS properties):

{"Logging":{"Version":"1.0","LoggingOperations":0,"RetentionDays":null},"Metrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"HourMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"Cors":{"CorsRules":[]},"MinuteMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"DefaultServiceVersion":null}

DUMPING OBJECT AFTER CORS ENABLED:

{"Logging":{"Version":"1.0","LoggingOperations":0,"RetentionDays":null},"Metrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"HourMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"Cors":{"CorsRules":[{"AllowedOrigins":["*"],"ExposedHeaders":["*"],"AllowedHeaders":["*"],"AllowedMethods":1,"MaxAgeInSeconds":36000}]},"MinuteMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"DefaultServiceVersion":null}


DUMPING OBJECT BEFORE CORS ENABLED (but including ALL properties):

Current Properties:
{"Logging":{"Version":"1.0","LoggingOperations":0,"RetentionDays":null},"Metrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"HourMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"Cors":{"CorsRules":[]},"MinuteMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"DefaultServiceVersion":null}

DUMPING OBJECT AFTER CORS ENABLED (but including ALL properties):

Current Properties:
{"Logging":{"Version":"1.0","LoggingOperations":0,"RetentionDays":null},"Metrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"HourMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"Cors":{"CorsRules":[{"AllowedOrigins":["*"],"ExposedHeaders":["*"],"AllowedHeaders":["*"],"AllowedMethods":1,"MaxAgeInSeconds":36000}]},"MinuteMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"DefaultServiceVersion":null}


Using ‘curl’ To Examine CORS Data:

curl -H "Origin: http://example.com&quot; -H "Access-Control-Request-Method: GET" -H "Access-Control-Request-Headers: X-Requested-With" -X OPTIONS --verbose http://azuremap.blob.core.windows.net/maps/azuremap.geojson


CURL OUTPUT BEFORE CORS ENABLED:

D:\dev\github>curl -H "Origin: http://example.com" -H "Access-Control-Request-Method: GET" -H "Access-Control-Request-Headers: X-Requested-With" -X OPTIONS --verbose http://azuremap.blob.core.windows.net/maps/azuremap.geojson

* Adding handle: conn: 0x805fa8
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* – Conn 0 (0x805fa8) send_pipe: 1, recv_pipe: 0
* About to connect() to azuremap.blob.core.windows.net port 80 (#0)
*   Trying 168.62.32.206…
* Connected to azuremap.blob.core.windows.net (168.62.32.206) port 80 (#0)
> OPTIONS /maps/azuremap.geojson HTTP/1.1
> User-Agent: curl/7.31.0
> Host: azuremap.blob.core.windows.net
> Accept: */*
> Origin: http://example.com
> Access-Control-Request-Method: GET
> Access-Control-Request-Headers: X-Requested-With
>
< HTTP/1.1 403 CORS not enabled or no matching rule found for this request.
< Content-Length: 316
< Content-Type: application/xml
* Server Blob Service Version 1.0 Microsoft-HTTPAPI/2.0 is not blacklisted
< Server: Blob Service Version 1.0 Microsoft-HTTPAPI/2.0
< x-ms-request-id: 04402242-d4a7-4d0c-bedc-ff553a1bc982
< Date: Sun, 26 Jan 2014 15:08:11 GMT
<
<?xml version="1.0" encoding="utf-8"?><Error><Code>CorsPreflightFailure</Code><Message>CORS not enabled or no matching rule found for this request.
RequestId:04402242-d4a7-4d0c-bedc-ff553a1bc982
Time:2014-01-26T15:08:12.0193649Z</Message><MessageDetails>No CORS rules matches this request</MessageDetails></Error>
* Connection #0 to host azuremap.blob.core.windows.net left intact

CURL OUTPUT AFTER CORS ENABLED:

D:\dev\github>curl -H "Origin: http://example.com" -H "Access-Control-Request-Method: GET" -H "Access-Control-Request-Headers: X-Requested-With" -X OPTIONS --verbose http://azuremap.blob.core.windows.net/maps/azuremap.geojson
* Adding handle: conn: 0x1f55fa8
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* – Conn 0 (0x1f55fa8) send_pipe: 1, recv_pipe: 0
* About to connect() to azuremap.blob.core.windows.net port 80 (#0)
*   Trying 168.62.32.206…
* Connected to azuremap.blob.core.windows.net (168.62.32.206) port 80 (#0)
> OPTIONS /maps/azuremap.geojson HTTP/1.1
> User-Agent: curl/7.31.0
> Host: azuremap.blob.core.windows.net
> Accept: */*
> Origin: http://example.com
> Access-Control-Request-Method: GET
> Access-Control-Request-Headers: X-Requested-With
>
< HTTP/1.1 200 OK
< Transfer-Encoding: chunked
* Server Blob Service Version 1.0 Microsoft-HTTPAPI/2.0 is not blacklisted
< Server: Blob Service Version 1.0 Microsoft-HTTPAPI/2.0
< x-ms-request-id: d4df8953-f8ae-441b-89fe-b69232579aa4
< Access-Control-Allow-Origin: http://example.com
< Access-Control-Allow-Methods: GET
< Access-Control-Allow-Headers: X-Requested-With
< Access-Control-Max-Age: 36000
< Access-Control-Allow-Credentials: true
< Date: Sun, 26 Jan 2014 16:02:25 GMT
<
* Connection #0 to host azuremap.blob.core.windows.net left intact

Resources

A new version of the Windows Azure Storage Emulator (v2.2.1) is now in Preview. This release supports the “2013-08-15” version of Storage, which includes CORS (and JSON and other) support.

Overall description of Azure Storage’s CORS Support:

http://msdn.microsoft.com/en-us/library/windowsazure/dn535601.aspx

REST API doc (usually the canonical doc for any feature, though in code it is easily accessed with the Windows Azure SDK for .NET)

http://msdn.microsoft.com/en-us/library/windowsazure/hh452235.aspx

A couple of excellent posts from the community on CORS support in Windows Azure Storage:

[This is part of a series of posts on #StupidAzureTricks, explained here.]

Choosing CORS over JSONP over Inline… and Lessons Learned Using CORS

I recently created a very simple client-only web application (no server-side code) that loads the data it needs dynamically. In order to access data from another storage location (in my case the data came from a Windows Azure blob), the application needed to make a choice about how to load the data.

It really came down to three choices:

  1. Load the data synchronously as the page loaded using an Inline script tag
  2. Load the data asynchronously as part of initial page load using JSONP
  3. Load the data asynchronously as part of initial page load using CORS

All 3 options effectively work within the Same Origin Policy (SOP) sandbox security measures that browsers implement. If access is not coming from a browser (but from, say, curl or a server application), SOP has no effect. SOP is there to protect end users from web sites that might not behave themselves.

Option 1 would be to basically have a hardcoded script tag load the data. One disadvantage of this is put perfectly by Douglas Crockford: “A <script src="url"></script> will block the downloading of other page components until the script has been fetched, compiled, and executed.” This means that the page will block while the data is loaded, potentially making the initial load appear a bit more visually chaotic. Also, if this technique is the only mechanism for loading data, once the page is loaded, the data is never refreshed, a potentially severe limitation for some applications; in the very old days, the best we could do was periodically trigger a full-page refresh, but that’s not state-of-the-art in 2014.

Option 2 would be to load the data asynchronously using JSONP. This is a fine solution from a user experience point of view: the page structure is first loaded, then populated once the data arrives. Note that JSONP does not use the XMLHttpRequest object; it works around the Same Origin Policy by dynamically injecting a script tag whose response is executable JavaScript.

Option 3 would be to load the data asynchronously using CORS. This offers essentially the identical user experience as option 2, and it uses the XMLHttpRequest object in JavaScript.

Options 1 and 2 require that the data be encapsulated in JavaScript code. For option 2, the JSONP convention is a function call (the function often named callback) that simply wraps a JSON object; the client making the call then executes the function to get at the data. Option 1 has slightly more flexibility and could be simply a data structure declared with a known name like var mapData = ... which the client can access directly.

Option 3 with CORS is able to return the data directly. In that regard it is a little more efficient since no bubble-wrap is needed – and it is a lot safer since you are not executing a JavaScript function returned by a potentially untrusted server.
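Concretely, the same (hypothetical) payload looks different in each of the three wrappers:

// Option 1 - inline script tag: a global variable the page reads directly
var mapData = { "regions": 8 };

// Option 2 - JSONP: the JSON arrives bubble-wrapped in a function call the client must execute
callback({ "regions": 8 });

// Option 3 - CORS: the raw JSON itself, nothing to execute
{ "regions": 8 }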

JSONP is not based on any official standard, but is common practice. CORS is a standard that is supported in modern browsers and comes with granular access policies. As an example, CORS policies can be set to allow access from a whitelist of domains (such as paying customers), while disallowing from any other domain.

For all three options there needs to be coordination between the client and the server since they need to agree on how the data is packaged for transmission. For CORS, this also requires browser support (see chart below). All options require that JavaScript is enabled in the client browser.

Summarizing CORS, JSONP, Inline

The following summary compares key qualities.

| | Inline JavaScript | JSONP | CORS | Comments |
| --- | --- | --- | --- | --- |
| Synchronous or Async | Synchronous | Async | Async | |
| Granular domain-level security | no | no | yes | In any of the three, you could also implement an authorization scheme. This is above and beyond that. |
| Risk | no | yes | no | JSONP requires that you execute a JavaScript function to get at the data. Neither of the other two approaches requires that. There’s an extra degree of caution needed for JSONP data sources outside of your control. |
| Efficiency on the wire | close | close | most efficient | Inline and JSONP both wrap your data in JavaScript constructs. These add a small amount of overhead. Depending on what you are doing, these could add up. But minor. |
| Browser support | full | full | partial | |
| Server support | full | full | partial | Servers need to support the CORS handshake with browsers to (a) deny disallowed domains, and (b) give browsers the information they need to honor restrictions. |
| Supported by a standard | no | no | yes | |
| Is it the future? | no | no | yes | Safer. Granular security. Standardized. Max efficiency. |

Lessons Learned Using CORS

Yes, my simple one-page map app (described here) ended up using CORS – in large part because it is mature and browser support (see below) was sufficient.

Reloading Browser Pages: When debugging, CTRL-F5 is your friend in Chrome, Firefox, and IE if you want to clear the cache and reload the page you are on. I did this a lot as I was continually enabling and disabling CORS on the server to test out the effects.

Overriding CORS Logic in Chrome: It turns out that Chrome normally honors all CORS settings. This is what most users will see. Let’s call this “civilian mode” for Chrome. But there’s also a developer mode – which you enable by running Chrome with the chrome.exe --disable-web-security parameter. I was initially confused since it seemed Chrome’s CORS support didn’t work, but of course it did. This is one of the perils of living with a software nerd; my wife had used my computer and changed this a long time ago when she needed to build some CORS features, and I never knew until I ran into the perplexing issue.

Handling CORS Rejection: Your browser may not let your JavaScript code know directly that a remote call was rejected due to a CORS policy. Some browsers silently report a status of 0 for a request blocked against a CORS-protected resource. You’ll see this mentioned in the code for httpGetString.js (if you look at my sample code).

Testing CORS from curl: Helped by a post on StackOverflow, I found it very handy to look at CORS headers from the command line. Note that you need to provide SOME origin in the request for it to be valid CORS, but here’s the command that worked for my cloud-hosted resource (you should also be able to run this same command):

curl -H "Origin: http://localhost" -H "Access-Control-Request-Method: GET" -H "Access-Control-Request-Headers: X-Requested-With" -X OPTIONS --verbose http://azuremap.blob.core.windows.net/maps/azuremap.geojson

Browser Support for CORS

To understand where CORS support stands with web browsers, the fantastic site http://caniuse.com/cors offers a nice visual of CORS support across today’s browsers. A corresponding chart for JSONP is not needed since JSONP relies on long-standing script-tag behavior.


Resources

http://en.wikipedia.org/wiki/Same-origin_policy

My simple one-page map app is described here. That page includes a link to a running instance and its source code is easily viewed with View Source.

http://blog.auth0.com/2014/01/27/ten-things-you-should-know-about-tokens-and-cookies/#preflight

Stupid Azure Trick #5 – Got a Penny? Run a Simple Web Site 100% on Blob Storage for a Month – Cost Analysis Provided

Suppose you have a simple static web site you want to publish, but your budget is small. You could do this with Windows Azure Storage as a set of blobs. The “simple static” qualifier rules out ASP.NET, PHP, and Node.js – and anything else that does server-side processing before serving up a page. But that still leaves a lot of scenarios – and does not preclude the site from being interactive, loading external data using AJAX, and behaving like it is dynamic. This one does.

Check out the web site at http://azuremap.blob.core.windows.net/apps/bingmap-geojson-display.html.


You may recognize the map from an earlier post that showed how one could visualize Windows Azure Data Center Regions on a map. It should look familiar because this web site uses the exact same underlying GeoJSON data used earlier, except this time the map implementation is completely different. This version has JavaScript code that loads and parses the raw GeoJSON data and renders it dynamically by populating a Bing Maps viewer control (which is also in JavaScript).

But the neat part is there’s only JavaScript behind the scenes. All of the site’s assets are loaded directly from Windows Azure Blob Storage (plus Bing Maps control from an external location).

Here’s the simple breakdown. There is the main HTML page (the URL specifies that directly), and that in turn loads the following four JavaScript files:

  1. http://ecn.dev.virtualearth.net/mapcontrol/mapcontrol.ashx?v=7.0 – version 7.0 of the Bing Map control
  2. httpGetString.js – general-purpose data fetcher (used to pull in the GeoJSON data)
  3. geojson-parse.js – application-specific to parse the GeoJSON data
  4. bingmap-geojson-display.js – application-specific logic to put elements from the GeoJSON file onto the Bing Map

I have not tried this to prove the point, but I think that to render on, say, Google Maps, the only JavaScript that would need to change would be bingmap-geojson-display.js (presumably replaced by googlemap-geojson-display.js).

Notice that the GeoJSON data lives in a different Blob Storage container here: http://azuremap.blob.core.windows.net/maps/azuremap.geojson. We’ll get into the details in another post, but in order for this to work – in order for …/apps/bingmap-geojson-display.html to directly load a JSON data file from …/maps/azuremap.geojson – we enabled CORS for the Blob Service within the host Windows Azure Storage account.

Cost Analysis

Hosting a very low-cost (and low-complexity) web site as a few blobs is really handy. It is very scalable and robust. Blob Storage costs come from three sources:

  1. cost of data at rest – for this scenario, Block Blobs and Locally Redundant Storage would probably be appropriate, and the cost there is $0.068 per GB / month (details)
  2. storage transactions – $0.005 per 100,000 transactions (details – same as above link, but look lower on the page) – where a storage transaction is (loosely speaking) a file read or write operation
  3. outbound data transfers (data leaving the data center) – first 5 GB / month is free, then there’s a per GB cost (details)

The azuremap web site shown earlier weighs in at under 18 KB and is spread across 5 files (1 .html, 3 .js, 1 .geojson). If we assume a healthy 1000 hits a day on our site, here’s the math.

  • We have around 1000 x 31 = 31,000 visits per month.
  • Cost of data at rest would be 18 KB x $0.068 / GB = effectively $0. Since storage starts at less than 7 cents per GB and our data is 5 orders of magnitude smaller, the cost is too small to meaningfully measure.
  • Storage transactions would be 31,000 x 5 (one per file in our case) x $0.005 / 100,000 = $0.00775, or a little more than 3/4 of a penny in US currency per month, around 9 cents per year, or $1 every 11 years.
  • Outbound data transfer total would be 31,000 x 18 KB = 560 MB, which is around 1/10th of the amount allowed for free, so there’d be no charge for that.

So our monthly bill would be less than 1 penny (less than US$0.01).
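If you want to double-check the arithmetic, here is the same math as a throwaway C# sanity check (traffic and size assumptions as above):

using System;

internal class PennyMath
{
    private static void Main()
    {
        const double visitsPerMonth = 1000.0 * 31; // ~31,000 visits
        const double filesPerVisit = 5;            // 1 .html + 3 .js + 1 .geojson
        const double siteSizeKB = 18;              // total size of the five files

        var transactionCost = visitsPerMonth * filesPerVisit * (0.005 / 100000); // $0.005 per 100,000 transactions
        var atRestCost = (siteSizeKB / (1024 * 1024)) * 0.068;                   // $0.068 per GB-month
        var egressGB = visitsPerMonth * siteSizeKB / (1024 * 1024);              // first 5 GB/month is free

        Console.WriteLine("Transactions: ${0:F5}/month", transactionCost); // ~ $0.00775
        Console.WriteLine("Data at rest: ${0:F7}/month", atRestCost);      // ~ $0.0000012
        Console.WriteLine("Egress: {0:F2} GB/month (under the 5 GB free allowance)", egressGB); // ~ 0.53
    }
}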

This is also a good (though very simple) example of the sort of cost analysis you will need to do when understanding what it takes to create cloud applications or migrate from on-premises to the cloud. The Windows Azure Calculator and information on lower-cost commitment plans may also prove handy.

Alternative Approaches

Of course in this day and age, for a low-cost simple site it is hard to beat Windows Azure Web Sites. There’s an entirely free tier there (details) – allowing you to save yourself nearly a penny every month. That’s pretty good since Benjamin Franklin, one of America’s founding fathers, famously quipped “A penny saved is a penny earned!”

Windows Azure Web Sites also has other features – your site can be in PHP or ASP.NET or Node.js or Python. And you can get continuous deployment from GitHub or Bitbucket or TFS or Dropbox or others. And you get monitoring and other features from the portal. And more.

But at least you know you can host in blob storage if you like.

[This is part of a series of posts on #StupidAzureTricks, explained here.]

Stupid Azure Trick #4 – C#, Node.js, and Python side-by-side – Three Simple Command Line Tools to Copy Files up to Windows Azure Blob Storage

Windows Azure has a cloud file storage service known as Blob Storage.

[Note: Windows Azure Storage is broader than just Blob Storage, but in this post I will ignore its sister services Table Storage (a NoSQL key/value store) and Queues (a reliable queuing service).]

Before we get into the tricks, it is useful to know a bit about Blob Storage.

The code below is very simple – it uploads a couple of files to Blob Storage. The files being uploaded are JSON, so it includes proper setting of the HTTP content-type and sets up caching. Then it lists a directory of the files up in that particular Blob Storage container (where a container is like a folder or subdirectory in a regular file system).

The code listed below will work nicely on a Windows Azure Dev-Test VM, or on your own desktop. Of course you need a Windows Azure Storage account first, along with its storage credentials. (New to Azure? Click here to access a free trial.) But once you do, the coding is straightforward.

  • For C#: create a Windows Console application and add the NuGet package named “Windows Azure Storage”
  • For Node.js: run “npm install azure” (or “npm install azure --global”)
  • For Python: run “pip install azure” to get the SDK
  • We don’t cover it here, but you could also use PowerShell or the CLI or the REST API directly.

Note: these are command-line tools, so there isn’t a web project with config values for the storage keys. In lieu of that I used a text file on the file system. Storage credentials should be stored safely, regardless of which computer they are used on. Beware that my demonstration deals only with public data, so my storage credentials, if lost, may not be as damaging as some others would be.

Here’s the code. Enjoy!

using System;
using System.Diagnostics;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

internal class Program
{
    private static void Main(string[] args)
    {
        var storageAccountName = "azuremap";
        // storage key in file in parent directory called <storage_account_name>.storagekey
        var storageAccountKey = File.ReadAllText(String.Format("d:/dev/github/{0}.storagekey", storageAccountName));
        //Console.WriteLine(storageAccountKey);
        var storageContainerName = "maps";

        var creds = new StorageCredentials(storageAccountName, storageAccountKey);
        var storageAccount = new CloudStorageAccount(creds, useHttps: true);
        var blobClient = storageAccount.CreateCloudBlobClient();
        var container = blobClient.GetContainerReference(storageContainerName);

        string[] files = { "azuremap.geojson", "azuremap.topojson" };
        foreach (var file in files)
        {
            CloudBlockBlob blockBlob = container.GetBlockBlobReference(file);
            // the files are JSON, so set the content type and caching headers accordingly
            blockBlob.Properties.ContentType = "application/json";
            blockBlob.Properties.CacheControl = "public, max-age=3600"; // cache for up to an hour
            var filepath = @"D:\dev\github\azuremap\upload\" + file;
            blockBlob.UploadFromFile(filepath, FileMode.Open);
        }

        Console.WriteLine("Directory listing of all blobs in container {0}", storageContainerName);
        foreach (IListBlobItem blob in container.ListBlobs())
        {
            Console.WriteLine(blob.Uri);
        }

        if (Debugger.IsAttached) Console.ReadKey();
    }
}
var azure = require('azure');
var fs = require('fs');

var storageAccountName = 'azuremap'; // storage key in file in parent directory called <storage_account_name>.storagekey
var storageAccountKey = fs.readFileSync('../../%s.storagekey'.replace('%s', storageAccountName), 'utf8');
//console.log(storageAccountKey);
var storageContainerName = 'maps';

var blobService = azure.createBlobService(storageAccountName, storageAccountKey, storageAccountName + '.blob.core.windows.net');

var fileNameList = [ 'azuremap.geojson', 'azuremap.topojson' ];
for (var i = 0; i < fileNameList.length; i++) {
    var fileName = fileNameList[i];
    console.log('=> ' + fileName);
    blobService.createBlockBlobFromFile(storageContainerName, fileName, fileName,
        { contentType: 'application/json', cacheControl: 'public, max-age=3600' }, // max-age units is seconds, so 31556926 is 1 year
        function(error) {
            if (error) {
                console.error(error);
            }
        });
}

blobService.listBlobs(storageContainerName,
    function(error, blobs) {
        if (error) {
            console.error(error);
        }
        else {
            console.log('Directory listing of all blobs in container ' + storageContainerName);
            for (var i in blobs) {
                console.log(blobs[i].name);
            }
        }
    });
from azure.storage import *

storage_account_name = 'azuremap' # storage key in file in parent directory called <storage_account_name>.storagekey
storage_account_key = open(r'../../%s.storagekey' % storage_account_name, 'r').read()
#print(storage_account_key)

blob_service = BlobService(account_name=storage_account_name, account_key=storage_account_key)

storage_container_name = 'maps'
blob_service.create_container(storage_container_name)
blob_service.set_container_acl(storage_container_name, x_ms_blob_public_access='container')

for file_name in [r'azuremap.geojson', r'azuremap.topojson']:
    myblob = open(file_name, 'r').read()
    blob_name = file_name
    blob_service.put_blob(storage_container_name, blob_name, myblob, x_ms_blob_type='BlockBlob')
    blob_service.set_blob_properties(storage_container_name, blob_name, x_ms_blob_content_type='application/json', x_ms_blob_cache_control='public, max-age=3600')

# Show a blob listing which now includes the blobs just uploaded
blobs = blob_service.list_blobs(storage_container_name)
print("Directory listing of all blobs in container '%s'" % storage_container_name)
for blob in blobs:
    print(blob.url)
# format for blobs is: <account>.blob.core.windows.net/<container>/<file>
# example blob for us: pytool.blob.core.windows.net/pyfiles/clouds.jpeg

Useful Links

Python

http://research.microsoft.com/en-us/projects/azure/an-intro-to-using-python-with-windows-azure.pdf

http://research.microsoft.com/en-us/projects/azure/windows-azure-for-linux-and-mac-users.pdf

http://www.windowsazure.com/en-us/develop/python/

SDK Source for Python: https://github.com/WindowsAzure/azure-sdk-for-python

Node.js

http://www.windowsazure.com/en-us/develop/nodejs/

SDK Source for Node.js: https://github.com/WindowsAzure/azure-sdk-for-node

http://www.windowsazure.com/en-us/documentation/articles/storage-nodejs-how-to-use-blob-storage/

C#/.NET

http://www.windowsazure.com/en-us/develop/net/

Storage SDK Source for .NET: https://github.com/WindowsAzure/azure-storage-net

Storage Client Library 3: http://msdn.microsoft.com/en-us/library/dn495001%28v=azure.10%29.aspx

[This is part of a series of posts on #StupidAzureTricks, explained here.]

Where’s Azure? Mapping Windows Azure 4 years after full General Availability.

On October 27, 2008, Windows Azure was unveiled publicly by Microsoft Chief Software Architect Ray Ozzie at Microsoft’s Professional Developers Conference.

Less than a year later, on November 17, 2009, Windows Azure was unleashed on the world – anyone could go create an account.


Only a couple of months later, on January 1, 2010, Windows Azure turned on its Service Level Agreement (SLA) – you could now get production support.

And finally, on Feb 1, 2010, Windows Azure became self-aware – billing was turned on, completing the last step in its being fully open for business.

That was 4 years ago today. Happy Anniversary Azure! I am not calling this a “birthday” since it isn’t – it was born years earlier as the Red Dog project – but this is the fourth anniversary of it being a fully-operational, pay-as-you-go, public cloud platform.

At the time, there were 6 Windows Azure data centers available – 2 each in Asia, Europe, and North America: East Asia, SE Asia, North Europe, West Europe, North Central US, South Central US. (Ignoring the Content Delivery Network (CDN) nodes which I plan to cover another time.)

What about today? With the addition in 2012 of the East US and West US data centers, there are now 8 production data centers in total, with more on the way.

Here’s a map of the Windows Azure data center landscape. (Source data is in a JSON file in GitHub; pull requests with additions/corrections welcome. CDN data is TBD.)

The lines between data center regions represent failover relationships drawn from published geo-replication sites for Windows Azure Storage. Mostly they are bi-directional, except for Brazil which is one-directional; the metadata on each pushpin specifies its failover region explicitly.

NOTE: this is a work-in-progress that will be updated as “official” names are published for geos and regions.

Also, be sure to click on the map pushpins to see which data center regions are in production and which are coming attractions. Not all of these pushpins represent data centers you can access right now.

There are three insets in order – first a GeoJSON rendering, second a TopoJSON rendering (which should look identical to the GeoJSON one, but included for demonstration purposes, as it is lighter weight), and the third is the raw JSON data from which I am generating the GeoJSON and TopoJSON files. [All the code is here: https://github.com/codingoutloud/azuremap. I plan to blog in the future on how it works.]

[Embedded insets: the GeoJSON map rendering and the TopoJSON map rendering – see the azuremap repository linked above. The raw JSON source data follows.]
[
{ "Geo" : "Asia Pacific", "Region": "Asia Pacific East", "Location": "Hong Kong", "Failover Region" : "Asia Pacific Southeast", "Status" : "Production" },
{ "Geo" : "Asia Pacific", "Region": "Asia Pacific Southeast", "Location": "Singapore", "Failover Region" : "Asia Pacific East", "Status" : "Production" },
{ "Geo" : "Asia Pacific", "Region": "Shanghai China", "Location": "Shanghai China", "Failover Region" : "Beijing China", "Status" : "Production" },
{ "Geo" : "Asia Pacific", "Region": "Beijing China", "Location": "Beijing, China", "Failover Region" : "Shanghai China", "Status" : "Production" },
{ "Geo" : "Australia", "Region": "Australia East", "Location": "Sydney, New South Wales, Australia", "Failover Region" : "Australia SE", "Status" : "Production" },
{ "Geo" : "Australia", "Region": "Australia SE", "Location": "Melbourne, Victoria, Australia", "Failover Region" : "Australia East", "Status" : "Production" },
{ "Geo" : "South America", "Region": "Brazil South", "Location": "Brazil", "Failover Region" : "US South Central", "Status" : "Production" },
{ "Geo" : "Europe", "Region": "Europe West", "Location": "Amsterdam, Netherlands", "Failover Region" : "Europe North", "Status" : "Production" },
{ "Geo" : "Europe", "Region": "Europe North", "Location": "Dublin, Ireland", "Failover Region" : "Europe West", "Status" : "Production" },
{ "Geo" : "Japan", "Region": "Japan East", "Location": "Saitama, Japan", "Failover Region" : "Japan West", "Status" : "Production" },
{ "Geo" : "Japan", "Region": "Japan West", "Location": "Osaka, Japan", "Failover Region" : "Japan East", "Status" : "Production" },
{ "Geo" : "United States", "Region": "US North Central", "Location": "Chicago, IL, USA", "Failover Region" : "US South Central", "Status" : "Production" },
{ "Geo" : "United States", "Region": "US North Central", "Location": "Chicago, IL, USA", "Failover Region" : "US South Central", "Status" : "Production" },
{ "Geo" : "United States", "Region": "US Central", "Location": "Iowa, USA", "Failover Region" : "US East 2", "Status" : "Production" },
{ "Geo" : "United States", "Region": "US East", "Location": "Bristow, Virginia, USA", "Failover Region" : "US West", "Status" : "Production" },
{ "Geo" : "United States", "Region": "US East 2", "Location": "Virginia, USA", "Failover Region" : "US Central", "Status" : "Preview" },
{ "Geo" : "United States", "Region": "US West", "Location": "San Francisco, California, USA", "Failover Region" : "US East", "Status" : "Production" },
{ "Geo" : "United States", "Region": "US Gov-Iowa", "Location": "Iowa, USA", "Failover Region" : "US Gov-Virginia", "Status" : "Preview" },
{ "Geo" : "United States", "Region": "US Gov-Virginia", "Location": "Virginia, USA", "Failover Region" : "US Gov-Iowa", "Status" : "Production" }
]
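As an aside, the raw JSON above is easy to consume programmatically. A minimal C# sketch using Json.NET (assuming you've saved the array locally as azuremap.json – adjust the path to wherever you cloned the repo):

using System;
using System.IO;
using System.Linq;
using Newtonsoft.Json.Linq;

internal class ListProductionRegions
{
    private static void Main()
    {
        // azuremap.json is an assumed local copy of the JSON array above
        var regions = JArray.Parse(File.ReadAllText("azuremap.json"));
        foreach (var region in regions.Where(r => (string)r["Status"] == "Production"))
        {
            Console.WriteLine("{0} -> fails over to {1}", region["Region"], region["Failover Region"]);
        }
    }
}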

The map data is derived from public sources (news releases and blog posts for coming data centers, and Windows Azure documentation for existing production regions).

The city information for data centers is not always published, so what I’m using is a mix of directly published data and information derived from published data. For example, it is well known there is a data center in Dublin, Ireland, but what city should I use for the US West region that’s in California? For the latter, I used IP address geocoding of the published data center IP address ranges. This is absolutely not definitive, but makes for a slightly nicer map. It was from this data that I made assumptions about the Tokyo and Osaka locations in Japan and San Francisco in California for US West.

Finally, this map is at the region level, which equates roughly to a city (see the project readme for the terminology I am using). A region is not necessarily a single location, since there may well be multiple data centers per region, and though they will be “near” each other, they are not necessarily in the same city – they could be 1 kilometer apart with a city border between them.