Stupid Azure Trick #7 – Use Windows Azure’s Local Storage Emulator with Web Sites & VMs

[Ugh – editing 2nd week in a row after accidental early publishing.]

The original programming model for Windows Azure applications was to use Cloud Services (originally known as Hosted Services, but still the same thing). Of particular note, Cloud Services run on VMs with non-persistent disks – you can write data locally (some pointers here), but any locally stored data is not guaranteed to stick around. This is a powerful model for some scenarios, especially highly scalable applications. Another long-standing feature of Cloud Services is that they come with an emulator you can run locally – “on your laptop at 30,000 feet” was a common way to hammer home the point. (Remember, Cloud Services were announced in 2008 – a long time before we had wifi on airplanes!) There are actually two emulators: Compute – which emulates the Cloud Service model by supporting the Web Role and Worker Role abstractions – and Storage – which emulates the Blob, Table, and [Storage] Queue services. The rest of this post will focus specifically on the Storage Emulator.

[Figure: Venn diagram of Windows Azure Web Sites, Virtual Machines, and Cloud Services]

Since their announcement in 2012, Windows Azure Web Sites and Virtual Machines have been taking on many of the common workloads that used to require Cloud Services. This diagram should at least conceptually capture the sense that the decision about when to use which model has become blurred over time. This is good – with more choice comes the freedom to get started more simply. Often a Virtual Machine is an easier onramp for existing apps, and a Web Site can be a great onramp for a website built on one of the well-known programming stacks such as PHP, ASP.NET, Python, or Node.js. If you become a big success, consider upgrading to Cloud Services.

Notably absent from the diagram is the Storage Emulator. It belongs in the middle of the diagram because, while the local Storage Emulator is still useful for Cloud Services, you can also use it when developing applications targeting Windows Azure Web Sites or Virtual Machines.

This is awesome – of course, it is popular to create applications destined for Windows Azure Web Sites or Virtual Machines that take advantage of the various Storage services.

So that’s the trick – be sure to take advantage of the Storage Emulator, even when you are not targeting a Cloud Service. You need to know two things: how to turn it on, and how to address it.

Turning on the Storage Emulator

If you create a regular old Web Site and run it in Visual Studio, the Storage Emulator is not turned on. Visual Studio only turns on the Storage Emulator for you when you debug a Cloud Service, which is not convenient here.

It is easy to turn on. I have a whole post that explains how to start the Storage Emulator from a shortcut, but the key steps are:

  1. Find csrun.exe — In my case: “C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe” 
  2. Run csrun.exe with the parameter /devstore:start, which tells it to start the Storage Emulator.
  3. Done. Of course, you might want this in a bat file or as a PowerShell function.

Here’s a PowerShell script that will turn it on:

& 'C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe' /devstore:start
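
And if you would rather kick it off from code – say, in a test fixture’s setup – here is a minimal C# sketch along the same lines. The csrun.exe path is the one from step 1 and will vary with your SDK version, so treat it as an assumption to verify against your own installation.

using System.Diagnostics;

static class StorageEmulator
{
    // Path from step 1 above; adjust for your SDK version/installation.
    const string CsRunPath = @"C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe";

    public static void Start()
    {
        // /devstore:start asks csrun.exe to start the Storage Emulator.
        // (If the emulator is already running, csrun should simply report that.)
        using (var process = Process.Start(CsRunPath, "/devstore:start"))
        {
            process.WaitForExit();
        }
    }
}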

Addressing the Storage Emulator

The other part is knowing how to set up your Storage connection string so that it accesses the local Storage Emulator instead of the cloud.

Here are the values to use to make it look like any other Storage Account, while still addressing locally emulated storage rather than storage in the cloud:

Emulator Storage Account Name: devstoreaccount1
Emulator Storage Account Key: Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==
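
For example, here is a minimal sketch using the Windows Azure Storage Client Library for .NET. The shortcut connection string “UseDevelopmentStorage=true” is understood by the library and expands to the account name and key above plus the emulator’s local 127.0.0.1 endpoints; the container name is purely illustrative.

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class EmulatorConnectionExample
{
    static void Main()
    {
        // Shortcut understood by the storage client library; equivalent to spelling out
        // AccountName=devstoreaccount1, the well-known key above, and the local endpoints
        // (Blob: http://127.0.0.1:10000/devstoreaccount1, Queue: 10001, Table: 10002).
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");

        CloudBlobClient blobClient = account.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("testcontainer"); // illustrative name
        container.CreateIfNotExists(); // talks to the local emulator, not the cloud
    }
}

Swap in a real account’s connection string and the same code addresses storage in the cloud – which is exactly the point of making the emulator look like any other Storage Account.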

Resources

The latest version of the Windows Azure Storage Emulator (v2.2.1) is in Preview. This release supports the “2013-08-15” version of Storage, which adds CORS and JSON support and still has all those features from years gone by…

A comparison of emulated and cloud storage services is also available. There are some differences.

[This is part of a series of posts on #StupidAzureTricks, explained here.]


Stupid Azure Trick #6 – A CORS Toggler Command-line Tool for Windows Azure Blobs

[Edit: I originally accidentally published an old draft. The draft went out to all email subscribers and was public for around 90 minutes. Fixed now.]

In the most recent Stupid Azure Trick installment, I explained how one could host a 1000-visitor-per-day web site for one penny per month. Since then I also explained my choice to use CORS in that same application. Here I will dig specifically into using CORS with Windows Azure.

I also show how the curl command line tool can be helpful to examine CORS properties in HTTP headers for a blob service.

I will also briefly describe a simple tool I built that can quickly turn CORS on or off for a specified Blob service – the CORS Toggler. The CORS Toggler (in its current simple form) was useful to me because of two constraints that were true for my scenario:

  • I was only reading files from the Windows Azure Blob Service. When you are just reading, the pre-flight request doesn’t matter. Simplification #1.
  • I didn’t care that the blob resource would be publicly available, rather than available only to my application. So the CORS policy was open to any caller (‘*’). Simplification #2.

These two simplifications mean the toggler knew exactly what it meant to enable CORS (open up reading to all comers) and what it meant to disable it. (Though it is worth noting that opening up CORS to any caller is probably a common scenario. Also worth noting that the tool could easily be extended to support a whitelist of allowed domains or other features.)

First, here’s the code for the toggler – there are three files here:

  1. Driver program (Console app in C#) – handles command line params and such and then calls into the …
  2. Code to perform simple CORS manipulation (C# class)
  3. The above two are driven (in my quick toggler) through the third file, a command-line batch file, which passes in the storage account name and key for the service I was working with. (Below, the batch file appears first, followed by the class and then the driver program.)

@echo off
D:\dev\path\to\ToggleCors.exe azuremap 123abcYourStorageKeyIsHere987zyx== %1
echo.
echo FOR COMPARISON QUERY THE nocors SERVICE (which never has CORS set)
echo.
D:\dev\path\to\ToggleCors.exe nocors 123abcYourStorageKeyIsHere987zyx== -q
echo.
PowerShell -Command Get-Date

using System;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;
using Newtonsoft.Json;

namespace BlobCorsToggle
{
    public class SimpleAzureBlobCorsSetter
    {
        public CloudBlobClient CloudBlobClient { get; private set; }

        public SimpleAzureBlobCorsSetter(Uri blobServiceUri, StorageCredentials storageCredentials)
        {
            this.CloudBlobClient = new CloudBlobClient(blobServiceUri, storageCredentials);
        }

        public SimpleAzureBlobCorsSetter(CloudBlobClient blobClient)
        {
            this.CloudBlobClient = blobClient;
        }

        /// <summary>
        /// Set Blob service CORS settings for the specified Windows Azure Storage Account.
        /// Either sets a hard-coded set of values (see below) or clears all CORS settings.
        ///
        /// Does not check for any existing CORS settings, but clobbers them with settings
        /// that allow HTTP GET access from any origin. Non-CORS settings are left intact.
        ///
        /// Most useful for scenarios where a file is published in Blob Storage for read-access
        /// by any client.
        ///
        /// Can also be useful in conjunction with Valet Key Pattern-style limited access, as
        /// might be useful with a mobile application.
        /// </summary>
        /// <param name="clear">if true, clears all CORS settings, else allows GET from any origin</param>
        public void SetBlobCorsGetAnyOrigin(bool clear)
        {
            // http://msdn.microsoft.com/en-us/library/windowsazure/dn535601.aspx
            var corsGetAnyOriginRule = new CorsRule();
            corsGetAnyOriginRule.AllowedOrigins.Add("*"); // allow access to any client
            corsGetAnyOriginRule.AllowedMethods = CorsHttpMethods.Get; // only CORS-enable http GET
            corsGetAnyOriginRule.ExposedHeaders.Add("*"); // let client see any header we've configured
            corsGetAnyOriginRule.AllowedHeaders.Add("*"); // let clients request any header they can think of
            corsGetAnyOriginRule.MaxAgeInSeconds = (int)TimeSpan.FromHours(10).TotalSeconds; // clients are safe to cache CORS config for up to this long

            var blobServiceProperties = this.CloudBlobClient.GetServiceProperties();

            if (clear)
            {
                blobServiceProperties.Cors.CorsRules.Clear();
            }
            else
            {
                blobServiceProperties.Cors.CorsRules.Clear(); // replace current rule set
                blobServiceProperties.Cors.CorsRules.Add(corsGetAnyOriginRule);
            }

            this.CloudBlobClient.SetServiceProperties(blobServiceProperties);
        }

        public void DumpCurrentProperties()
        {
            var blobServiceProperties = this.CloudBlobClient.GetServiceProperties();
            var blobPropertiesStringified = StringifyProperties(blobServiceProperties);
            Console.WriteLine("Current Properties:\n{0}", blobPropertiesStringified);
        }

        internal string StringifyProperties(ServiceProperties serviceProperties)
        {
            // JsonConvert.SerializeObject(serviceProperties) for whole object graph or
            // JsonConvert.SerializeObject(serviceProperties.Cors) for just CORS
            return Newtonsoft.Json.JsonConvert.SerializeObject(serviceProperties, Formatting.Indented);
        }
    }
}

using System;
using System.Diagnostics;
using System.IO;
using Microsoft.WindowsAzure.Storage.Auth;

namespace BlobCorsToggle
{
    class Program
    {
        static string clearFlag = "-clear";
        static string queryFlag = "-q";

        static void Main(string[] args)
        {
            if (args.Length == 0 || !ValidCommandArguments(args))
            {
                #region Show Correct Usage
                if (args.Length != 0 && !ValidCommandArguments(args))
                {
                    var saveForegroundColor = Console.ForegroundColor;
                    Console.ForegroundColor = ConsoleColor.Red;
                    Console.Write("\nINVALID COMMAND LINE OR UNKNOWN FLAGS: ");
                    foreach (var arg in args) Console.Write("{0} ", arg);
                    Console.WriteLine();
                    Console.ForegroundColor = saveForegroundColor;
                }
                Console.WriteLine("usage:\n{0} <storage acct name> <storage acct key> [{1}]",
                    Path.GetFileNameWithoutExtension(Environment.GetCommandLineArgs()[0]),
                    clearFlag);
                Console.WriteLine();
                Console.WriteLine("example setting:\n{0} {1} {2}",
                    Path.GetFileNameWithoutExtension(Environment.GetCommandLineArgs()[0]),
                    "mystorageacct",
                    "lala+aou812SomERAndOmStrINglOlLolFroMmYStORaG3akounT314159265358979323ilIkEpi==");
                Console.WriteLine();
                Console.WriteLine("example clearing:\n{0} {1} {2} {3}",
                    Path.GetFileNameWithoutExtension(Environment.GetCommandLineArgs()[0]),
                    "mystorageacct",
                    "lala+aou812SomERAndOmStrINglOlLolFroMmYStORaG3akounT314159265358979323ilIkEpi==",
                    clearFlag);
                if (Debugger.IsAttached)
                {
                    Console.WriteLine("\nPress any key to exit.");
                    Console.ReadKey();
                }
                #endregion
            }
            else
            {
                var storageAccountName = args[0];
                var storageKey = args[1];
                var blobServiceUri = new Uri(String.Format("https://{0}.blob.core.windows.net", storageAccountName));
                var storageCredentials = new StorageCredentials(storageAccountName, storageKey);
                var blobConfig = new SimpleAzureBlobCorsSetter(blobServiceUri, storageCredentials);

                bool query = (args.Length == 3 && args[2] == queryFlag);
                if (query)
                {
                    blobConfig.DumpCurrentProperties();
                    return;
                }

                blobConfig.DumpCurrentProperties();
                Console.WriteLine();

                bool clear = (args.Length == 3 && args[2] == clearFlag);
                blobConfig.SetBlobCorsGetAnyOrigin(clear);
                Console.WriteLine("CORS Blob Properties for Storage Account {0} have been {1}.",
                    storageAccountName, clear ? "cleared" : "set");
                Console.WriteLine();
                blobConfig.DumpCurrentProperties();
            }
        }

        private static bool ValidCommandArguments(string[] args)
        {
            if (args.Length == 2) return true;
            if (args.Length == 3 && (args[2] == clearFlag || args[2] == queryFlag)) return true;
            return false;
        }
    }
}

One simple point to highlight – the CORS properties are simply available on the Blob service properties object (and the same is true for the Table and Queue services within Storage):

[Screenshot: the Cors property exposed on the Blob service’s ServiceProperties object]

Yes, this is a very simple API.
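
To make that parity concrete, here is a minimal sketch that applies an analogous read-only rule to the Table and Queue services. It assumes the same storage client library shown above exposes the same ServiceProperties shape on CloudTableClient and CloudQueueClient, and it uses the development storage account purely as a stand-in for whatever account you are configuring.

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

class CorsParityExample
{
    static void Main()
    {
        // Stand-in account; the toggler above takes real credentials on the command line instead.
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");

        var rule = new CorsRule();
        rule.AllowedOrigins.Add("*");
        rule.AllowedMethods = CorsHttpMethods.Get;
        rule.MaxAgeInSeconds = (int)TimeSpan.FromHours(10).TotalSeconds;

        // Same Cors property, different service client.
        var tableClient = account.CreateCloudTableClient();
        var tableProperties = tableClient.GetServiceProperties();
        tableProperties.Cors.CorsRules.Clear();
        tableProperties.Cors.CorsRules.Add(rule);
        tableClient.SetServiceProperties(tableProperties);

        var queueClient = account.CreateCloudQueueClient();
        var queueProperties = queueClient.GetServiceProperties();
        queueProperties.Cors.CorsRules.Clear();
        queueProperties.Cors.CorsRules.Add(rule);
        queueClient.SetServiceProperties(queueProperties);
    }
}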

Showing the Service Object Contents

For those interested in the contents of these objects, here are a few ways to show the content of the properties (in code) before turning on CORS and after. (The object views are created using the technique I described in my post on using JSON.NET as an object dumper that’s Good Enough™.)

DUMPING OBJECT BEFORE CORS ENABLED (just CORS properties):

{"Logging":{"Version":"1.0","LoggingOperations":0,"RetentionDays":null},"Metrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"HourMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"Cors":{"CorsRules":[]},"MinuteMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"DefaultServiceVersion":null}

DUMPING OBJECT AFTER CORS ENABLED:

{"Logging":{"Version":"1.0","LoggingOperations":0,"RetentionDays":null},"Metrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"HourMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"Cors":{"CorsRules":[{"AllowedOrigins":["*"],"ExposedHeaders":["*"],"AllowedHeaders":["*"],"AllowedMethods":1,"MaxAgeInSeconds":36000}]},"MinuteMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"DefaultServiceVersion":null}


DUMPING OBJECT BEFORE CORS ENABLED (but including ALL properties):

Current Properties:
{"Logging":{"Version":"1.0","LoggingOperations":0,"RetentionDays":null},"Metrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"HourMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"Cors":{"CorsRules":[]},"MinuteMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"DefaultServiceVersion":null}

DUMPING OBJECT AFTER CORS ENABLED (but including ALL properties):

Current Properties:
{"Logging":{"Version":"1.0","LoggingOperations":0,"RetentionDays":null},"Metrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"HourMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"Cors":{"CorsRules":[{"AllowedOrigins":["*"],"ExposedHeaders":["*"],"AllowedHeaders":["*"],"AllowedMethods":1,"MaxAgeInSeconds":36000}]},"MinuteMetrics":{"Version":"1.0","MetricsLevel":0,"RetentionDays":null},"DefaultServiceVersion":null}


Using ‘curl’ To Examine CORS Data:

curl -H "Origin: http://example.com&quot; -H "Access-Control-Request-Method: GET" -H "Access-Control-Request-Headers: X-Requested-With" -X OPTIONS --verbose http://azuremap.blob.core.windows.net/maps/azuremap.geojson


CURL OUTPUT BEFORE CORS ENABLED:

D:\dev\github>curl -H "Origin: http://example.com" -H "Access-Control-Request-Method: GET" -H "Access-Control-Request-Headers: X-Requested-With" -X OPTIONS --verbose http://azuremap.blob.core.windows.net/maps/azuremap.geojson

* Adding handle: conn: 0x805fa8
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* - Conn 0 (0x805fa8) send_pipe: 1, recv_pipe: 0
* About to connect() to azuremap.blob.core.windows.net port 80 (#0)
*   Trying 168.62.32.206...
* Connected to azuremap.blob.core.windows.net (168.62.32.206) port 80 (#0)
> OPTIONS /maps/azuremap.geojson HTTP/1.1
> User-Agent: curl/7.31.0
> Host: azuremap.blob.core.windows.net
> Accept: */*
> Origin: http://example.com
> Access-Control-Request-Method: GET
> Access-Control-Request-Headers: X-Requested-With
>
< HTTP/1.1 403 CORS not enabled or no matching rule found for this request.
< Content-Length: 316
< Content-Type: application/xml
* Server Blob Service Version 1.0 Microsoft-HTTPAPI/2.0 is not blacklisted
< Server: Blob Service Version 1.0 Microsoft-HTTPAPI/2.0
< x-ms-request-id: 04402242-d4a7-4d0c-bedc-ff553a1bc982
< Date: Sun, 26 Jan 2014 15:08:11 GMT
<
<?xml version="1.0" encoding="utf-8"?><Error><Code>CorsPreflightFailure</Code><Message>CORS not enabled or no matching rule found for this request.
RequestId:04402242-d4a7-4d0c-bedc-ff553a1bc982
Time:2014-01-26T15:08:12.0193649Z</Message><MessageDetails>No CORS rules matches this request</MessageDetails></Error>
* Connection #0 to host azuremap.blob.core.windows.net left intact

CURL OUTPUT AFTER CORS ENABLED:

D:\dev\github>curl -H "Origin: http://example.com" -H "Access-Control-Request-Method: GET" -H "Access-Control-Request-Headers: X-Requested-With" -X OPTIONS --verbose http://azuremap.blob.core.windows.net/maps/azuremap.geojson
* Adding handle: conn: 0x1f55fa8
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* - Conn 0 (0x1f55fa8) send_pipe: 1, recv_pipe: 0
* About to connect() to azuremap.blob.core.windows.net port 80 (#0)
*   Trying 168.62.32.206...
* Connected to azuremap.blob.core.windows.net (168.62.32.206) port 80 (#0)
> OPTIONS /maps/azuremap.geojson HTTP/1.1
> User-Agent: curl/7.31.0
> Host: azuremap.blob.core.windows.net
> Accept: */*
> Origin: http://example.com
> Access-Control-Request-Method: GET
> Access-Control-Request-Headers: X-Requested-With
>
< HTTP/1.1 200 OK
< Transfer-Encoding: chunked
* Server Blob Service Version 1.0 Microsoft-HTTPAPI/2.0 is not blacklisted
< Server: Blob Service Version 1.0 Microsoft-HTTPAPI/2.0
< x-ms-request-id: d4df8953-f8ae-441b-89fe-b69232579aa4
< Access-Control-Allow-Origin: http://example.com
< Access-Control-Allow-Methods: GET
< Access-Control-Allow-Headers: X-Requested-With
< Access-Control-Max-Age: 36000
< Access-Control-Allow-Credentials: true
< Date: Sun, 26 Jan 2014 16:02:25 GMT
<
* Connection #0 to host azuremap.blob.core.windows.net left intact

Resources

A new version of the Windows Azure Storage Emulator (v2.2.1) is now in Preview. This release supports the “2013-08-15” version of Storage, which includes CORS (and JSON and other) support.

Overall description of Azure Storage’s CORS Support:

http://msdn.microsoft.com/en-us/library/windowsazure/dn535601.aspx

REST API doc (usually the canonical doc for any feature, though in code it is easily accessed with the Windows Azure SDK for .NET)

http://msdn.microsoft.com/en-us/library/windowsazure/hh452235.aspx

A couple of excellent posts from the community on CORS support in Windows Azure Storage:

[This is part of a series of posts on #StupidAzureTricks, explained here.]

Stupid Azure Trick #5 – Got a Penny? Run a Simple Web Site 100% on Blob Storage for a Month – Cost Analysis Provided

Suppose you have a simple static web site you want to publish, but your budget is small. You could do this with Windows Azure Storage as a set of blobs. The “simple static” qualifier rules out ASP.NET and PHP and Node.js – and anything that does server-side processing before serving up a page. But that still leaves a lot of scenarios – and does not preclude the site from being interactive, loading external data using AJAX, and behaving like it is dynamic. This one does all of those.

Check out the web site at http://azuremap.blob.core.windows.net/apps/bingmap-geojson-display.html.

[Screenshot: the azuremap web site rendering Windows Azure Data Center Regions on a Bing Map]

You may recognize the map from an earlier post that showed how one could visualize Windows Azure Data Center Regions on a map. It should look familiar because this web site uses the exact same underlying GeoJSON data used earlier, except this time the map implementation is completely different. This version has JavaScript code that loads and parses the raw GeoJSON data and renders it dynamically by populating a Bing Maps viewer control (which is also in JavaScript).

But the neat part is there’s only JavaScript behind the scenes. All of the site’s assets are loaded directly from Windows Azure Blob Storage (plus the Bing Maps control, which comes from an external location).

Here’s the simple breakdown. There is the main HTML page (addressed directly by the URL above), and that in turn loads the following four JavaScript files:

  1. http://ecn.dev.virtualearth.net/mapcontrol/mapcontrol.ashx?v=7.0 – version 7.0 of the Bing Map control
  2. httpGetString.js – general-purpose data fetcher (used to pull in the GeoJSON data)
  3. geojson-parse.js – application-specific to parse the GeoJSON data
  4. bingmap-geojson-display.js – application-specific logic to put elements from the GeoJSON file onto the Bing Map

I have not tried this to prove the point, but I think that to render on, say, Google Maps, the only JavaScript that would need to change would be bingmap-geojson-display.js (presumably replaced by googlemap-geojson-display.js).

Notice that the GeoJSON data lives in a different Blob Storage container here: http://azuremap.blob.core.windows.net/maps/azuremap.geojson. We’ll get into the details in another post, but in order for this to work – in order for …/apps/bingmap-geojson-display.html to directly load a JSON data file from …/maps/azuremap.geojson – we enabled CORS for the Blob service within the host Windows Azure Storage account.

Cost Analysis

Hosting a very low-cost (and low-complexity) web site as a few blobs is really handy. It is very scalable and robust. Blob Storage costs come from three sources:

  1. cost of data at rest – for this scenario, Block Blobs and Locally Redundant Storage would probably be appropriate, and the cost there is $0.068 per GB / month (details)
  2. storage transactions – $0.005 per 100,000 transactions (details – same as above link, but look lower on the page) – where a storage transaction is (loosely speaking) a file read or write operation
  3. outbound data transfers (data leaving the data center) – first 5 GB / month is free, then there’s a per GB cost (details)

The azuremap web site shown earlier weighs in at under 18 KB and is spread across 5 files (1 .html, 3 .js, 1 .geojson). If we assume a healthy 1000 hits a day on our site, here’s the math.

  • We have around 1000 x 31 = 31,000 visits per month.
  • Cost of data at rest would be 18 KB x $0.068 / GB = effectively $0. Since storage starts at less than 7 cents per GB and our data is 5 orders of magnitude smaller, the cost is too small to meaningfully measure.
  • Storage transactions would be 31,000 visits x 5 transactions per visit (one per file) x $0.005 / 100,000 = $0.00775, or a little more than 3/4 of a penny in US currency per month – around 9 cents per year, or $1 every 11 years.
  • Outbound data transfer total would be 31,000 x 18 KB = 560 MB, which is around 1/10th of the amount allowed for free, so there’d be no charge for that.

So our monthly bill would be less than 1 penny (less than US$0.01).
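
For anyone who wants to reproduce the arithmetic, here is a minimal C# sketch of the same estimate. The rates are just the ones quoted above (they will change over time), and the 18 KB / 5 files / 1000 hits-per-day figures are the assumptions from this post.

using System;

class BlobSiteCostEstimate
{
    static void Main()
    {
        const double siteSizeKB = 18;            // total size of the 5 blobs
        const double filesPerVisit = 5;          // 1 .html, 3 .js, 1 .geojson
        const double visitsPerMonth = 1000 * 31; // ~1000 hits per day

        const double atRestPerGBMonth = 0.068;         // $ per GB per month (Block Blobs, LRS)
        const double perTransaction = 0.005 / 100000;  // $ per storage transaction

        double atRestCost = (siteSizeKB / (1024 * 1024)) * atRestPerGBMonth;
        double transactionCost = visitsPerMonth * filesPerVisit * perTransaction;
        double egressGB = visitsPerMonth * siteSizeKB / (1024 * 1024);
        double egressCost = 0; // well under the 5 GB/month free egress allowance

        Console.WriteLine("Data at rest: ${0:F5}", atRestCost);      // effectively $0
        Console.WriteLine("Transactions: ${0:F5}", transactionCost); // ~$0.00775
        Console.WriteLine("Egress:       ${0:F5} (~{1:F2} GB transferred)", egressCost, egressGB);
        Console.WriteLine("Monthly bill: ${0:F5}", atRestCost + transactionCost + egressCost);
    }
}

The only line that registers is the transaction charge, at roughly three quarters of a cent per month – matching the penny-or-less conclusion above.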

This is also a good (though very simple) example of the sort of cost analysis you will need to do when understanding what it takes to create cloud applications or migrate from on-premises to the cloud. The Windows Azure Calculator and information on lower-cost commitment plans may also prove handy.

Alternative Approaches

Of course in this day and age, for a low-cost simple site it is hard to beat Windows Azure Web Sites. There’s an entirely free tier there (details) – allowing you to save yourself nearly a penny every month. That’s pretty good since Benjamin Franklin, one of America’s founding fathers, famously quipped “A penny saved is a penny earned!”

Windows Azure Web Sites also has other features – your site can be in PHP or ASP.NET or Node.js or Python. And you can get continuous deployment from GitHub or Bitbucket or TFS or Dropbox or others. And you get monitoring and other features from the portal. And more.

But at least you know you can host in blob storage if you like.

[This is part of a series of posts on #StupidAzureTricks, explained here.]