Stupid Azure Trick #4 – C#, Node.js, and Python side-by-side – Three Simple Command Line Tools to Copy Files up to Windows Azure Blob Storage

Windows Azure has a cloud file storage service known as Blob Storage.

[Note: Windows Azure Storage is broader than just Blob Storage, but in this post I will ignore its sister services Table Storage (a NoSQL key/value store) and Queues (a reliable queuing service).]

Before we get into the tricks, it is useful to know a bit about Blob Storage.
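Every blob is addressed by storage account, container, and blob name, and the resulting URL follows a fixed pattern. A quick sketch, using the account and container names that appear in the samples below:

```python
account = 'azuremap'            # storage account name (from the samples below)
container = 'maps'              # container name (from the samples below)
blob_name = 'azuremap.geojson'  # one of the files the samples upload

# Public blob URLs follow https://<account>.blob.core.windows.net/<container>/<blob>
url = 'https://%s.blob.core.windows.net/%s/%s' % (account, container, blob_name)
print(url)  # https://azuremap.blob.core.windows.net/maps/azuremap.geojson
```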

The code below is very simple: it uploads a couple of files to Blob Storage. The files being uploaded are JSON, so the code sets the HTTP Content-Type header appropriately and sets up caching. It then lists a directory of the files in that particular Blob Storage container (where a container is like a folder or subdirectory in a regular file system).

The code listed below will work nicely on a Windows Azure Dev-Test VM, or on your own desktop. Of course, you need a Windows Azure Storage Account first, along with its storage credentials. (New to Azure? Click here to access a free trial.) But once you have those, the coding is straightforward.

  • For C#: create a Windows Console application and add the NuGet package named “Windows Azure Storage”
  • For Node.js: run “npm install azure” (or “npm install azure --global”)
  • For Python: run “pip install azure” to get the SDK
  • We don’t cover it here, but you could also use PowerShell or the CLI or the REST API directly.

Note: these are command line tools, so there isn’t a web project with config values for the storage keys. In lieu of that, I read the key from a text file on the file system. Storage credentials should be stored safely, regardless of which computer they are used on. Beware: my demonstration only uses public data, so my storage key, if lost, would be less damaging than most; treat yours with more care.
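Concretely, each sample expects a one-line text file named <storage_account_name>.storagekey. A minimal sketch of creating and reading such a file (placeholder key value, and a path relative to the current directory rather than the absolute paths used in the samples):

```python
storage_account_name = 'azuremap'
key_file = '%s.storagekey' % storage_account_name

# Write a placeholder key (in practice, paste the base64 key from the Azure portal).
with open(key_file, 'w') as f:
    f.write('PASTE-YOUR-BASE64-STORAGE-KEY-HERE')

# Read it back the same way the samples do.
with open(key_file, 'r') as f:
    storage_account_key = f.read().strip()
print(storage_account_key)
```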

Here’s the code. Enjoy!

C# (upload.cs):

```csharp
using System;
using System.Diagnostics;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

internal class Program
{
    private static void Main(string[] args)
    {
        var storageAccountName = "azuremap";
        // storage key in file in parent directory called <storage_account_name>.storagekey
        var storageAccountKey = File.ReadAllText(String.Format("d:/dev/github/{0}.storagekey", storageAccountName));
        var storageContainerName = "maps";

        var creds = new StorageCredentials(storageAccountName, storageAccountKey);
        var storageAccount = new CloudStorageAccount(creds, useHttps: true);
        var blobClient = storageAccount.CreateCloudBlobClient();
        var container = blobClient.GetContainerReference(storageContainerName);

        string[] files = { "azuremap.geojson", "azuremap.topojson" };
        foreach (var file in files)
        {
            CloudBlockBlob blockBlob = container.GetBlockBlobReference(file);
            var filepath = @"D:\dev\github\azuremap\upload\" + file;
            blockBlob.UploadFromFile(filepath, FileMode.Open);

            // Set the HTTP content type and caching for the uploaded blob
            blockBlob.Properties.ContentType = "application/json";
            blockBlob.Properties.CacheControl = "public, max-age=3600";
            blockBlob.SetProperties();
        }

        Console.WriteLine("Directory listing of all blobs in container {0}", storageContainerName);
        foreach (IListBlobItem blob in container.ListBlobs())
        {
            Console.WriteLine(blob.Uri);
        }

        if (Debugger.IsAttached) Console.ReadKey();
    }
}
```
Node.js (upload.js):

```javascript
var azure = require('azure');
var fs = require('fs');

var storageAccountName = 'azuremap'; // storage key in file in parent directory called <storage_account_name>.storagekey
var storageAccountKey = fs.readFileSync('../../%s.storagekey'.replace('%s', storageAccountName), 'utf8');
var storageContainerName = 'maps';

var blobService = azure.createBlobService(storageAccountName, storageAccountKey,
    storageAccountName + '.blob.core.windows.net');

var fileNameList = ['azuremap.geojson', 'azuremap.topojson'];
for (var i = 0; i < fileNameList.length; i++) {
    var fileName = fileNameList[i];
    console.log('=> ' + fileName);
    blobService.createBlockBlobFromFile(storageContainerName, fileName, fileName,
        { contentType: 'application/json', cacheControl: 'public, max-age=3600' }, // max-age units is seconds, so 31556926 is 1 year
        function (error) {
            if (error) {
                console.error(error);
            }
        });
}

blobService.listBlobs(storageContainerName, function (error, blobs) {
    if (error) {
        console.error(error);
    } else {
        console.log('Directory listing of all blobs in container ' + storageContainerName);
        for (var i in blobs) {
            console.log(blobs[i].name);
        }
    }
});
```
Python:

```python
from azure.storage import *

storage_account_name = 'azuremap'  # storage key in file in parent directory called <storage_account_name>.storagekey
storage_account_key = open(r'../../%s.storagekey' % storage_account_name, 'r').read()

blob_service = BlobService(account_name=storage_account_name, account_key=storage_account_key)
storage_container_name = 'maps'
blob_service.set_container_acl(storage_container_name, x_ms_blob_public_access='container')

for file_name in [r'azuremap.geojson', r'azuremap.topojson']:
    myblob = open(file_name, 'r').read()
    blob_name = file_name
    blob_service.put_blob(storage_container_name, blob_name, myblob, x_ms_blob_type='BlockBlob')
    blob_service.set_blob_properties(storage_container_name, blob_name,
                                     x_ms_blob_content_type='application/json',
                                     x_ms_blob_cache_control='public, max-age=3600')

# Show a blob listing which now includes the blobs just uploaded
blobs = blob_service.list_blobs(storage_container_name)
print("Directory listing of all blobs in container '%s'" % storage_container_name)
for blob in blobs:
    # blob URL format is: https://<account>.blob.core.windows.net/<container>/<file>
    print(blob.url)
```

Useful Links

  • SDK Source for Python:
  • SDK Source for Node.js:
  • Storage SDK Source for .NET:
  • Storage Client Library 3:
[This is part of a series of posts on #StupidAzureTricks, explained here.]

