Windows Azure has a cloud file storage service known as Blob Storage.
[Note: Windows Azure Storage is broader than just Blob Storage, but in this post I will ignore its sister services Table Storage (a NoSQL key/value store) and Queues (a reliable queuing service).]
Before we get into the tricks, it is useful to know a bit about Blob Storage.
- At its core it is a file system in the cloud – albeit a highly reliable, super-scalable, and extremely versatile one – and it stacks up very competitively against offerings from the other cloud players.
- Windows Azure Storage supports geo-redundancy (making a copy of your blob data in another data center – depicted as the lines between data center regions in the maps shown in my recent Where’s Azure? post ).
- Microsoft is committed to keeping Blob storage priced on par with Amazon: http://blogs.msdn.com/b/windowsazure/archive/2014/01/24/storage-price-match.aspx
- Interacting with Storage need not be programmatic, as there are some handy tools. Check them out here: http://storagetools.azurewebsites.net/
The code below is very simple – it uploads a couple of files to Blob Storage. Because the files being uploaded are JSON, the code sets the HTTP Content-Type header appropriately and configures caching. It then lists the files in that particular Blob Storage container (where a container is roughly analogous to a folder or subdirectory in a regular file system).
The code listed below will work nicely on a Windows Azure Dev-Test VM, or on your own desktop. Of course you need a Windows Azure Storage Account first, and the storage credentials. (New to Azure? Click here to access a free trial.) But once you do, the coding is straightforward.
- For C#: create a Windows Console application and add the NuGet package named “Windows Azure Storage”
- For Node.js: run “npm install azure” (or “npm install azure --global”)
- For Python: run “pip install azure” to get the SDK
- We don’t cover it here, but you could also use PowerShell or the CLI or the REST API directly.
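To give a taste of the REST route: the SDKs are thin wrappers over plain HTTP, and for a container with public read access you can list blobs with a single unauthenticated GET. Here is a minimal Python sketch (the account and container names match the samples in this post; the XML element names follow the List Blobs REST response – treat it as an illustration, not production code):

```python
import urllib.request
import xml.etree.ElementTree as ET

def list_blobs_url(account, container):
    # The List Blobs operation is a GET on the container URL
    # with restype=container and comp=list query parameters.
    return ("https://{0}.blob.core.windows.net/{1}"
            "?restype=container&comp=list").format(account, container)

def parse_blob_names(xml_text):
    # The response body is an <EnumerationResults> XML document
    # containing one <Blob><Name>...</Name></Blob> entry per blob.
    root = ET.fromstring(xml_text)
    return [name.text for name in root.iter("Name")]

if __name__ == "__main__":
    url = list_blobs_url("azuremap", "maps")
    print(url)
    with urllib.request.urlopen(url) as resp:
        print(parse_blob_names(resp.read().decode("utf-8")))
```

The same URL pasted into a browser returns the raw XML, which is a handy way to sanity-check that a container's ACL really is public.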
Note: these are command-line tools, so there is no web project with config values for the storage keys; in lieu of that I read the key from a text file on the file system. Storage credentials should be stored safely regardless of which computer they are used on. My demo only handles public data, so a leaked key would be less damaging here than in most scenarios – but treat yours with care.
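One way to keep the key out of source control entirely is to read it from an environment variable, falling back to the `<account>.storagekey` file convention used in the samples. A small Python sketch (the AZURE_STORAGE_KEY variable name is my own convention here, not an SDK requirement):

```python
import os

def load_storage_key(account_name):
    # Prefer an environment variable so the key never lands in a repo;
    # fall back to the <account>.storagekey file used in the samples below.
    key = os.environ.get("AZURE_STORAGE_KEY")
    if key:
        return key.strip()
    with open("%s.storagekey" % account_name) as f:
        return f.read().strip()
```

The same pattern translates directly to the C# and Node.js samples via Environment.GetEnvironmentVariable and process.env respectively.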
Here’s the code. Enjoy!
C#

using System;
using System.Diagnostics;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

internal class Program
{
    private static void Main(string[] args)
    {
        var storageAccountName = "azuremap";
        // storage key in file in parent directory called <storage_account_name>.storagekey
        var storageAccountKey = File.ReadAllText(String.Format("d:/dev/github/{0}.storagekey", storageAccountName));
        //Console.WriteLine(storageAccountKey);

        var storageContainerName = "maps";

        var creds = new StorageCredentials(storageAccountName, storageAccountKey);
        var storageAccount = new CloudStorageAccount(creds, useHttps: true);
        var blobClient = storageAccount.CreateCloudBlobClient();
        var container = blobClient.GetContainerReference(storageContainerName);

        string[] files = { "azuremap.geojson", "azuremap.topojson" };
        foreach (var file in files)
        {
            CloudBlockBlob blockBlob = container.GetBlockBlobReference(file);
            // Set content type and caching before the upload so the
            // properties are sent up along with the blob.
            blockBlob.Properties.ContentType = "application/json";
            blockBlob.Properties.CacheControl = "public, max-age=3600"; // max-age is in seconds (3600 = 1 hour)
            var filepath = @"D:\dev\github\azuremap\upload\" + file;
            blockBlob.UploadFromFile(filepath, FileMode.Open);
        }

        Console.WriteLine("Directory listing of all blobs in container {0}", storageContainerName);
        foreach (IListBlobItem blob in container.ListBlobs())
        {
            Console.WriteLine(blob.Uri);
        }

        if (Debugger.IsAttached) Console.ReadKey();
    }
}
Node.js

var azure = require('azure');
var fs = require('fs');

// storage key in file in parent directory called <storage_account_name>.storagekey
var storageAccountName = 'azuremap';
var storageAccountKey = fs.readFileSync('../../%s.storagekey'.replace('%s', storageAccountName), 'utf8');
//console.log(storageAccountKey);

var storageContainerName = 'maps';

var blobService = azure.createBlobService(storageAccountName, storageAccountKey, storageAccountName + '.blob.core.windows.net');

var fileNameList = [ 'azuremap.geojson', 'azuremap.topojson' ];
for (var i = 0; i < fileNameList.length; i++) {
    var fileName = fileNameList[i];
    console.log('=> ' + fileName);
    blobService.createBlockBlobFromFile(storageContainerName, fileName, fileName,
        { contentType: 'application/json', cacheControl: 'public, max-age=3600' }, // max-age is in seconds (3600 = 1 hour)
        function (error) {
            if (error) {
                console.error(error);
            }
        });
}

blobService.listBlobs(storageContainerName,
    function (error, blobs) {
        if (error) {
            console.error(error);
        }
        else {
            console.log('Directory listing of all blobs in container ' + storageContainerName);
            for (var i in blobs) {
                console.log(blobs[i].name);
            }
        }
    });
Python

from azure.storage import *

# storage key in file in parent directory called <storage_account_name>.storagekey
storage_account_name = 'azuremap'
storage_account_key = open(r'../../%s.storagekey' % storage_account_name, 'r').read()
# print(storage_account_key)

blob_service = BlobService(account_name=storage_account_name, account_key=storage_account_key)

storage_container_name = 'maps'
blob_service.create_container(storage_container_name)
blob_service.set_container_acl(storage_container_name, x_ms_blob_public_access='container')

for file_name in [r'azuremap.geojson', r'azuremap.topojson']:
    myblob = open(file_name, 'r').read()
    blob_name = file_name
    blob_service.put_blob(storage_container_name, blob_name, myblob, x_ms_blob_type='BlockBlob')
    blob_service.set_blob_properties(storage_container_name, blob_name,
                                     x_ms_blob_content_type='application/json',
                                     x_ms_blob_cache_control='public, max-age=3600')

# Show a blob listing which now includes the blobs just uploaded
blobs = blob_service.list_blobs(storage_container_name)
print("Directory listing of all blobs in container '%s'" % storage_container_name)
for blob in blobs:
    print(blob.url)
# format for blob URLs is: <account>.blob.core.windows.net/<container>/<file>
# example blob for us: pytool.blob.core.windows.net/pyfiles/clouds.jpeg
Useful Links
Python
http://research.microsoft.com/en-us/projects/azure/an-intro-to-using-python-with-windows-azure.pdf
http://research.microsoft.com/en-us/projects/azure/windows-azure-for-linux-and-mac-users.pdf
http://www.windowsazure.com/en-us/develop/python/
SDK Source for Python: https://github.com/WindowsAzure/azure-sdk-for-python
Node.js
http://www.windowsazure.com/en-us/develop/nodejs/
SDK Source for Node.js: https://github.com/WindowsAzure/azure-sdk-for-node
http://www.windowsazure.com/en-us/documentation/articles/storage-nodejs-how-to-use-blob-storage/
C#/.NET
http://www.windowsazure.com/en-us/develop/net/
Storage SDK Source for .NET: https://github.com/WindowsAzure/azure-storage-net
Storage Client Library 3: http://msdn.microsoft.com/en-us/library/dn495001%28v=azure.10%29.aspx
[This is part of a series of posts on #StupidAzureTricks, explained here.]