
Azure FAQ: IP Addresses and DNS

When deploying an application or service to Windows Azure, a public IP address is assigned, making it easy to host a web server, API, or other services. Here are some of the more frequently asked questions about these IP addresses.

Q. Will my IP Address be Stable?

Short answer: Yes. Longer answer: For Cloud Services and Virtual Machines (but not Azure Web Sites) the IP address – once assigned – is stable, provided you do not remove the deployment. If you delete the deployment, your IP address goes back into the pool. For most production cloud applications it would be very unusual to ever delete the deployment, so this is reasonable. Windows Azure supports in-place updates as well as the VIP Swap approach for Cloud Services, both of which always preserve the IP address. Windows Azure Web Sites also has an IP address-preserving swap feature.

Q. Can I map a “Naked” Domain to my Azure App or Service?

Short answer: Yes. The formal name for a so-called “naked” domain is a zone apex. But regardless of what we call it, it is simply a domain without any subdomain prefix. The address “devpartners.com” is a “naked” or “apex” domain, whereas “www.devpartners.com” is not. And it is not just about counting periods in the domain: “amazon.co.jp” is also an apex domain. A DNS Address Record – or “A Record” for short – is used to configure an apex domain, and an A Record must be mapped to an IP address. As noted in the question immediately above, you can have a stable IP address in Windows Azure, so a stable A Record is possible, and therefore you can definitely map an apex domain to your Windows Azure application or service. You can also use a DNS Canonical Name Record – or “CNAME” for short – to refer to a subdomain in your service. This is easy since, in addition to the stable IP address support mentioned above, Windows Azure provides a DNS name you can assign CNAMEs against. In Cloud Services (which includes Virtual Machines) this is of the form mycloudservice.cloudapp.net. [As opposed to Azure Web Sites, which are of the form mywebsite.azurewebsites.net.]
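
If you want to sanity-check such a setup from code, here is a minimal C# sketch (the domain names are simply the examples from above – substitute your own):

using System;
using System.Net;

// The apex domain's A Record should resolve to your deployment's stable VIP
foreach (var ip in Dns.GetHostAddresses("devpartners.com"))
{
    Console.WriteLine("A Record target: " + ip);
}

// For the subdomain, GetHostEntry may surface the canonical (CNAME) host
// name, e.g. mycloudservice.cloudapp.net
IPHostEntry entry = Dns.GetHostEntry("www.devpartners.com");
Console.WriteLine("Canonical host: " + entry.HostName);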

Q. Is the IP Address Range Known?

Short answer: Yes. Longer answer: Microsoft publishes the IP address ranges used, organized by data center, so you can consult the published list to review the possible ranges. Specifically, the IP Address Ranges are documented here (http://msdn.microsoft.com/en-us/library/windowsazure/dn175718.aspx) and are expressed in Classless Inter-Domain Routing (CIDR) format. Be aware that as capacity increases and new data centers come online, these ranges will evolve (I assume mostly the number of addresses will grow).
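
If you want to test an address against one of the published ranges programmatically, here is a small sketch of IPv4 CIDR matching in C# (the sample range is purely illustrative – use the real values from the published list):

using System;
using System.Net;

// Returns true if 'address' falls within the given IPv4 CIDR range
static bool InCidrRange(IPAddress address, string cidr)
{
    var parts = cidr.Split('/');            // e.g. "65.52.0.0/14" (illustrative)
    byte[] network = IPAddress.Parse(parts[0]).GetAddressBytes();
    byte[] candidate = address.GetAddressBytes();
    int prefixLength = int.Parse(parts[1]);

    for (int bit = 0; bit < prefixLength; bit++)
    {
        int mask = 0x80 >> (bit % 8);
        if ((network[bit / 8] & mask) != (candidate[bit / 8] & mask))
            return false;                   // differs within the network prefix
    }
    return true;
}

// Usage:
// bool inRange = InCidrRange(IPAddress.Parse("65.52.1.2"), "65.52.0.0/14"); // true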


Azure FAQ: How to Use .NET 4.5 with Windows Azure Cloud Services?

Microsoft released version 4.5 of its popular .NET Framework in August 2012. This framework can be installed independently on any compatible machine (check out the .NET Framework Deployment Guide for Administrators) and (for developers) comes along with Visual Studio 2012.

Windows Azure Web Sites also support .NET 4.5, but what is the easiest way to deploy a .NET 4.5 application to Windows Azure as a Cloud Service? This post shows how easy this is.

Assumption

This post assumes you have updated to the most recent Windows Azure Tools for Visual Studio and the latest SDK for .NET.

For any update to a new operating system or new SDK, consult the Windows Azure Guest OS Releases and SDK Compatibility Matrix to understand which versions of operating systems and Azure SDKs are intended to work together.

You can do this with the Web Platform Installer by installing Windows Azure SDK for .NET (VS 2012) – Latest (the best option) – or directly here (a second option, since a direct link will eventually become out-of-date).

Also pay close attention to the release notes, and don’t forget to Right-Click on your Cloud Service, hit Properties, and take advantage of some of the tooling support for the upgrade.

Creating New ASP.NET Web Role for .NET 4.5

Assuming you have up-to-date bits, start with File | New in Visual Studio 2012.

Select a Cloud project template, choose Windows Azure Cloud Service (currently the only choice), and be sure to specify .NET Framework 4.5. Then proceed as normal.

Updating Existing ASP.NET Web Role for .NET 4.5

If you wish to update an existing Web Role (or Worker Role), you need to make a couple of changes in your project.

First, update the Windows Azure operating system version to use Windows Server 2012. This is done by opening ServiceConfiguration.Cloud.cscfg in your Cloud project.

Change the osFamily setting to be “3” to indicate Windows Server 2012.

   osFamily="3"

As of this writing, the other allowed values for osFamily are “1” and “2”, indicating Windows Server 2008 SP2 and Windows Server 2008 R2 (or R2 SP1) respectively. The up-to-date settings are here.

Now you are set for your operating system to include .NET 4.5, but none of your Visual Studio projects have yet been updated to take advantage of this. For each project that you intend to update to use .NET 4.5, you need to update the project settings accordingly.

First, select the project in the Solution Explorer, right-click on it, and choose Properties from the pop-up menu. That will display the project properties page. Now simply select .NET Framework 4.5 from the available list of Target framework options.

If you open an older solution with the newer Azure tools for Visual Studio, you might see a dialog indicating that the newer tools are needed for .NET 4.5 support. If that happens, just follow the instructions.

That’s it!

Now when you deploy your Cloud Service to Windows Azure, your code can take advantage of .NET 4.5 features.

Troubleshooting

Be sure you get all the dependencies correct across projects. In one project I migrated, the following warning came up because I had a mix of projects: some needed to stay on .NET 4.0, while the aspects deployed to the Windows Azure cloud could move to 4.5. If you don’t get this quite right, you may get a compiler warning like the following:

Warning  The referenced project ‘CapsConfig’ is targeting a higher framework version (4.5) than this project’s current target framework version (4.0). This may lead to build failures if types from assemblies outside this project’s target framework are used by any project in the dependency chain.    SomeOtherProjectThatReferencesThisProject

The warning text is self-explanatory: the solution is to not migrate that particular project from .NET 4.0 to .NET 4.5. In my case, I was trying to take advantage of the new WIF features, and the project in question did not have anything to do with Identity, so there was no problem.

Azure FAQ: How frequently is the clock on my Windows Azure VM synchronized?

Q. How often do Windows Azure VMs synchronize their internal clocks to ensure they are keeping accurate time?

A. This basic question comes up occasionally, usually when there is concern around correlating timestamps across instances, such as for log files or business events. Over time, like mechanical clocks, computer clocks can drift, with virtual machines (especially when sharing cores) affected even more. (This is not specific to Microsoft technologies; for example, it is apparently an annoying issue on Linux VMs.)

I can’t find any official stats on how much drift happens generally (though some data is out there), but the question at hand is what to do to minimize it. Specifically, on Windows Azure Virtual Machines (VMs) – including Web Role, Worker Role, and VM Role – how is this handled?

According to this Word document – which specifies the “MICROSOFT ONLINE SERVICES USE RIGHTS SUPPLEMENTAL LICENSE TERMS, MICROSOFT WINDOWS SERVER 2008 R2 (FOR USE WITH WINDOWS AZURE)” – the answer is once a week. (Note: the title above includes “Windows Server 2008 R2” – I don’t know for sure if the exact same policies apply to the older Windows Server 2008 SP2, but would guess that they do.)

Here is the full quote, in the context of which services you can expect will be running on your VM in Windows Azure:

Windows Time Service. This service synchronizes with time.windows.com once a week to provide your computer with the correct time. You can turn this feature off or choose your preferred time source within the Date and Time Control Panel applet. The connection uses standard NTP protocol.

So Windows Azure roles use the time service at time.windows.com to keep their local clocks up to snuff.  This service uses the venerable Network Time Protocol (NTP), described most recently in RFC 5905.

UDP Challenges

The documentation around NTP indicates it is based on the User Datagram Protocol (UDP). While Windows Azure roles do not currently let you build network services that require UDP endpoints (though you can vote up the feature request here!), the opposite is not true: Windows Azure roles are able to act as UDP clients of non-Azure services, at least within the Azure data center. This is how some of the key internet plumbing based on UDP still works, such as the ability to do Domain Name System (DNS) lookups and – of course – time synchronization via NTP.

This may lead to some confusion, since UDP support is currently limited while NTP is already provided.

The document cited above mentions you can “choose your preferred time source” if you don’t want to use time.windows.com. There are other sources from which you can update the time of a computer using NTP, such as free options from the National Institute of Standards and Technology (NIST).

Here are the current NTP Server offerings as seen in the Control Panel on a running Windows Azure Role VM (logged in using Remote Desktop Connection). The list includes time.windows.com and four options from NIST – the same four NIST servers that appear in the status table below.

Interestingly, when I manually tried changing the time on my Azure role using a Remote Desktop session, any time changes I made were immediately corrected. I am not sure whether it was doing an automatic NTP correction after each time change, but my guess is something else was going on, since the advertised next NTP sync time did not change in response.

When choosing a different NTP Server, an on-demand sync did not always succeed (sometimes timing out), but I also did see it succeed, for example against time.nist.gov.

The interesting part of seeing any successful sync with time.nist.gov is that it implies UDP traffic leaving and re-entering the Windows Azure data center. This, in general, is just not allowed – all UDP traffic leaving or entering the data center is blocked (unless you use a VM Role with Windows Azure Connect). To prove this for yourself another way, configure your Azure role VM to use a DNS server which is outside of the Azure data center; all subsequent DNS resolution will fail.

If “weekly” is Not Enough

If the weekly synchronization frequency is somehow inadequate, you could write a Startup Task to adjust the frequency to, say, daily. This can be done via the Windows Registry (full details here, including all the registry settings and some tools, plus there is a very focused summary here giving you just the one registry entry to tweak for most cases).
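
As a rough sketch of what such a Startup Task could run – assuming the standard W32Time NtpClient provider and its SpecialPollInterval registry value (expressed in seconds), which is the entry commonly cited for this tweak – a small elevated console program might look like this:

using Microsoft.Win32;

// SpecialPollInterval is expressed in seconds; 86400 = once a day.
// Writing HKLM requires elevation - hence an elevated Startup Task.
Registry.SetValue(
    @"HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\W32Time\TimeProviders\NtpClient",
    "SpecialPollInterval",
    86400,
    RegistryValueKind.DWord);
// The Windows Time service typically picks up the new value after a service
// restart (net stop w32time / net start w32time).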

How frequent is too frequent? Not sure about time.windows.com, but time.nist.gov warns:

All users should ensure that their software NEVER queries a server more frequently than once every 4 seconds. Systems that exceed this rate will be refused service. In extreme cases, systems that exceed this limit may be considered as attempting a denial-of-service attack.

Of further interest, check out the NIST Time Server Status descriptions:

Name              IP Address      Location                        Status
time-a.nist.gov   129.6.15.28     NIST, Gaithersburg, Maryland    ntp ok; time, daytime busy; not recommended
time-b.nist.gov   129.6.15.29     NIST, Gaithersburg, Maryland    ntp ok; time, daytime busy; not recommended
time-nw.nist.gov  131.107.13.100  Microsoft, Redmond, Washington  ntp, time ok; daytime busy; not recommended
time.nist.gov     192.43.244.18   NCAR, Boulder, Colorado         All services busy; not recommended

They recommend against using any of the servers, at least at the moment I grabbed these Status values from their web site.  I find this amusing since – other than the default time.windows.com – these are the only four servers offered as alternatives in the User Interface of the Control Panel applet. As I mentioned above, sometimes these servers timed out on an on-demand NTP sync request I issued through the applet user interface; this may explain why.

It may be possible to use a commercial NTP service, but I don’t know if the Windows Server 2008 R2 configuration supports it (at least I did not see it in the user interface), and if there was a way to specify it (such as in the registry), I am not sure that the Windows Azure data center will allow the UDP traffic to that third-party host. (They may – I just don’t know. They do appear to allow UDP requests/responses to NIST servers. Not sure if this is a firewall/proxy rule, and if so, is it for NTP, or just NTP to NIST?)

And – for the (good kind of) hacker in you – if you want to play around with accessing an NTP service from code, check out this open source C# code.
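
If you just want a feel for the protocol, here is a minimal C# sketch of an NTP query (the server name is one of those discussed above; mind the once-per-4-seconds warning). It sends a 48-byte client request over UDP port 123 and decodes the transmit timestamp from the reply:

using System;
using System.Net.Sockets;

var packet = new byte[48];
packet[0] = 0x1B; // LI = 0, Version = 3, Mode = 3 (client request)

using (var socket = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp))
{
    socket.ReceiveTimeout = 3000; // milliseconds
    socket.Connect("time.windows.com", 123); // NTP runs over UDP port 123
    socket.Send(packet);
    socket.Receive(packet); // response reuses the same 48-byte layout
}

// Transmit timestamp: seconds since 1900-01-01 UTC, big-endian, at offset 40
uint seconds = (uint)((packet[40] << 24) | (packet[41] << 16) | (packet[42] << 8) | packet[43]);
DateTime serverTimeUtc = new DateTime(1900, 1, 1, 0, 0, 0, DateTimeKind.Utc).AddSeconds(seconds);
Console.WriteLine("NTP server time (UTC): " + serverTimeUtc);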

Is this useful? Did I leave out something interesting or get something wrong? Please let me know in the comments! Think other people might be interested? Spread the word!

Azure FAQ: Can I create a Startup Task that executes only when really in the Cloud?

Q. Can I create a Startup Task that executes only when really in the Cloud? I mean really in the cloud. In other words, can I get my Startup Task to NOT RUN when I debug/deploy my Windows Azure application on my development machine?

A. The short answer is that while there is no built-in support for this, you can get the same effect by using a simple trick to add logic to your Startup Script to provide sufficient control. Before getting into that, let’s describe the problem in a bit more detail. Update 14-Oct-2011: Stop the presses!! This capability is now built into Windows Azure! Steve Marx has a blog post on the matter. I will leave this blog post around since the details in it may be of value for other scenarios.

Suppose you want to use ASP.NET MVC 3 in your Windows Azure Web Role. At the time of this writing, MVC 2 was installed in Azure, but not MVC 3. What to do? The short answer is that you can install MVC 3 along with your application at deployment time in the cloud. This type of prerequisite installation is most conveniently handled using a Startup Task: I include the ASP.NET MVC 3 bits with my app and define a Simple Startup Task that installs them, so these bits are already installed before my Web Role tries to run. This is a pretty clean solution. (For more on Startup Tasks and how to configure them see How to Define Startup Tasks for a Role. For specific guidance on installing ASP.NET MVC 3 as a Startup Task, see Technique #2 in the ASP.NET MVC 3 in Windows Azure post on Steve Marx’s blog.)

Example Startup Task That ALWAYS Runs

Of course, installing ASP.NET MVC 3 is only one example. Here is another example – a Startup Task that enables support for ADSI with IIS – let’s call it enable-webmetabase.cmd. First, you would add the following entry to ServiceDefinition.csdef:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="NameOfMyAzureApp" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  ...
  <Startup>
    <Task commandLine="enable-webmetabase.cmd" executionContext="elevated" taskType="simple" />
  </Startup>
  ...
</ServiceDefinition>

The contents of enable-webmetabase.cmd would be something like the following (first enabling PowerShell scripting, then executing a specific script):

powershell -command "Set-ExecutionPolicy Unrestricted"
powershell .\enable-webmetabase.ps1

Though the specifics are not important for these instructions, since this script invokes a PowerShell script – let’s call it enable-webmetabase.ps1 – here is what that might look like:

Import-Module ServerManager
Add-WindowsFeature Web-Metabase

And as a final step, you would include both enable-webmetabase.cmd and enable-webmetabase.ps1 with your Visual Studio Project, and set the Copy to Output Directory property on each of these two files to be Copy always. Now, every time you deploy this Azure solution this Startup Task will be executed – and you can feel confident that you won’t have to worry about ADSI in IIS not being available (or whatever it is your Startup Tasks do for you).

Startup Tasks Run in Development Too

But what happens when I wish to deploy this solution on my development machine so I can quickly test it out while I am in the midst of development? Since the Windows Azure Platform has an outstanding local cloud simulation environment (which can be downloaded for free), “local” is the most common deployment target! It is not ideal that the Startup Tasks will run locally – I do not want to continually install ASP.NET MVC (or re-enable web metabase support, etc.) since that will just slow me down.

The Simple Workaround

I know of no built-in support that makes it easy for a Startup Task to “know” whether it is running in the cloud or in your local development environment. But it is simple to roll your own. Here’s what I do:

  • Create an Environment Variable called AZURE_CLOUD_SIMULATION_ENVIRONMENT. While the exact value of this variable does not matter, for the sake of someone else who may see it and be puzzled, I set mine to be “set manually per http://bit.ly/rs5SRN” where the bit.ly link points back to this blog post. 🙂 It also doesn’t matter if the Environment Variable is user-specific or System-wide. If it is a shared development machine, I would make it System-wide (for all users).
  • It is common practice when defining Startup Tasks to create a command script using a .cmd file and have that be the Startup Task. Within the Startup Task .cmd file, use the defined keyword (supported in the command shells of recent versions of Windows, such as those you will be using for Azure development and deployment) to add a little logic so that you run only those commands you wish to execute in the current environment.

To set up the AZURE_CLOUD_SIMULATION_ENVIRONMENT environment variable:

  1. Run SystemPropertiesAdvanced.exe to bring up the System Properties dialog box.
  2. Click the Environment Variables button to bring up the Environment Variables dialog box.
  3. Click the New… button at the bottom to bring up the New System Variable dialog box.
  4. Type AZURE_CLOUD_SIMULATION_ENVIRONMENT into the Variable name field, and “set manually per http://bit.ly/rs5SRN” into the Variable value field.
  5. Hit a few OK buttons and you’ll be done.

Revised “Smart” Startup Task

Of course the trick is that the AZURE_CLOUD_SIMULATION_ENVIRONMENT variable will only be set on development machines, so it will NOT be set in the real cloud, getting you the desired results. Here is the same enable-webmetabase.cmd Startup Task script from above, except rewritten so that when you run it locally it will not do anything to your development machine.

if defined AZURE_CLOUD_SIMULATION_ENVIRONMENT goto SKIP

powershell -command "Set-ExecutionPolicy Unrestricted"
powershell .\enable-webmetabase.ps1

:SKIP

The line “if defined AZURE_CLOUD_SIMULATION_ENVIRONMENT goto SKIP” simply checks whether AZURE_CLOUD_SIMULATION_ENVIRONMENT exists in the environment, and if it does exist, the script jumps over the two powershell lines. This is pretty handy!

Again, in summary, if you follow the very simple approach in this post, the AZURE_CLOUD_SIMULATION_ENVIRONMENT variable will exist only on development machines – in the simulated cloud – and not out in the “real” cloud.

Not to be Confused with RoleEnvironment.IsAvailable

There is another technique – built into Azure – which you can use in code that needs to behave one way when running under Windows Azure, and another way when not: RoleEnvironment.IsAvailable. This is good for code that might be deployed both in, say, an Azure Web Role and in a non-Azure ASP.NET web site. For Azure applications, RoleEnvironment.IsAvailable will be true both on the local development machine (in the simulated cloud) and when deployed into the public cloud.

While RoleEnvironment.IsAvailable and AZURE_CLOUD_SIMULATION_ENVIRONMENT serve different purposes, they are complementary and can be used together.

For more information on RoleEnvironment.IsAvailable, there is documentation and a good description of its use.
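
To make the distinction concrete, here is a minimal C# sketch combining the two checks (the environment variable name is the one defined above):

using System;
using Microsoft.WindowsAzure.ServiceRuntime;

bool underAzure = RoleEnvironment.IsAvailable; // true in the simulation AND the real cloud
bool simulation = Environment.GetEnvironmentVariable("AZURE_CLOUD_SIMULATION_ENVIRONMENT") != null;

if (underAzure && simulation)
    Console.WriteLine("Running in the local cloud simulation environment");
else if (underAzure)
    Console.WriteLine("Running in the real Windows Azure cloud");
else
    Console.WriteLine("Not running under Windows Azure at all");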

Other Uses for the Technique

Maybe you want to do certain things ONLY in your development environment. For example, perhaps you wish to launch Fiddler. Or maybe uninstall a Windows Service (via InstallUtil /u <service exe name>). Whatever your needs – you can use the same simple technique to make this easy. The following syntax is also supported – each bullet being a single line (though some of them may appear on more than one line in this blog post):

  • if defined AZURE_CLOUD_SIMULATION_ENVIRONMENT (echo AZURE_CLOUD_SIMULATION_ENVIRONMENT equals %AZURE_CLOUD_SIMULATION_ENVIRONMENT%) else (echo AZURE_CLOUD_SIMULATION_ENVIRONMENT is NOT defined)
  • if defined AZURE_CLOUD_SIMULATION_ENVIRONMENT echo DOING SOMETHING
  • if NOT defined AZURE_CLOUD_SIMULATION_ENVIRONMENT echo DOING SOMETHING ELSE

Is this useful? Did I leave out something interesting or get something wrong? Please let me know in the comments! Think other people might be interested? Spread the word!

Azure FAQ: Can I write to the file system on Windows Azure?

[Update 12-Oct-2012] This post only applies to Windows Azure Cloud Services (which have Web Roles and Worker Roles). This post was written a year before Windows Azure Web Sites and Windows Azure Virtual Machines (including Windows and Linux flavors) were announced and does not apply to either of them.

Q. Can I write to the file system from an application running on Windows Azure?

A. The short answer is that, yes, you can. The longer answer involves better approaches to persisting data in Windows Azure, plus a couple of caveats in writing data to (virtual) hard disks attached to the (virtual) machines on which your application is deployed.

Any of your code running in either (a) ASP.NET (e.g., default.aspx or default.aspx.cs) or (b) WebRole.cs/WorkerRole.cs (e.g., the OnStart, Run, and OnStop methods overridden from the RoleEntryPoint class) will not have permission to write to the file system. This. is. a. good. thing.®

To be clear, if you have code that currently writes to fixed locations on the file system, you will probably need to change it. For example, your ASP.NET or Role code cannot directly create/write the file c:\foo.txt – the permissions are against you, so Windows will not allow it. (To round out the picture though… You can write to the file system directly if you are running in an elevated Startup Task, but cannot write to it from a limited Startup Task. For more on Startup Tasks and how to configure them see How to Define Startup Tasks for a Role.)

The best option is usually to use one of the cloud-native solutions: use one of the Windows Azure Storage Services or use SQL Azure. These services are all built into Windows Azure for the purpose of supporting scalable, reliable, highly available storage. In practice, this means choosing among Windows Azure Blob storage, Windows Azure Table storage, or SQL Azure.

The second-best option is usually to use a Windows Azure Cloud Drive – an abstraction that sits on top of Blob storage (Page blobs, specifically) – which looks and acts a lot like an old-fashioned hard disk. You can access it with a drive letter (though you won’t know the drive letter until deployment time!), and it can be mounted by and read from multiple of your role instances, though only one of them at a time can mount it for updating. The Windows Azure Drive feature is really there for backward compatibility – to make it easier to migrate existing applications into the cloud without having to change them. Learn more from Neil Mackenzie’s detailed post on Azure Drives.
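
For a feel for the API, here is a rough sketch using the v1.x SDK’s CloudDrive type – the setting name, blob path, and sizes are all illustrative, and method details are from memory, so consult Neil’s post and the SDK documentation for the authoritative version:

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

// Create (once) and mount a drive backed by a Page Blob
var account = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
CloudDrive drive = account.CreateCloudDrive("drives/mydata.vhd");
drive.Create(64); // size in MB; throws if the drive already exists

string drivePath = drive.Mount(0, DriveMountOptions.None); // e.g. "a:\"
System.IO.File.WriteAllText(
    System.IO.Path.Combine(drivePath, "hello.txt"), "hello from a Cloud Drive");
drive.Unmount();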

The third-best option is usually to use the local hard disk. (And this is what the original FAQ question specifically asked about.) Read on…

Writing Data to Local Drive from Windows Azure Role

So… can I write to the hard disk? Yes. And you have a decent amount of disk at your disposal, depending on role size. Using Azure APIs to write to disk on your role is known as writing to Local Storage. You will need to configure some space in Local Storage in your ServiceDefinition.csdef by giving that space (a) a name, (b) a size, and (c) an indication of whether the data there should survive basic role recycles (via cleanOnRoleRecycle). Note that cleanOnRoleRecycle does not guarantee your data will survive – it is just a hint to the Fabric Controller about whether, if the data is still available, it should be left in place or cleaned up. That limitation is fine for data that is easily recalculated or generated when the role starts up – so there are some good use cases for this data, even for cloud-native applications – think of it as a handy place for a local cache. (Up above I refer to this as usually being the third-best option. But in some use cases it might be the best option! One good example would be simply exploding a ZIP file pulled from Blob storage, but there are others too. But let’s get back to Local Storage…)

Here is the snippet from ServiceDefinition.csdef:

...
<LocalResources>
  <LocalStorage name="SomeLocationForCache"
                cleanOnRoleRecycle="false"
                sizeInMB="10" />
</LocalResources>
...

You can also use the Windows Azure Tools for Visual Studio user interface to edit these values; double-click on the role you wish to configure from the Roles list in your Windows Azure solution. This is the easiest approach.

Once specified, the named Local Storage area can be written to and read from using code similar to the following:

// Reference Microsoft.WindowsAzure.ServiceRuntime.dll from the SDK
// (probably in C:\Program Files\Windows Azure SDK\v1.4\ref)
const string azureLocalResourceNameFromServiceDefinition = "SomeLocationForCache";
var azureLocalResource = RoleEnvironment.GetLocalResource(azureLocalResourceNameFromServiceDefinition);
var filepath = azureLocalResource.RootPath + "myCacheFile.xml"; // build full path to file
// the rest of the code is plain old reading and writing of files
// using the 'filepath' variable immediately above

Learn more from Neil Mackenzie’s blog post on Local Storage.

Writing to TEMP Folder from Windows Azure Role

How about writing temporary files? Is that supported? Yes, same as in Windows. For example, in .NET one can get a temporary scratch space and write to it using code similar to the following:

var filepath = System.IO.Path.GetTempFileName();
System.IO.File.WriteAllText(filepath, "some text");

Do Not Use Environment.SpecialFolder Locations in Azure

You may also have some existing code which writes files for the currently logged-in user. Check the Environment.SpecialFolder Enumeration for the full list, but one example is Environment.SpecialFolder.ApplicationData. You would access this location with code such as the following:

string filepath = Environment.GetFolderPath(
    Environment.SpecialFolder.ApplicationData,
    Environment.SpecialFolderOption.DoNotVerify);

You will find that your ASP.NET code will be able to write to this location, but that is almost certainly not what you want! By default, the user account under which you will be saving this data is one that is generated when your role is deployed – something like RD00155D328831$ – not some IPrincipal from your Windows domain.

Further, for data you care about, you don’t want to store it in the local file system in Windows Azure. Better options should be apparent from the earlier points made in this article.

And, finally, you may prefer the elegance of claims-based federated authentication using the Access Control Service.

Writing to File System from Windows Service in Windows Azure Role

If you want to do something unusual, like write to the file system from outside of your Role’s code, there are ways to write to the file system from a Windows Service or a Startup Task (though be sure to run your Startup Task with elevated permissions).

Is this useful? Did I leave out something interesting or get something wrong? Please let me know in the comments! Think other people might be interested? Spread the word!

Azure FAQ: How much will it cost me to run my application on Windows Azure?

Q. How much will it cost me to run my application in the Windows Azure cloud platform?

A. The answer, of course, depends on what you are doing. Official pricing information is available on the Windows Azure Pricing site, and to help you model pricing for your application you can check out the latest Windows Azure Pricing Calculator. Also, the Microsoft Assessment and Planning (MAP) Toolkit is now in beta.

Simple cost example: running one instance of a Small Compute Role costs 12¢ per hour, which is around $1,052 per year (12¢ × 24 hours × 365 days). A SQL Azure instance that holds up to 1 GB costs $9.99 per month. If you have two Small Compute instances and 1 GB of SQL Azure storage, plus some bandwidth use and a dash of Content Delivery Network (CDN) use, your baseline cost might start at around $2,225 per year.
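
For the curious, the back-of-the-envelope arithmetic behind those numbers (at the prices quoted above) looks like this:

const double smallInstancePerHour = 0.12;                 // $ per hour
double computePerYear = smallInstancePerHour * 24 * 365;  // ~$1,051 per instance per year
double sqlAzurePerYear = 9.99 * 12;                       // ~$120 for a 1 GB database
double baseline = 2 * computePerYear + sqlAzurePerYear;   // ~$2,222 before bandwidth and CDN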

Update 22-June-2011: The pricing calculators may not reflect this interesting development: data transfer into the Azure Data Centers becomes free on July 1, 2011. See: https://blog.codingoutloud.com/2011/06/22/free-data-transfer-into-azure-datacenters-is-a-big-deal/ and http://blogs.msdn.com/b/windowsazure/archive/2011/06/22/announcing-free-ingress-for-all-windows-azure-customers-starting-july-1st-2011.aspx

But it is not always that simple: this is just the simplest, pay-as-you-go model! In the short term, there are many deals, offers, and trials – some free. There are Azure benefits included with MSDN. And long term there are ways to get better rates if you have an Enterprise Agreement with Microsoft, or by selecting a more predictable baseline than pay-as-you-go. See the Windows Azure Pricing site for current options.

Further, when comparing costs with other options, consider a few factors:

  • The SQL Azure storage is really a SQL Azure cluster of three instances giving you storage redundancy (3 copies of every byte), high availability (with automagic failover), high performance, and other advanced capabilities.
  • Similarly, every byte written to Windows Azure Storage (blobs, tables, and queues) is stored as three copies.
  • Running two Small Compute instances of a role comes with a 99.9% uptime Service Level Agreement (SLA), and a 99.95% connectivity SLA. Read more about the Compute, SQL Azure, and other Windows Azure Platform SLAs here.
  • Since Windows Azure is Platform as a Service (PaaS), be careful to also consider that you may have fewer hassles and lower engineering and operational costs – these are lower staff-time costs – if you are comparing to an Infrastructure as a Service (IaaS) offering.

While you are at it, consider checking out some of these related third-party offerings:

  • CloudValue – A whole company dedicated to understanding and optimizing costs in moving to the cloud. I saw them at TechEd Atlanta in May 2011. They (a) presented a generally useful talk on Cost-Oriented Development (not specific to their technology, though we saw a glimpse of their Visual Studio integrated cost analyzer); and they (b) had a booth so people could check out their CloudValue Online Service Tracking (COST) service which provides ongoing analysis of your costs in the Windows Azure cloud. I am trying out the COST product now that my beta request has been approved!
  • CloudTally – A service offering from Red Gate Software – currently in beta, and currently free – will keep an eye on your SQL Azure database instance and based on how much data you have in it over time, it will report your daily storage costs via email. I’ve been using this for a few months. The data isn’t very sophisticated – of the “you spent $3.21 yesterday” variety – but I think they are considering some enhancements (I even sent them some suggestions).
  • Windows Azure Migration Scanner – An open source tool created by Neudesic to help you identify changes your application might require in order to make it ready for Azure. This is not specifically a cost-analysis tool, but is useful from a cost-analysis point of view since it can help you predict operational costs of the Azure-ready version of your application – for example if you will make changes to leverage the reliable queue service in Windows Azure Storage, you will know enough to model this. Read David Pallmann’s introduction to the scanner, where he also mentions some other tools.
  • Greybox – While not a core tool for calculating costs, it is an interesting open source utility to help you avoid the “I-deployed-to-Azure-for-testing-purposes-but-forgot-all-about-it” memory lapse. (If deployed, you pay – whether you are using it or not. Like an apartment – you pay for it even while you are at work – though Azure has awesome capabilities for you to “move out of your cloud apartment” during times when you don’t need it!) You may not need it, but its existence illustrates an important lesson!

Credit: I discovered the new Windows Azure Pricing Calculator from http://twitter.com/#!/dhamdhere/status/73056679599677440.

Is this useful? Did I leave out something interesting or get something wrong? Please let me know in the comments! Think other people might be interested? Spread the word!

Azure FAQ: How do I run MapReduce on Windows Azure?

Q. Some cloud environments support running MapReduce. Can I do this in Windows Azure?

A. You can run MapReduce in Windows Azure. First we give some pointers, then we get into some other options that might be even more useful or powerful, depending on what you are doing.

Summary of most obvious Azure-oriented choices: (1) Apache Hadoop on Azure, (2) LINQ to HPC leveraging Azure, or (3) Daytona Map/Reduce on Azure.

The first approach is to use the open source Apache Hadoop project which implements MapReduce. Details on how to run Hadoop on Azure are available on the Distributed Development Blog. Update 14-Oct-2011: Check out this write-up by Ted Kummert about his keynote at PASS where he discussed deeper Hadoop support for Windows Azure: “Microsoft makes this possible through SQL Server 2012 and through new investments to help customers manage ‘big data’, including an Apache Hadoop-based distribution for Windows Server and Windows Azure and a strategic partnership with Hortonworks. Our announcements today highlight how we enable our customers to take advantage of the cloud to better manage the ‘currency’ of their data.” Also, Avkash Chauhan provides a nice summary of the announcement.

The MapReduce tutorial on the Apache Hadoop project site explains the goal of the project, followed by detailed steps on how to use the software.

“Hadoop MapReduce is a software framework for easily writing applications which process vast amounts of data (multi-terabyte data-sets) in-parallel on large clusters (thousands of nodes) of commodity hardware in a reliable, fault-tolerant manner.” – from Overview section of Hadoop MapReduce tutorial

Another entrant in this Big Data Analytics space is LINQ to HPC. For more details on LINQ to HPC, check out David Chappell’s whitepaper called Introducing LINQ to HPC: Processing Big Data on Windows. Chappell explains the value proposition, and also talks about when you might use it versus SQL Server Parallel Data Warehouse. LINQ to HPC beta 2 is available for download.

[Update 19-July-2011: Daytona enters the fray] “Microsoft has developed an iterative MapReduce runtime for Windows Azure, code-named Daytona.” It is available for download as of early July, though has a non-commercial-use-only license attached to it. (credit: saw it on the insideHPC blog)

[Update 19-July-2011: It is now clear that LINQ to HPC (available in beta 2!) is supplanting DryadLINQ.] You may also be interested in checking out DryadLINQ from Microsoft Research. Though not identical to MapReduce, they describe it as “a simple, powerful, and elegant programming environment for writing large-scale data parallel applications running on large PC clusters.” As of this writing it was not licensed for commercial use, but was available under an academic use license. (With the introduction of LINQ to HPC, I can’t tell whether these projects are related, or whether LINQ to HPC is the productized version of DryadLINQ.)

And, finally, I also just read an interesting post called Hadoop is the Answer! What is the Question? by Tim Negris. It raises some good points about the maturity of Hadoop and related topics – if you are thinking about MapReduce, Hadoop, DryadLINQ, or other approaches, give his article a read.

[05-June-2011 updates] Added info from David Chappell and Tim Negris.

Is this useful? Did I leave out something interesting or get something wrong? Please let me know in the comments! Think other people might be interested? Spread the word!