Author Archives: Bill Wilder

Unknown's avatar

About Bill Wilder

My professional developer persona: coder, blogger, community speaker, founder/leader of Boston Azure cloud user group, and Azure MVP

An HTTP header that’s mandatory for this request is not specified: One Cause for Azure Error Message

I recently posted sample code that shows copying a file up to Azure Blob Storage in One Page Of Code. In repurposing the code that deals with Azure Queues, I encountered a perplexing error message in using the Azure CloudQueue class from the SDK. I was able to figure it out, and the actual solution may be less interesting than how it was discovered, so here it is…

The story of “an HTTP header that’s mandatory for this request is not specified”

First of all, my call to get a queue reference had completed without incident:

queue = queueStorage.GetQueueReference("myqueue");

Next I executed this line of seemingly innocuous code:

queue.CreateIfNotExist();

An Exception was raised – a “Microsoft.WindowsAzure.StorageClient.StorageClientException” to be exact – with the following message:

Exception Message: "An HTTP header that's mandatory for this request is not specified."

That didn’t help, so I then checked the Inner Exception:

Inner Exception Message: "The remote server returned an error: (400) Bad Request."

That didn’t help either. So I fired up Fiddler and looked at the http Request and Response (Raw views shown here):

Screen shot mentioning “Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0” and “<HeaderName>x-ms-blob-type</HeaderName>”

If you look carefully in the Response, you will see there are two references to Blobs:

Circled “Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0” and “<HeaderName>x-ms-blob-type</HeaderName>”

Blobs? Yes, blobs.

Blobs… That was my problem. This was supposed to be code to create a queue. A quick check back at my code immediately revealed a cut-and-paste error on my part. Two, actually, as I had tried this both against Development Storage and against live Cloud Storage with the same error.

This was the problem – the culprit – the issue – the bug:

    var clientStorageAccount = CloudStorageAccount.DevelopmentStorageAccount;
    CloudQueueClient queueStorage = new CloudQueueClient(clientStorageAccount.BlobEndpoint.AbsoluteUri, clientStorageAccount.Credentials);

As was this:

    CloudQueueClient queueStorage = new CloudQueueClient(String.Format("http://{0}.blob.core.windows.net", accountName), creds);

Replacing “Blob” with “Queue” did the trick for both snippets.
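Spelled out, the corrected snippets look something like the following (accountName and creds hold the same values as in the buggy versions above):

```csharp
// Development Storage: ask for the Queue endpoint, not the Blob endpoint
var clientStorageAccount = CloudStorageAccount.DevelopmentStorageAccount;
CloudQueueClient queueStorage = new CloudQueueClient(
    clientStorageAccount.QueueEndpoint.AbsoluteUri,
    clientStorageAccount.Credentials);

// Live Cloud Storage: point at the queue host, not the blob host
CloudQueueClient cloudQueueStorage = new CloudQueueClient(
    String.Format("http://{0}.queue.core.windows.net", accountName), creds);
```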

Pay the Fiddler

The error message was tricky, requiring that I fire up Fiddler to see the error of my ways. So… be careful out there when you cut and paste. Or don't hack at 9:30 at night. Or check out a Fiddler http trace, which may have additional information. Or all three…

Checking the Fiddler trace is really the interesting lesson from this post. If you are perplexed over some error condition, look at the raw http traffic for additional details – there may be a new clue in there.

Did This Post Help You?

Please leave me a comment if this blog post helped you or if you encountered the same exact error.

Why Don’t Windows Azure Libraries Show Up In Add Reference Dialog when Using .NET Framework Client Profile?

You are writing an application for Windows – perhaps a Console App or a WPF Application – or maybe an old-school Windows Forms app. Everything is humming along. Then you want to interact with Windows Azure storage. Easy, right? So you Right-Click on the References list in Visual Studio, pop up the trusty old Add Reference dialog box, and search for Microsoft.WindowsAzure.StorageClient in the list of assemblies.

But it isn’t there!

You already know you can’t use the .NET Managed Libraries for Windows Azure in a Silverlight app, but you just know it is okay in a desktop application.

You double-check that you have installed Windows Azure Tools for Microsoft Visual Studio 1.2 (June 2010) (or at least Windows Azure SDK 1.2 (last refreshed from June in Sept 2010 with a couple of bug-fixes)).

You sort the list by Component Name, then leveraging your absolute mastery of the alphabet, you find the spot in the list where the assemblies ought to be, but they are not there. You see the one before in the alphabet, the one after it in the alphabet, but no Microsoft.WindowsAzure.StorageClient assembly in sight. What gives?

Look familiar? Where is the Microsoft.WindowsAzure.StorageClient assembly?

Confirmation Dialog after changing from Client Profile to full .NET

Azure Managed Libraries Not Included in .NET Framework 4 Client Profile

If your eyes move a little higher in the Add Reference dialog box, you will see the problem. You are using the .NET Framework 4 Client Profile. Nothing wrong with the Client Profile – it can be a friend if you want a lighter-weight version of the .NET framework for deployment to desktops where you can’t be sure your .NET platform bits are already there – but Windows Azure Managed Libraries are not included with the Client Profile.

[screenshot: Add Reference dialog with .NET Framework 4 Client Profile selected]

Bottom line: Windows Azure Managed Libraries are simply not supported in the .NET Framework 4 Client Profile.

How Did This Happen?

It turns out that in Visual Studio 2010, the default behavior for many common project types is to use the .NET Framework 4 Client Profile. There are some good reasons behind this, but it is something you need to know about. It is very easy to create a project that uses the Client Profile because the choice is not visible – with no apparent option for adjustment – in the New Project dialog box; all you see is .NET Framework 4.0:

The “Work-around” is Simple: Do Not Use .NET Framework 4 Client Profile

While you are not completely out of luck, you just can’t use the Client Profile in this case. And, as the .NET Framework 4 Client Profile documentation states:

If you are targeting the .NET Framework 4 Client Profile, you cannot reference an assembly that is not in the .NET Framework 4 Client Profile. Instead you must target the .NET Framework 4.

So let’s use the (full) .NET Framework 4.

Changing from .NET Client Profile to Full .NET Framework

To move your project from Client Profile to Full Framework, right-click on your project in Solution Explorer (my project here is called “SnippetUploader”):

[screenshot: right-clicking the project in Solution Explorer]

From the bottom of the pop-up list, choose Properties.

[screenshot: Properties at the bottom of the context menu]

This will bring up the Properties window for your application. It will look something like this:

[screenshot: the project Properties window]

Of course, by now you probably see the culprit in the screen shot: change the “Target framework:” from “.NET Framework 4 Client Profile” to “.NET Framework 4” (or an earlier version) and you have one final step:

[screenshot: confirmation dialog after changing the Target framework]

Now you should be good to go, provided you have Windows Azure Tools for Microsoft Visual Studio 1.2 (June 2010) installed. Note, incidentally, that the Windows Azure tools for VS mention support for

…targeting either the .NET 3.5 or .NET 4 framework.

with no mention of support for the .NET Client Profile. So stop expecting it to be there!
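Incidentally, the Target framework choice is just a property in the project file, so if you prefer editing XML to clicking through dialogs, the same change can be made there. A sketch of the relevant fragment of a Visual Studio 2010 .csproj (your surrounding PropertyGroup contents will differ):

```xml
<PropertyGroup>
  <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
  <!-- Deleting this element (or the whole line) switches the project
       from the Client Profile to the full .NET Framework 4 -->
  <TargetFrameworkProfile>Client</TargetFrameworkProfile>
</PropertyGroup>
```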



You can’t add a reference to Microsoft.WindowsAzure.StorageClient.dll as it was not built against the Silverlight runtime

Are you developing Silverlight apps that would like to talk directly to Windows Azure APIs? That is perfectly legal, using the REST API. But if you want to use the handy-dandy Windows Azure Managed Libraries – such as Microsoft.WindowsAzure.StorageClient.dll to talk to Windows Azure Storage – then that’s not available in Silverlight.

As you may know, the Silverlight assembly format is a bit different from straight-up .NET, and attempting to use Add Reference from a Silverlight project on a plain-old-.NET assembly just won’t work. Instead, you’ll see something like this:

Visual Studio error message from use of Add Reference in a Silverlight project: "You can’t add a reference to Microsoft.WindowsAzure.StorageClient.dll as it was not built against the Silverlight runtime. Silverlight projects will only work with Silverlight assemblies."

If you pick a class from the StorageClient assembly – let’s say, CloudBlobClient – and check the documentation, it will tell you where this class is supported:

Screen clipping from the StorageClient documentation with empty list of Target Platforms

Okay – so maybe it doesn’t, exactly: the Target Platforms list is empty – presumably an error of omission. But going by the Development Platforms list, you wouldn’t expect it to work in Silverlight.

There’s Always REST

As mentioned, you are always free to directly do battle with the Azure REST APIs for Storage or Management. This is a workable approach. Or, even better, expose the operations of interest as Azure services – abstracting them as higher level activities. You have heard of SOA, haven’t you? 🙂
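To make the REST option concrete: reading a blob that has been made publicly readable is nothing more than a plain http GET against the blob's URI – no authentication headers at all. A minimal .NET sketch (the account, container, and file names are placeholders; authenticated operations additionally require Date/x-ms-version headers and a SharedKey Authorization header computed from your Access Key – the plumbing the StorageClient library hides from you):

```csharp
using System;
using System.Net;

class PublicBlobRestDemo
{
    static void Main()
    {
        // Placeholder names - substitute your own account/container/blob
        string url = "http://myaccount.blob.core.windows.net/snippets/foo.txt";

        using (var client = new WebClient())
        {
            // A public blob is just a web resource: GET it like any other
            string contents = client.DownloadString(url);
            Console.WriteLine(contents);
        }
    }
}
```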

“Cloud Computing 101, Azure Style!” and “Building Cloud-Native Applications on Azure” – Two Talks I Presented at New England Code Camp 14

Yesterday I attended New England Code Camp 14 (check out the #necc14 twitter stream while it lasts). I enjoyed many talks:

  1. Maura Wilder on JavaScript Debugging (@squdgy)
  2. Jason Haley on Comparing the Azure and Amazon Cloud Platforms (@haleyjason)
  3. Jim O’Neil on Dissecting the Azure @Home Application (@jimoneil)
  4. Abby Fichtner on Lean Startups (@hackerchick)
  5. MC’d by Abby, various folks talking about their experiences at startups — 4 talks jam-packed into a fast-paced one-hour session:
    1. Vishal Kumar of savinz.com (“mint.com for shopping”)
    2. Allison Friedman (@rateitgreen) of Rate It Green (“yelp for the green building industry”)
    3. Sean Creely (@screeley) of Embedly (“make friendly embedded links”) – a Y Combinator company providing a service for turning tweets containing media links into something more user friendly (e.g., embed inline YouTube video rather than a link taking you to YouTube)
    4. Marc Held (@getzazu) of getzazu.com (“alarm clock 2.0”)

At Uno’s afterwards, I enjoyed chatting with many folks, including Veronica and Shawn Robichaud (all the way from Maine!), John from BUGC and Blue Fin, Slava Kokaev, entrepreneurs Marc, Billy, Brian, Vishal, and Dan Colon, dev evangelists Jim O’Neil and Chris Bowen, Yilmaz Rona from Trilogy, and of course Maura.

At the Code Camp, I presented twice on Azure-focused topics:

  1. Cloud Computing 101: Azure Style! – an introduction to cloud computing, and an overview of the services that Microsoft’s cloud stack offers
  2. Building Cloud-Native Applications with Azure – a mind-blowing tour of some of the changes that await the technology community as we move our world into the cloud

The Boston Azure User Group is one year old! You can follow the group on twitter @bostonazure. You can also follow me on twitter @codingoutloud. And I hope to see you at the next Boston Azure meeting on Thurs October 21 from 6:00-8:30 PM at NERD (registration and more info).

Azure 101 Talk Presented at Boston Azure User Group’s September Meeting

Last week on Thursday I gave a talk to the Boston Azure User Group[†]: a high level introduction to Windows Azure titled Azure 101 (you can download the Azure 101 slide deck).

I shared the stage with Mark Eisenberg of Microsoft who walked us through some of the features coming in the November update of Windows Azure. One of the sites Mark showed was the Open Source Windows Azure Companion.

Hope to see you next month when Ben Day will talk about how Windows Azure and Silverlight can play nice together.

For up to date information on Boston Azure, follow Boston Azure on twitter (@bostonazure),  keep an eye on the group’s web site (bostonazure.org), or add yourself to the low-volume email announcement list.

[†] Yes, I also founded and run the Boston Azure User Group, but this was my first time having the honors as the main speaker.

Programming Windows Azure Blob Storage in One Page of Code

Microsoft Windows Azure supports several storage approaches: Blobs, Tables, Queues, Drives, and CDN. We even have SQL Azure available to us for full relational power. This post will outline some basic thoughts on programming Blob storage in .NET. And at the end there will be one (long) page of example code (though you will need to supply the Access Keys for your Azure Storage account). This code is a complete program that will upload a file into Azure Blob Storage and mark it as Publicly Readable, as would be suitable for linking to such resources from a public web site.

Do I Need .NET?

No, .NET is not needed to program against Blob storage. Any programming language or platform can be used, provided it can support calling out via http. Programs speak to the Blob storage service in Azure via a RESTful interface – yes, good old-fashioned http goodness.

Isn’t REST Awkward to Program Against?

Well, there are a few details to making these REST requests: construct a well-formed request body, set up the http headers, add your hash (in most cases Azure requires this step as proof you have the right key), create a web connection, send your request, handle the response, and repeat. But in .NET it is even easier due to the Azure SDK where you will find some helper classes, such as CloudBlobContainer, CloudBlobClient, and CloudBlob. These helpful helpers help you help yourself to Blob storage services without having to worry about most of the details – you just deal with some objects.

How Do I Access the Azure SDK, and How Many Thousands of Dollars Does it Cost?

For .NET / Visual Studio developers, download the SDK as part of the Windows Azure Tools for Microsoft Visual Studio. Or, better still, follow these instructions from David Aiken for getting started with Windows Azure.

For non-.NET, non-Visual Studio developers, download the Windows Azure SDK separately.

And even though the Azure SDK makes Azure development super über ultra convenient on .NET, it does not cost any money. A freebie. If you are developing on a non-.NET platform, there is very likely an open source client library for you. Microsoft provides a library now for PHP, too.

Can You Give Me a Short Example?

Sure, here is a code snippet showing the two primary classes in action. Under the hood, there are REST calls being made out to the Blob storage services, but you don’t need to deal with this plumbing in your code.

FileInfo fileInfo = new FileInfo("c:/temp/foo.png");
string blobUriPath = fileInfo.Name;

CloudBlobContainer blobContainer = // getting blob container not shown here

CloudBlob blob = blobContainer.GetBlobReference(blobUriPath);
blob.UploadFile(fileInfo.FullName);

blob.Metadata["SomeArbitraryPropertyName"] = Guid.NewGuid().ToString(); // arbitrary value
blob.SetMetadata();

blob.Properties.ContentType = "image/png";
blob.SetProperties();

Are these Calls Really REST Under the Hood!!??

They sure are. You can prove this by firing up an http sniffer like Fiddler. You will see http traffic whiz back and forth.

What if Something Goes Wrong?

Here are a couple of errors I’ve run into (each written up in a separate post here): “An HTTP header that’s mandatory for this request is not specified” and “The specified container does not exist.”

For other errors or issues, try the Azure Support Forum.
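When you do hit one of these, a good first step is to wrap the storage call in a try/catch and log everything the exception carries before reaching for Fiddler. A sketch (I am assuming the ErrorCode and StatusCode properties on StorageClientException from the SDK 1.2 StorageClient library):

```csharp
try
{
    blobContainer.CreateIfNotExist();
}
catch (StorageClientException ex)
{
    // The top-level message is often generic ("(400) Bad Request");
    // dump all the details, then confirm with a Fiddler trace if needed
    Console.WriteLine("Message:     " + ex.Message);
    Console.WriteLine("ErrorCode:   " + ex.ErrorCode);
    Console.WriteLine("HTTP Status: " + ex.StatusCode);
    if (ex.InnerException != null)
        Console.WriteLine("Inner:       " + ex.InnerException.Message);
}
```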

Is it Production Quality Code?

Hmmm… We have a continuous stream of code on a single (long) page, in a single source file… Is it “Production Quality Code” you might wonder? I’m going to go with “no” – this code is not production ready. It is for getting up to speed quickly and learning about Azure Blob Storage.

Can I Tell if My Blobs Get to the Cloud?

You sure can. One way is to use the nifty myAzureStorage.com tool:

Go to http://myAzureStorage.com in your browser:

[screenshot: myAzureStorage.com login page]

Now you need to know something about how your Azure Storage account was configured in the Cloud. You need to know both the Account Name and one of the Access Key values (Primary or Secondary – it doesn’t matter which).

In our case we will type in the following:

Account Name = bostonazuretemp

Access Key = Gfd1TqS/60hKj0Ob3xPbtbQfmH/R0DMDKDC8VXWpxdMvhRPH1A+f6FMoIzyP+zDQmoN3GYQzJlLOASKKEvTJkA==

Note: the Access Key above is no longer valid. Use a different real one if you like, or see the One Page of Code snippet below for how to do this using local storage in the Dev Fabric.
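Incidentally, the same Account Name / Access Key pair you type into the tool is what your code uses to authenticate. In the SDK, the pair can also be packaged as a single connection string – a sketch (the key value here is a placeholder):

```csharp
// Real cloud storage - substitute your own Account Name and Access Key
var account = CloudStorageAccount.Parse(
    "DefaultEndpointsProtocol=https;AccountName=bostonazuretemp;AccountKey=<your-key>");

// Or point at local storage in the Dev Fabric instead
var devAccount = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
```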

You may also want to check “Remember Me” and your screen will look something like this:

[screenshot: login form filled in, with Remember Me checked]

Now simply click on “Log In” and you will see your storage. The default tab is for Table storage, so click the BLOBs tab to view your Blob Containers:

[screenshot: BLOBs tab listing Blob Containers]

In my case I see one – “billw” – and I can click on it to drill into it and see its blobs:

[screenshot: the blobs inside the “billw” container]

And for each blob, I can click on the blob to examine its attributes and metadata:

[screenshot: attributes and metadata for a single blob]

What Project Template Should I Use in Visual Studio?

Create a Visual C# Console Application on .NET Framework 4 using Visual Studio 2010 or Visual C# Express 2010:

[screenshot: New Project dialog with Visual C# Console Application selected]

Show Me the Code!

Okay, the working code – fully functional – on One Page of Code – appears below. After you create a new Visual C# Console application in Visual Studio 2010, as shown above, simply clobber the contents of the file Program.cs with the code below. That oughta be easy. Then start playing with it.

You will also need to add a reference to Microsoft.WindowsAzure.StorageClient – but first you’ll need to switch away from the .NET Framework Client Profile.

Sharing Files on the Public Web using Azure Blob Storage

Also note that the following code will post to Azure Blob Storage in such a way that the item stored will be accessible from a web browser. This is not the default behaviour; read the code to see the couple of lines that influence this.

Note that this code is intentionally compressed to fit in a short space and all in one place – this is not intended to be production code, but “here is a simple example” code. For instance, this code does not use config files – but you should. This is just to help you quickly understand the flow and take all the magic out of getting a code sample to work.

You can also download this code directly: SnippetUploaderInOnePageOfCode.cs.

Without further ado, here is your One Page of Code

using System;
using System.Diagnostics;
using System.IO;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

namespace CodeSnippetUploader
{
    class Program
    {
#if false
        private const string AccountKey = "Put a real Storage Account Key – find it on http://windows.azure.com dev portal for your Storage Service";
#else
        private const string AccountKey = null;  // use local storage in the Dev Fabric
#endif
        private const string AccountName = "bostonazuretemp";
        private const string ContainerName = "snippets";
        private const string MimeTypeName = "text/plain"; // since these are assumed to be code snippets

        static void Main(string[] args)
        {
            // pass in the single snippet code file you want uploaded
            string snippetFilePath = args[0];

            string baseUri = null;
            CloudBlobClient blobStorage = null;

            if (AccountKey == null)
            {
                var clientStorageAccount = CloudStorageAccount.DevelopmentStorageAccount; // use storage services in the Developer Fabric, not real cloud
                baseUri = clientStorageAccount.BlobEndpoint.AbsoluteUri;
                blobStorage = new CloudBlobClient(baseUri, clientStorageAccount.Credentials);
            }
            else
            {
                byte[] key = Convert.FromBase64String(AccountKey);
                var creds = new StorageCredentialsAccountAndKey(AccountName, key);
                baseUri = string.Format("http://{0}.blob.core.windows.net", AccountName);
                blobStorage = new CloudBlobClient(baseUri, creds);
            }

            CloudBlobContainer blobContainer = blobStorage.GetContainerReference(ContainerName);
            bool didNotExistCreated = blobContainer.CreateIfNotExist();

            var perms = new BlobContainerPermissions
            {
                PublicAccess = BlobContainerPublicAccessType.Container // Blob (see files if you know the name) or Container (enumerate like a directory)
            };
            blobContainer.SetPermissions(perms); // This line makes the blob public so it is available from a web browser (no magic needed to read it)

            var fi = new FileInfo(snippetFilePath);
            string blobUriPath = fi.Name; // could also use paths, as in: "images/" + fi.Name;
            CloudBlob blob = blobContainer.GetBlobReference(blobUriPath);
            blob.UploadFile(fi.FullName); // REST call under the hood; use tool like Fiddler to see generated http traffic (http://fiddler2.com)

            blob.Properties.ContentType = MimeTypeName; // IMPORTANT: Mime Type here needs to match type of the uploaded file
                                                        // e.g., *.png <=> image/png, *.wmv <=> video/x-ms-wmv (http://en.wikipedia.org/wiki/Internet_media_type)
            blob.SetProperties(); // REST call under the hood

            blob.Metadata["SourceFileName"] = fi.FullName; // not required – just showing how to store metadata
            blob.Metadata["WhenFileUploadedUtc"] = DateTime.UtcNow.ToLongTimeString();
            blob.SetMetadata(); // REST call under the hood

            string url = String.Format("{0}/{1}/{2}", baseUri, ContainerName, blobUriPath);
            Process process = System.Diagnostics.Process.Start(url); // see the image you just uploaded (works from Console, WPF, or Forms app – not from ASP.NET app)
        }
    }
}

What causes “specified container does not exist” error message in Windows Azure Storage?

In debugging some Windows Azure Storage code, I ran across a seemingly spurious, unpredictable exception in Azure Blob code where I was creating Blob containers and uploading Blobs to the cloud. The error would appear sometimes… at first there was no discernible pattern… and the code would always work if I ran my code again immediately after a failure. Mysterious…

A Surprising Exception is Raised

When there was an exception raised, this was the error message with some details:

StorageClientException was unhandled - The specified container does not exist

The title bar reads “StorageClientException was unhandled”, which is accurate, since that code was not in a try/catch block. No problem or surprise there, at least with that part. But the exception text itself was surprising: “The specified container does not exist.”

Uhhhh, yes it does! After calling GetContainerReference, container.CreateIfNotExist() was called to ensure the container was there. No errors were thrown. What could be the problem?

A Clue

Okay, here’s a clue: while running, testing, and debugging my code, occasionally I would want a completely fresh run, so I would delete all my existing data stored in the cloud (that this code cared about at least) by deleting the whole Blob container (called “AzureTop40”). This was rather convenient using the handy myAzureStorage utility:

This seemed like an easy thing to do, since my code re-created the container and any objects needed. Starting from scratch was a convenience for debugging and testing. Or so I thought…

Azure Storage is Strongly Consistent, not Eventually Consistent

Some storage systems are “eventually consistent” – a technique used in distributed scalable systems in which a trade-off is made: we open a small window of inconsistency with our data, in exchange for scalability improvements. One example system is Amazon’s S3 storage offering.

But, per page 130 of Programming Windows Azure, “Windows Azure Storage is not eventually consistent; it is instantly/strongly consistent. This means when you do an update or a delete, the changes are instantly visible to all future API calls. The team decided to do this since they felt that eventual consistency would make writing code against the storage services quite tricky, and more important, they could achieve very good performance without needing this.”

So there should be no problem, right? Well, not exactly.

Is Azure Storage actually Eventually Strongly Consistent?

Okay, “Eventually Strongly Consistent” isn’t a real term, but it does seem to fit this scenario.

I’ve heard more than once (can’t find authoritative sources right now!??) that you need to give the storage system time to clean up after you delete something – such as a Blob container – which is immediately not available (strongly consistent) but is cleaned up as a background job, with a garbage collection-like feel to it. There seems to be a small problem: until the background or async cleanup of the “deleted” data is complete, the name is not really available for reuse. This appears to be what was causing my problem.

Another dimension of the problem was that there was no error from the code that purportedly ensured the container was there waiting for me. At least this part seems to be a bug: it seems a little eventual consistency is leaking into Azure Storage’s tidy instantly/strongly consistent model.

I don’t know what the Azure Storage team will do to address this, if anything, but at least understanding it helps suggest solutions. One work-around would be to just wait it out – eventually the name will be available again. Another is to use different names instead of reusing names from objects recently deleted.
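A third option, if you must reuse the name, is to make the “wait it out” explicit rather than re-running the program by hand. A rough sketch – the delay and attempt count are arbitrary, and StorageErrorCode.ContainerNotFound is my assumption about which error code comes back:

```csharp
// After container.CreateIfNotExist() reports success, the first real
// operation may still fail while the old container is being cleaned up.
for (int attempt = 0; attempt < 10; attempt++)
{
    try
    {
        blob.UploadFile(localFilePath);
        break; // success - the container name is truly available again
    }
    catch (StorageClientException ex)
    {
        if (ex.ErrorCode != StorageErrorCode.ContainerNotFound)
            throw; // some unrelated problem - don't mask it
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(30));
    }
}
```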

I see other folks have encountered the same issue, also without a complete solution.

Vermont Code Camp – Building Cloud-Native Applications with Azure

I attended Vermont Code Camp 2 yesterday (11-Sept-2010) at the University of Vermont.  Many thanks to the awesome crew of Vermonters who put on an extremely well-organized and highly energetic event! I look forward to #vtcc3 next year. (Twitter stream, while it lasts: #vtcc2)

I presented a talk on Building Cloud-Native Applications using Microsoft Windows Azure. My slides are available as a PPT download and on slideshare.net.

<aside>Maura and I went to Vermont a day early. We put that time to good use climbing to the summit of Vermont’s highest mountain: Mt. Mansfield. We hiked up from Underhill State Park, up the Maple Ridge Trail, over to the Long Trail, up to the summit, then down the Sunset Ridge Trail (map). It was a really tough climb, but totally worth it. I think the round trip was around 7 miles.

</aside>

Gave Azure Storage Talk at VB.NET User Group Meeting

I gave a talk at the Thurs Sept 2, 2010 New England VB.NET user group meeting. Andy Novick covered SQL Azure, and I covered the rest (Blobs, Tables, Queues, Drives, and CDN).

My slides can be downloaded here (which is hosted on Azure Blob storage!).

I also have plans for a few more Azure-related talks in the near future:

  1. First up is Building Cloud-Native Applications with Windows Azure – at the Vermont Code Camp on Saturday, September 11, 2010.
  2. I am the main speaker at the September 23, 2010 Boston Azure meeting – topic is Azure 101 – the basics. (Then for the October 21, Ben Day will be (most likely) talking about how to integrate Silverlight and Azure.)
  3. I am also planning one or two talks at the New England Code Camp 14 on Saturday October 2 (I haven’t submitted abstracts yet, but probably talks similar to (a) Demystifying Windows Azure and Introduction to Cloud Computing with Azure, and (b) Building Cloud-Native Applications with Windows Azure)

Here is the abstract for the Building Cloud-Native Applications with Windows Azure talk at VT Code Camp:

Cloud computing is here to stay, and it is never too soon to begin understanding the impact it will have on application
architecture. In this talk we will discuss the two most significant architectural mind-shifts, discussing the key patterns
changes generally and seeing how these new cloud patterns map naturally into specific programming practices in Windows
Azure. Specifically this relates to (a) Azure Roles and Queues and how to combine them using cloud-friendly design
patterns, and (b) the combination of relational data and non-relational data, how to decide among them, and how to
combine them. The goal is for mere mortals to build highly reliable applications that scale economically. The concepts
discussed in this talk are relevant for developers and architects building systems for the cloud today, or who want to be
prepared to move to the cloud in the future.

Chris Bowen Speaks at August 2010 Boston Azure Meeting

Many thanks to Chris Bowen who was the guest speaker at the August 2010 Boston Azure user group meeting. The topic was ASP.NET MVC, with an Azure perspective.

chris-bowen-mvc-aug-2010.1

Here are my rough notes:

There was no slide deck – Chris jumped right into the code. Here are a few of my rough notes.

Consider Web Platform Installer 2.0 to install Azure tooling.

  • Windows Azure Platform Tools
  • Visual Web Developer 2010 Express

ASP.NET MVC concepts / benefits:

  • “A lot of convention” – great in the long run, hard to grasp at first…
  • Separation of Concerns – controller then view
  • ASP.NET MVC is closer to the metal than traditional ASP.NET – if you want to implement, say, XHTML, then nothing stands in your way.
  • Strongly-typed Controllers and Views can be generated once your model is in place.
  • Controller may choose to pass along only a ViewModel – subset of full Model, or perhaps enhanced
  • Model Binding is also by convention
  • Hackable URLs

Tips and Tricks:

  • Ctrl-Shift-Click on Visual Studio in Win 7 will launch it in Admin mode, which Azure requires.
  • Can modify the T4 template for MVC to alter its UI options in wizards.
  • Ctrl-M-G – bring me to the appropriate View for this Action

New in MVC 2 / ASP.NET 4:

  • Html.DisplayForModel
  • RenderActions – new in MVC 2
  • New in ASP.NET 4 (not just ASP.NET MVC 2) is <%: “foo” %> where the “:” is a new feature as shortcut for HTML.Encode for the content.
  • MVC 2 has powerful client-side validation based on characteristics of your model. Does not require a server-side round trip. You specify e.g., [Required] attribute on Model data – and you don’t need to write any imperative code.
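As a tiny illustration of that last point, a hypothetical model class is all it takes – the attributes drive the validation, with no imperative code:

```csharp
using System.ComponentModel.DataAnnotations;

// Hypothetical model: [Required] and [StringLength] come from
// System.ComponentModel.DataAnnotations, and MVC 2 generates both
// server-side and client-side validation from them.
public class Speaker
{
    [Required]
    public string Name { get; set; }

    [StringLength(140)]
    public string Bio { get; set; }
}
```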

 chris-bowen-mvc-aug-2010.2 

http://asp.net/mvc – many great resources.

Windows Azure developer fabric – also known as “the fog” – is the Azure cloud simulator running locally.

Also check out Arra Derderian’s write-up of the same Boston Azure meeting.

There were around 30 people in attendance at the meeting.