Moving…

January 26, 2011

Since joining Microsoft as a full-timer, it makes more sense for me to move my technology blog to the MSDN blogs.  My new address is http://blogs.msdn.com/b/golive.  I hope you’ll follow me there and continue reading.

Cheers!
Greg


Accommodating Your Batch Window in Windows Azure

January 10, 2011

Certain functions of a system need to run once per day in a typical “batch window” scenario.

You might already be aware that Azure compute is billed by the CPU-hour.  If you’re occupying a CPU, you’re paying for it whether or not you’re actively using it.  You occupy the CPU ‘slot’ by having a role deployed, regardless of its state – it doesn’t have to be in the ‘Ready’ state.

So how do you get your batch of work done without paying for the slot 24 hours per day?

The answer is found in the Windows Azure Service Management API.  The CreateDeployment API can be used to deploy an app into an existing Hosted Service.  You can do this in various ways including client code, a PowerShell script, or a web or worker role that’s running in Windows Azure.

There are a couple of versions of the Service Management API: managed code and REST.  At the time of this writing, the REST version of the API is the more complete and feature-rich of the two.  The managed version doesn’t currently support CreateDeployment.

For my purposes, I looked at how to use an Azure role to deploy an app.  The PowerShell script already exists and is easy to use, as shown in one of the Hands-on Labs in the Windows Azure Platform Training Kit.  By writing code that runs in a worker role, you can then replicate (multi-instance) it in a management pack for your service, giving it high availability and fault tolerance.

The sample code that I provide is in a web role.  I used the web page to display diagnostics as I went along.  The project is just a new cloud solution with a web role as provided by Visual Studio 2010.  I’m using the 1.3 version of the Windows Azure SDK, but everything I’ve done here can also be done with the 1.2 version.  In my code, I:

  1. Load the certificate.
  2. Initialize storage.
  3. (User clicks the "test" button.)
  4. Get ServiceConfiguration.cscfg from blob storage (see the sketch below).
  5. Build the request and payload.
  6. Submit the request.
  7. Display the ID of the request as returned from the platform.

The source code is posted on CodePlex.
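Here's a minimal sketch of step 4, using the SDK 1.3 storage client.  The setting name and container/blob names are placeholders of my own, not from the posted source; substitute whatever your service uses.

// Requires Microsoft.WindowsAzure, Microsoft.WindowsAzure.StorageClient,
// and Microsoft.WindowsAzure.ServiceRuntime.
var account = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
var blobClient = account.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("deployments");       // hypothetical container
var blob = container.GetBlobReference("ServiceConfiguration.cscfg");
string configXml = blob.DownloadText();   // base64-encode this when building the payload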

Following are signposts to help you avoid some of the bumps on the road:

Certificate Handling

When you are accessing the Service Management API from the client, you must use a certificate to authenticate the program to the portal.  You upload the management certificate to the Azure portal and keep the original certificate on the client.  The PowerShell script (or Windows client) references the local copy and matches it to the version that’s in the portal.

When you’re using a web or worker role, the cert has to be loaded into the role as well as into the management certificate store, so the role can match its copy to the one in the management certificate store.  Hence, you actually have to export the certificate twice – once as the DER version (a file with a .CER extension) and once as the private-key version (.PFX extension).  The source code that I’ve provided has additional details.
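As a rough sketch of the lookup inside the role (the store location and the thumbprint variable are my assumptions; the posted source is the authoritative version):

// Requires System.Security.Cryptography.X509Certificates.
// Assumes the cert was deployed to the role's LocalMachine\My store
// and that 'thumbprint' holds its thumbprint (e.g., from configuration).
var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly);
var matches = store.Certificates.Find(
    X509FindType.FindByThumbprint, thumbprint, false);
store.Close();
X509Certificate2 managementCert = matches[0];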

Request URI

One of the parameters in the request URI is <service-name>.  The value that you need to use is the one in the ‘DNS Prefix’ field in the management portal.  It’s case sensitive.
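For reference, here's the documented shape of the CreateDeployment request URI, sketched as you might build it in code (the variable names are mine):

// The CreateDeployment URI, per the Service Management API documentation.
// deploymentSlot is "staging" or "production".
string requestUri = string.Format(
    "https://management.core.windows.net/{0}/services/hostedservices/{1}/deploymentslots/{2}",
    subscriptionId, serviceName, deploymentSlot);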

Creating the Payload

The web request for CreateDeployment requires a payload.  I tried a couple of ways to create the payload and decided I like using XDocument and XElement (System.Xml.Linq) to manage this.  The other way I tried is plain old embedded text.

If you like the plain old text method, there’s a wrinkle to avoid.  Don’t precede your XML with a newline.  I tried this at first for readability, but the request fails with InvalidXMLRequest.

GOOD

payload = string.Format(@"<?xml version=""1.0"" encoding=""utf-8"" standalone=""yes""?>
<CreateDeployment xmlns=""http://schemas.microsoft.com/windowsazure""> <Name>{0}</Name>

BAD

payload = string.Format(@"
<?xml version=""1.0"" encoding=""utf-8"" standalone=""yes""?>
<CreateDeployment xmlns=""http://schemas.microsoft.com/windowsazure""> <Name>{0}</Name>
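For context, here's my sketch of the GOOD version filled out to the full documented schema (the variable names are illustrative; {2} and {3} are the base64-encoded label and .cscfg contents):

// Sketch of a complete payload per the documented CreateDeployment schema.
payload = string.Format(@"<?xml version=""1.0"" encoding=""utf-8"" standalone=""yes""?>
<CreateDeployment xmlns=""http://schemas.microsoft.com/windowsazure"">
  <Name>{0}</Name>
  <PackageUrl>{1}</PackageUrl>
  <Label>{2}</Label>
  <Configuration>{3}</Configuration>
  <StartDeployment>true</StartDeployment>
  <TreatWarningsAsError>false</TreatWarningsAsError>
</CreateDeployment>", deploymentName, packageUrl, labelBase64, configBase64);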

If you prefer the System.Xml.Linq approach, there is another wrinkle to avoid.

XNamespace xn = "http://schemas.microsoft.com/windowsazure";
XDocument doc = new XDocument(
                         new XElement(xn + "CreateDeployment",
                         new XElement(xn + "Name", "gotest-20101231"),

After building up the XDocument, you need to get it into a string before transmitting it, and that string must be UTF-8 encoded.  XDocument.Save wants a TextWriter such as StringWriter, but StringWriter reports its encoding as UTF-16.  (If you forget and do this, you’ll get ProtocolError.)  The solution is to subclass StringWriter with an override that reports UTF-8.  Again, details are in the provided source.
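The pattern looks roughly like this; a minimal sketch, with the full version in the posted source:

// Requires System.IO and System.Text.
// A StringWriter that reports UTF-8, so the XML declaration written
// by XDocument.Save says encoding="utf-8" instead of "utf-16".
public class Utf8StringWriter : StringWriter
{
    public override Encoding Encoding
    {
        get { return Encoding.UTF8; }
    }
}

// Usage:
var writer = new Utf8StringWriter();
doc.Save(writer);
string payload = writer.ToString();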

Referencing REST Resources

In general, REST resource designators are case-sensitive in the Windows Azure REST API.  The portal enforces lower case on some things, but your deployment package might be mixed case, so be careful with that.

Multiple Versions

There have been several releases of the API, and older versions are still supported.  Hence, when you build your request, you need to specify x-ms-version in a request header.  Check carefully for the correct version; the value given for CreateDeployment is 2009-10-01 (or later) in the Request Headers section of the documentation.  If you read further you’ll find that it must be 2010-04-01 if you include StartDeployment or TreatWarningsAsError in your payload.
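Tying it together, here's a sketch of building and submitting the request with the version header.  The requestUri, payload, and managementCert variables come from the earlier sketches; this is my outline of the flow, not the posted source verbatim.

// Requires System.Net and System.Text.
var request = (HttpWebRequest)WebRequest.Create(requestUri);
request.Method = "POST";
request.ContentType = "application/xml";
request.Headers.Add("x-ms-version", "2010-04-01");
request.ClientCertificates.Add(managementCert);

byte[] body = Encoding.UTF8.GetBytes(payload);
request.ContentLength = body.Length;
using (var stream = request.GetRequestStream())
    stream.Write(body, 0, body.Length);

// The platform returns the operation's ID in the x-ms-request-id header (step 7).
using (var response = (HttpWebResponse)request.GetResponse())
{
    string requestId = response.Headers["x-ms-request-id"];
}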

I hope these tips come in handy.  Please comment freely and let me know if there are other aspects of this topic that you’d like to hear more about.

Cheers!

Attribution:

Thanks to Steve Marx for his very valuable assistance on several fronts.
Thanks to Ricardo Villalobos for his assistance with the StringWriter subclass.

A Lot Can Happen in a Year

January 9, 2011

Wasn’t 2010 a peach?

Let’s see (and in no particular order): I lost one job, got a much better one shortly thereafter, and married Barbara.


September 18, 2010

Barbara and I invited 60 of our friends and family to our home in Redmond, WA.  It was a perfect weekend.  It rained Friday night, but we had a tent on the deck so folks could enjoy a smoked brisket dinner without getting drenched.  The sun came out on Saturday morning, so rather than getting wet we got sunburned during the ceremony.

We didn’t go on a honeymoon immediately due to the good weather right here at home, electing instead to drive down to Utah over Thanksgiving and take in the sights at Arches National Park.  What a place!

We enjoyed Thanksgiving dinner with family in western Colorado and then drove home.

A couple days after the wedding I started my new job at Microsoft.  I’ve joined Microsoft’s Developer and Platform Evangelism group – Azure ISV Incubation.  This is to say that I focus my attention on technical enablement of ISVs that want to bring new applications to Microsoft’s cloud: Windows Azure.  This will be the focus of most of my new posts.

Cheers!
Greg Oliver
Senior Architect Evangelist
Microsoft Corporation


Getting Hyper-V Squared Away

December 12, 2009

You could never tell it from asking my girlfriend, Barbara, but I’m a neat freak in some respects.  So what do this and getting Hyper-V squared away have to do with learning Sharepoint?  I’ve spent most of my life as a developer.  (And entrepreneur, but that’s another story.)  I can’t stand reading messy, unreadable, or otherwise difficult-to-understand code.  I like my mental models to be well organized, and that carries directly over to the physical implementation of a new computer system or code that I write.  So I want the environment I’m working in, including how my Hyper-V VMs are organized, to be nice and neat.  A messy or inefficient network pathway, for example, will drive me nuts.

Things that I want to clean up:
1. Storage.  I’ll consider carefully the pros and cons of physical vs. virtual hard disk for each VM’s main drive.
2. Network.  Similarly, I’d like each VM to be able to communicate with each other VM and the parent efficiently.

Storage    

Recall that I already took steps to separate each VM onto its own physical drive, so each VM has a single VHD stored on its own physical disk.  I asked around a bit and heard that if a VM boots from a physical disk, rather than a VHD, it’ll enjoy better performance.  The documentation says this is possible and tells you how to do it, but I ran into one of those “undocumented features”: there’s a key setting that is disabled with no explanation.  Wouldn’t it be great if a disabled control had a tooltip that explained why it’s disabled?!?

Physical Hard Disk radio button disabled

While I was off trying to discover how to enable this feature switch, I learned the reasons not to do it: (1) you can’t take a snapshot of a VM once it has physical hard drives attached, (2) you can’t do differencing, and (3) you can’t dynamically expand the disk.  So I’ve decided that VHD is the better way to go after all; I don’t really need to squeeze out every last drop of performance.  The other main reason for using a physical disk is to remove the limit on the size of a VHD, which isn’t an issue for me.  Oh – and in case there’s an eagle eye out there who notices the IDE Controller Location is “in use”, it doesn’t matter one way or the other.  I tried it both ways.

Network    

As I start this section, I’ve already read John Howard’s blog post regarding basic networking in a Hyper-V environment.  So what I expected to see when I went into my network settings was a single network adapter with a cable attached and a second network adapter with no cable.  (I need to bring a second cable from work.)  What I see instead is this:

Where did Network Connection 4 come from?!?

This looks strangely familiar, and after re-reading John’s blog, I think I understand: with Windows Server 2008 R2 Hyper-V, an external network was added for me automatically.  I certainly didn’t create the virtual switch!  I should have been suspicious when I was able to share files between VMs without doing anything special.  So it seems that all I have to do now is plug in the second Cat 5 cable so the host partition has a dedicated channel.  Seems a shame to waste a perfectly good NIC.

Recap

For those of you who skipped to the end, it wasn’t such a bad decision!  It turns out that I had storage set up pretty much optimally from the get-go, and the Hyper-V installation took care of my basic networking needs for me!  But I learned a lot along the way, which was the whole idea.  Only one nagging question remains: what’s up with that disabled radio button that allows for a physical hard disk (otherwise known as a passthrough disk) on a Hyper-V VM?

Categories: Hyper-V

Passionate about Sharepoint, Part 2

December 8, 2009

Two weeks into this little project and so far no major stumbling blocks, thank heaven!

The three new drives came in the mail, but I had to wait until the next day to install them and get them configured correctly.  Remember that I wanted to put the two client O.S.s on Drive Zero – the one that came with the machine.  The three server O.S.s will each have a separate drive.  This is because the client O.S.s will never be running at the same time, and the layout appeals to my sense of symmetry.

How to install the physical Windows Server O.S. on Drive One wasn’t obvious.  I finally settled on using Boot Camp (the multi-boot facility that comes with Mac OS X), which made it quite easy.  There might be a solution that doesn’t involve Boot Camp, but after an hour of looking around (using Bing, of course) I came up dry.  When I started up Boot Camp to see if it would work, it noticed the new drives immediately and offered the opportunity to install the O.S.  How cool is that!?!

Once Windows Server 2008 R2 was installed on Drive One, I wiped out my original installation after preserving the Windows 7 VHD.  I then set up Hyper-V the same as last time, arriving back where I was before, only with everything installed on two drives instead of one.  The next step was to create a couple more VMs on Drive Two and Drive Three and install the O.S. a couple more times.  This is getting easy!

At this point all of the VMs are installed and running the correct O.S.  They all run simultaneously, as expected, but without workloads to put on them it’s hard to tell how they’re going to do.

Upcoming tasks? 

All 4 copies of Windows are using the same network connection.  This needs to be fixed: as it is, the pathway between VMs would seem to include a trip out to the router and back again.  There are Hyper-V tricks that you can play to alleviate this problem.  I’ll get to this in one of my next posts.

I think my next step is going to be purchasing some training materials and starting to study.  I’m using the Microsoft Learning web site (http://www.microsoft.com/learning) to find the certs and learning resources.  First step:  exam 70-631: TS: Configuring Microsoft Windows SharePoint Services 3.0.  This is the free portion of SharePoint, and I’m not sure about the scaling model.  I guess I’ll find out soon!

Cheers!
Greg Oliver
ISV Developer Evangelist
Microsoft Corporation

Categories: Sharepoint

Passionate about Sharepoint, Part 1

December 3, 2009

I’ve been passionate about Sharepoint for some time now, almost since the first time I used it.  You know – back when everyone saw it as a fancy file system.  I’ve been working at Microsoft for about 3 years as a contractor, and the more I learn about the product, the more I like it.  Besides the flexibility, depth, and richness of the platform, the job opportunities are immense!

So, I’ve decided to develop some expertise.  And to measure myself, I think I’ll earn a couple of certs.  I don’t kid myself that earning certs will get me my next job at what I make now, but they aren’t a bad place to start.  I think of them as a standardized bar.  And obviously, they don’t mean much if you simply learn the test, which I will not do.

How to go about this?  I don’t have “the next job” lined up, so I don’t feel right about investing lots of money in a boot camp.  I’m sure I could attend and pass the training course and then the exam, but that’s a pretty big investment with no specific destination in mind.  What’s the rush?  I’ve thought about online instructor-led training as well, but that still requires that I take time off from work, and since I work by the hour, that means not getting paid.  Not so good.  That leaves the old stand-by: self-study.  That takes discipline, equipment, and plenty of time.  Sounds good!

I think I have the discipline and the time.  The equipment is another matter.  My desktop machine at home has a processor that won’t run the Windows Server O.S., so I’m going to have to buy something.  I live in a 2-bedroom apartment with my girlfriend, Barbara, so I have to be careful not to overwhelm the place with hardware and noise.  To really learn Sharepoint from an I.T. perspective, I want to build up a 3-server farm so that I can install the UI, Core, and Search portions separately.

I started looking at Dell and HP servers and got completely overwhelmed by the options.  Even the savings to be had by shopping Craigslist don’t protect against a bad decision.  After a couple of weeks of trawling places like this and never getting comfortable with a solution, I ran across a blog post by John Robbins of Wintellect in which he gives the details of setting up Windows Server systems on Mac Pro hardware.  After reading a couple of his posts, I was sold.  I went out and bought myself a quad-core Mac Pro.

So far, I have partitioned the drive in half, giving Windows about 300GB.  I then installed WS2008 R2 and switched on the Hyper-V role; after a reboot, the server O.S. became the parent partition.  I then created a virtual machine on a virtual hard drive, assigned it 1GB of RAM, and installed Windows 7 Ultimate.  Initially I had trouble with file access between the two partitions, and the display was stuck at 800×600, but installing the drivers from the Boot Camp disk solved these issues without fuss.  Everything appears to be working just as it should.  I’ve even installed Live Mesh inside the Windows 7 VM and used it to remote into my office desktop machine!

Now, the server software that I’ve installed is a trial version, so I’m going to have to re-install before too long.  I’ve purchased 3 more drives, exactly like the primary drive.  I don’t know if I’ll buy a RAID controller, but with 4 identical drives I’m set up for it.  I’m also going to have to buy some software from Microsoft – probably an MSDN subscription or maybe the Action Pack.  Once I have these things I’ll proceed with my training course.

I’ve been wondering how to distribute the bits once all of the storage is installed.  It makes sense to me to put the Windows 7 O.S. into physical storage that is shared with Mac OS X, since they’ll never both be running at the same time.  Then I could put each of the 3 server O.S. installations on its own drive.  And what about backup?  We’ll see.  The whole idea of this is to be able to burn it down and build it back up easily, and to have a snappy system to use as I go through the training materials.  More on that later.

Cheers!
Greg Oliver
ISV Developer Evangelist
Microsoft Corporation
Categories: Sharepoint