BOF05: Is my data really secure in the Cloud?

Hey Folks, Welcome to another live blog from TechEd 2011 in Atlanta, Georgia.

This is my first Birds of a Feather session, so it should be rather interesting. Before we start, let's break down the rules of engagement.

  1. “Birds Nest” – There are six chairs at the front of the room where attendees can come sit and participate in the on-going discussion. One chair will always be left empty to “invite” a new participant. If a new participant joins the Birds Nest, one of the active participants must leave. The Moderator will facilitate attendees joining or leaving the Birds Nest.
  2. Stand-Up Microphones – There are stand-based microphones in each of the two aisles in the BOF room. Attendees will have the opportunity to line up at either microphone to add a quick question or comment. Anyone wanting to participate in more depth will be asked to join the Birds Nest. However, you do not need to ask a question at the stand-up microphone to join the Birds Nest.

Lucky me, I got stuck in the Birds Nest.

Is Private Cloud more Secure than Public Cloud?

Questions to Ask:

Public/Private Cloud Concerns

  • What are the physical security requirements?
  • Encryption of Data
    • Over the wire?
    • At Rest?
  • How do you prevent [D]DOS from your Cloud Applications
    • Anti-Hacking
      • Cloud Provider Employee Security
      • Compliance
      • Multi-tenancy
  • Disaster Recovery
    • Backup
    • Redundancy
  • Auditing & Certification
    • SAS-70
    • PCI
  • Authorization & Authentication
    • How Quickly can I change Access Control?
  • Exit Strategy
    • What happens if I change providers?
  • Data Storage
    • E-Discovery (Ease of Access)
    • Archiving
    • Data Loss

Breaking down the Concerns

Physical Security

Large Cloud Providers have big budgets and a lot of brains for providing better Virtual Security. However, there are many concerns about Physical Security. Should Cloud Providers allow tours? The majority of the room thinks that a tour should be provided (I think there should be Beer provided as well).

Which accounts for the greater percentage of loss: Internal Hacking or External Hacking? Only 5% of the room was wrong and thought that External Hacking would expose more data.

A thing to note is that a typical Internal Data Loss is through an Employee with 7-plus years of Service, as they know all the Security Policies, the concerns, and what data would be of the most use to them.

Data Security also comes into play with Cloud providers:

  • Are you sure my data is within the regions I’ve specified?
    • Can they Audit that your Data is where it should be?
  • Is it a good thing that my data is distributed across many data centers?
  • Do Cloud Providers allow you to access Audit Logs?
    • What is the Expiry on logs?


Can someone plug into my VM or access my Storage nodes? Hyper-Jacking is a term used to describe someone breaking into the Virtualized Sandbox where your Virtual Compute instance runs.


Can the Cloud provider isolate my VMs or Storage from my competitors? Could that be part of my SLA? Should the Cloud Provider be transparent to the point that they would give away the list of their Customers in order to ensure you aren’t around your competitors?

This is a great place where you need to understand what Data you would like to publish to the Cloud and which Data you would maintain On-Premises.

Exit Strategy

It’s always good to have a Plan B, or some sort of Exit Plan. A great point was made by an Audience member: the SLA should provide some parts of your Exit Strategy.

Should a Cloud provider make it easier for you to migrate your data off of their platform? Once your data is removed from their Storage Service, what level of “Removal” is provided? Would they destroy the Hard Drive?

Your exit strategy is something that should be identified upfront; if you think ahead, you can ensure that your data isn’t at risk when you look to leave your particular Cloud Provider.


TechEd2011: The Future of Visual Studio ALM

Brian Keller [Former Enemy of The Cloud Cover Show] and Cameron Skinner outline the new features that are coming to the Microsoft Platform in terms of Application Lifecycle Management.

Historical Debugging with IntelliTrace

One feature of Visual Studio 2010 that has been a game changer in my Development Process is IntelliTrace. IntelliTrace is a Historical Debugging Platform that allows you to step back through the execution history of your application to help you debug a problem you are facing.

Coming in the future is the ability to use IntelliTrace in Production. This allows you to deploy the bits for IntelliTrace on your Production Server [Yes, there is a slight performance hit when using IntelliTrace]. When using IntelliTrace in the Production environment, you can enable Trace Capturing using PowerShell.

Once you’ve found the issue in production, you can stop [or suspend] the tracing, which has been generating an .itrace file that can then be imported into Visual Studio.
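As a rough sketch, that start/snapshot/stop workflow ends up looking something like the following from a PowerShell prompt on the server. [Cmdlet and parameter names here are illustrative of the standalone collector bits; the install path, app pool and collection plan are placeholders, so verify the exact names with Get-Help once the bits ship.]

```powershell
# Load the IntelliTrace collector cmdlets (path is a placeholder)
Import-Module "C:\IntelliTraceCollector\Microsoft.VisualStudio.IntelliTrace.PowerShell.dll"

# Begin capturing traces for an IIS application pool using a collection plan
Start-IntelliTraceCollection "MyAppPool" `
    "C:\IntelliTraceCollector\collection_plan.ASP.NET.default.xml" `
    "C:\IntelliTraceLogs"

# Snapshot the current .itrace file without stopping collection
Checkpoint-IntelliTraceCollection "MyAppPool"

# Stop collecting once the issue has been reproduced;
# the .itrace file in C:\IntelliTraceLogs can then be opened in Visual Studio
Stop-IntelliTraceCollection "MyAppPool"
```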

Product Templating in PowerPoint

Let’s face it, Managers and Stakeholders are not the best at wireframing, and we’re getting tired of napkin drawings. Microsoft is working on an add-in for PowerPoint which will allow Product Stakeholders, or Developers, to Prototype Applications in PowerPoint. Everyone and their mother can use PowerPoint, especially Business Decision Makers.

This new extension will allow for creation of unique shapes and allow you to create templated shapes for different controls that you may have created for your Applications.

New Features in Team Foundation Server Web Access

Team Foundation Server has been around for quite some time now, and Web Access has been around for nearly as long. However the way we use the web and how interactive the web experience has become has made Web Access look rather out of date. The new features to Web Access include a lot of Ajax-y goodness.

You’ll be able to:

  • Reorder Tasks using Drag and Drop
  • Move Work Items into Sprints
  • Move Features across the Storyboard

This will allow BDMs to re-organize the team’s schedule and work items easily and seamlessly without the need to have Visual Studio Team Explorer.

New Features to Visual Studio for ALM

There were too many new features flying out in this session for me to keep up typing witty comments around the features, so here is a list of Great Enhancements to Visual Studio in vNext:

  • Backing up your environment
    • Avoids long interruptions in context switching
    • Restores tool windows to where they were
  • “The Hub”
    • Development Dashboard when you enter VS.Next
    • Build onto Team Navigator
  • Analyze Solution for Code Clones
    • Finds Copy/Paste
    • Finds Refactor Points
      • Semantically similar Code Blocks
      • Exact “Copy/Paste” Code snippets across a solution
  • Unit Test Explorer
    • Extenders: xUnit, nUnit, more to come…
    • runs much faster!
  • Provisional Tab
    • Helps avoid Clutter in the Doc-Well

I know this isn’t very useful, but I’ll be sure to expand on each of these points in coming posts.

Thanks for Joining me for my Live Blog of the Foundation Session on the Future of ALM on the Microsoft Platform.

Continuous Integration in the Cloud

At the recent At the Movies event put on by ObjectSharp, I demonstrated how to automate deployment of Windows Azure Applications in a TFS Build using a custom TFS Workflow Activity. Automated Deployment to Windows Azure is useful functionality, as it removes a rather tedious, repetitive task from our daily routine.

There are a number of ways to automate the Deployment Process to Windows Azure. In this entry I’ll outline how you can use TFS Builds or PowerShell to Automate Windows Azure Deployments.

Using TFS Build Activities to Deploy to Windows Azure

To begin Automating Windows Azure Deployments today, download the Deploy To Azure open source project on CodePlex. To understand how to use the Deploy to Azure Activities, read the documentation on the CodePlex project page.

Deploying a Windows Azure Application is considered Management Functionality, which means you are required to upload a Management Certificate to your Windows Azure Account. This Management Certificate [which can be Self Signed] is used to Authenticate remote access to your Windows Azure Portal [Read: How to Create and Export a Certificate].
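For reference, a self-signed management certificate can be generated with the makecert tool from the Windows SDK; the subject name and file name below are just examples. The resulting .cer file is what gets uploaded to the Management Certificates section of the portal, while the private key stays in your local certificate store.

```powershell
# Create a self-signed certificate in the CurrentUser\My store and
# export the public key to a .cer file for upload to the Azure portal
makecert -r -pe -a sha1 -n "CN=Azure Management" -ss My `
    -len 2048 -sy 24 AzureManagement.cer
```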

Once you have your Management Certificate uploaded to the Windows Azure Portal, you will be able to use the Certificate to interact with the Windows Azure Service Management API. If you wish to build your own set of TFS Build Activities like the ones mentioned above, Microsoft has created some Sample Code which is a .NET Wrapper of the Management API.

Using Powershell to Deploy to Windows Azure

If you’re an IT Pro or a Developer that is into scripting, it is possible to use PowerShell to deploy to Windows Azure. Ryan Dunn, while he was in the Technical Evangelist role at Microsoft [he recently moved to Cumulux], created a set of Commands in the Azure Management Tools Snap-in which allow you to leverage the Windows Azure Service Management API using PowerShell. Since Ryan’s departure, Wade Wegner has taken over the project and has been maintaining updates to the CmdLets with each change of the Windows Azure SDK.

PowerShell is very powerful and I can see it becoming a very important part of Windows Azure Development. Just to give the Windows Azure CmdLets a try, I created a re-usable PowerShell script that will deploy an application to Windows Azure. [This script needs to be executed from the same directory as the ServiceConfiguration.cscfg file, or modified to accept the path of the Service Configuration file as an argument.]
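A minimal sketch of such a script, assuming the 2011-era Azure Management Tools Snap-in is installed. [The service name, subscription ID and certificate thumbprint are placeholders, and parameter names should be double-checked against Get-Help for your version of the CmdLets.]

```powershell
# Load the Windows Azure Service Management CmdLets
Add-PSSnapin AzureManagementToolsSnapIn

$serviceName = "myhostedservice"            # placeholder hosted service name
$subId       = "<subscription-guid>"        # placeholder subscription ID
$cert        = Get-Item "cert:\CurrentUser\My\<thumbprint>"  # management cert

# Upload the package and configuration to the Staging slot
New-Deployment -ServiceName $serviceName -SubscriptionId $subId `
    -Certificate $cert -Slot Staging `
    -Package .\MyApp.cspkg -Configuration .\ServiceConfiguration.cscfg `
    -Label "Automated Deployment" |
    Get-OperationStatus -WaitToComplete

# New-Deployment only uploads the bits; a second call is needed
# to actually start the role instances
Set-DeploymentStatus -ServiceName $serviceName -SubscriptionId $subId `
    -Certificate $cert -Slot Staging -Status Running |
    Get-OperationStatus -WaitToComplete
```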

As you can see, the New-Deployment command will only upload your deployment; it is necessary to execute a second command, Set-DeploymentStatus, in order to start your application after it’s been deployed to the Cloud.


Automating deployment of your applications into Windows Azure is a great way to take repetitive, time-intensive tasks out of your day-to-day schedule. Whether you use TFS or another Source Code Repository and automated build agent, automated deployment is available to you.

AzureFest: Open Source Presentation

Undoubtedly by now you have heard of AzureFest, and with any luck you have been out to one of the events [if you live in the GTA]. For the rest of you who haven’t been able to experience the event, I wanted to take the opportunity to introduce you to what AzureFest is and why you might be interested in the event itself.

Windows Azure Data Center Locations

What is AzureFest?

At its core, AzureFest is a talk that focuses on a few barriers to Windows Azure Adoption, including Pricing, Registration, Platform Confusion and Coding/Deployment. This is not your Grandma’s Windows Azure Presentation; it includes both a lecture and a hands-on component, which is rare for a Community Event.

Why Talk Pricing?

Simple: pricing is the first question I get asked at the end of every presentation I’ve done to date, so why not talk about it first? Pricing goes hand-in-hand with the Platform, which means not only do you get to understand what the Windows Azure Platform consists of, but you also get an understanding of what it will cost. Finally, it would be rather irresponsible not to talk about the costs of Windows Azure when the first Hands-on-Lab is a walkthrough of the registration process.

What Will I Learn?

Besides the Overview of the Platform and the Pricing Strategies, each attendee who participates in the Labs will learn:

  • How to Register for a Windows Azure Platform Subscription
  • How to Create, Manage, Configure and Leverage a SQL Azure Database
  • How to Create and Configure a Windows Azure Storage Service Account
  • How to Create & Deploy a Project to Windows Azure Compute

Attendees will also learn some of the gotchas around the Tool Installation/Configuration Process and some strategies on how to debug your cloud-based solutions both on-premises [using the Compute Emulator] and “In The Cloud”.

Windows Azure CDN Locations

Bonus… We’re giving it away!

In the spirit of growing adoption of the Windows Azure Platform within Canada [or any country for that matter], ObjectSharp is releasing the content as an Open Source Presentation. This means it is FREE for anyone to download, learn from and/or deliver.

If you are interested in doing an AzureFest presentation in your area, download the Resources for AzureFest. The resources include:

  • An AzureFest Slide Deck
  • Hands-on-Lab Kit [Ready to deploy cspkg and cscfg files]
  • Modified NerdDinner Source Code for Hands-on-Lab

If you have specific questions about delivering an AzureFest presentation, or need clarification on the content, please direct your questions to me via twitter.