Windows Boxes for Vagrant Courtesy of Modern.ie

The following post releases experimental bits for feedback purposes.

If you’re like me, a clean development environment is crucial to being effective. I used to carry around a portable hard drive with my golden image (starting point VM) and a number of other environments I’d already configured for the projects I was working on, whether for development, testing (to address side-by-side browser issues), or other needs.

One of the sites that helped keep my environments simple was Modern.ie, as it provides a series of Virtual Machine images with multiple versions of Windows and different versions of Internet Explorer installed. These images are available to users on Mac, Linux or Windows machines by taking advantage of different virtualization technologies, including Hyper-V, Parallels, VirtualBox and VMware Player/Fusion.

I’m pleased to announce a new way to leverage the Modern.ie VMs for your testing purposes: Vagrant. If you aren’t familiar with it, Vagrant is a handy tool for your tool belt, used to create and configure lightweight, reproducible and portable development environments.
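As a rough sketch, a Vagrantfile for one of these boxes might look like the following. Note that the box name shown here is a hypothetical placeholder, not an actual published name; check Modern.ie for the real box names.

```ruby
# Minimal Vagrantfile sketch. "modernIE/win7-ie11" is a hypothetical
# placeholder box name; substitute the name published on Modern.ie.
Vagrant.configure("2") do |config|
  config.vm.box = "modernIE/win7-ie11"

  config.vm.provider "virtualbox" do |vb|
    vb.memory = 2048  # Windows guests benefit from extra RAM
  end

  # Provisioning is unavailable until OpenSSH ships in the images
  # (see the Known Issues below).
end
```

After that, a plain `vagrant up` brings the Windows box online under VirtualBox.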

A special thank you to the Modern.ie team for their hard work in making these VMs available to Vagrant users. Be sure to read the License Terms linked below, which are outlined on the Modern.ie site.

The Microsoft Software License Terms for the IE VMs are included in the release notes and supersede any conflicting Windows license terms included in the VMs.

By downloading and using this software, you agree to these license terms.

Known Issues

  • Currently only the VirtualBox provider is supported
  • OpenSSH is not installed at this time, which means provisioning is disabled

We would like to hear your feedback; reach out to @IEDevChat on Twitter.

How to Avoid Big Cloud Computing Bills

A long time ago in a galaxy far, far away… Ok, maybe not, but four years ago I did write a blog post, Clearing the Skies around Windows Azure, aimed at helping people understand the pricing model of the Cloud. It seems not much has changed in the past four years: pricing remains the biggest concern when investigating a move to the Cloud. However, there have been a number of changes in Microsoft Azure to help address this concern.

Azure Spending Limits

If you have a Microsoft Developer Network (MSDN), BizSpark or Microsoft Partner Network (MPN) account, you can activate benefits for Azure credits. These accounts come with protection against spending additional dollars on top of the given credits: a Spending Limit, which is initially set to $0. A $0 spending limit means that you do not want to be charged anything over and above the credits you receive as part of the benefit.

This $0 spending limit is achieved by disabling the subscription when the benefit amount is reached, which in turn suspends access to resources (storage and SQL become read-only) and deletes resources that would continue to accrue charges under normal operation (Cloud Services, VMs, etc.). The subscription is re-activated at the beginning of the next billing cycle, provided more credits are available.

image

It is possible to change the spending limit as described in Change the Azure Spending Limit, which supports the following actions:

  • Remove spending limit indefinitely
  • Remove spending limit for the current billing period
  • Turn on spending limit in the next billing period <start date of billing period>
  • Keep my current spending limit option

If you remove the spending limit, the earliest you can turn it back on is at the start of the next billing cycle.

Azure Billing Alerts (Preview)

Whereas Azure Spending Limits provide a way to avoid spending additional money, Azure Billing Alerts (in preview at the time of writing this blog entry) provide a way to set an alert when you reach a spending threshold with your subscription’s billable resources.

How to Setup a Billing Alert on Microsoft Azure

  1. Log in to the Account Center using the Microsoft Account for your subscription.
  2. Select the subscription you wish to add a Billing Alert for.
    image
  3. Select the Alerts menu item
    image
  4. Select Add Alert
    image
  5. Fill out the Alert form with the following details:
    1. Alert Name
    2. Alert For
      1. Billing Total
      2. Monetary Credits
    3. Amount Spent
    4. Email Recipient 1
    5. Email Recipient 2 (Optional)

    image

  6. Click Save.

Installing CakePHP in the Microsoft Azure Preview Portal

A while back (last year, actually) I wrote a blog post on how you can Install CakePHP on Windows Azure Web Sites using the App Gallery. At Build 2014 we introduced a new Preview Portal, which offers much more to an application owner, including in-place billing information, application analytics and a whole new way to visualize your cloud experience.

In this post, I’ll show you how to create a new CakePHP application via the Preview Portal.

If you’re an experienced CakePHP Developer, you might want to check out Sunitha Muthukrishna’s blog post on using CakePHP’s Bake Console Tool on Microsoft Azure Websites.

Install CakePHP on Azure Web Sites

From the Start Board, select the Azure Gallery.

image

This will open the App Gallery Blade, where you can select from a list of categories. Select Web, then CakePHP.

image

This will start the CakePHP installation; select Create. Thus begins your journey.

image

You’ll need to create a new resource group. Enter a name for your Resource Group, then click on the Website configuration.

image

You’ll need to select your Hosting Plan. For this demo, I created a free site.

image

Then configure the application by clicking on Web App Settings. Set the Security Salt and Cipher Seed.

image

Then select the datacenter location you’d like to deploy your application to.

image

Click OK to finish the Web Site Configuration and move on to create the Database.

image

Select Database.

image

Accept the Terms for the ClearDB database.

image

Select a different subscription, if required. Then click Create.

image

Your site has started to deploy and should be ready for you to start creating within seconds.

image

You can monitor your application, change settings, or set up source control from your new site.

image

Enjoy!

How to backup a Web Site running on Microsoft Azure Web Sites

Keeping regular backups is important for anything on the web, nay, anything in technology, especially for mission-critical applications, enterprise applications, or keeping your meme generator application from Build 2014 [not sure what I’m talking about? Watch the Day 2 Keynote].

In this example, I’m going to outline how I keep a backup of my blog, yes, the one you’re reading right now. It runs on WordPress and represents a good portion of my journey into a career in technology. That means it’s countless hours of my time, and I continuously have the opportunity to rediscover what I’ve done in the past after a quick Bing search on something I’m currently working on.

Take a Backup of a Web Site

In the Microsoft Azure Management Portal select the Web Site you wish to backup.

image

As you can see in the image below, I run my site as a Shared Web Site. This provides enough resources for people navigating my blog to get an excellent experience without it being too heavy on my pocketbook.

image

The backup feature of Web Sites only works in Standard, so for now, I’m going to scale my site to Standard. This is as simple as clicking on the Standard button, then clicking Save in the command bar at the bottom of the screen.

image

Once I click the Save button, I am warned that scaling to Standard will increase my costs, but I’m not too worried as I’ll be scaling back down to Shared again shortly.

image

After the scaling task finishes, I can use the form in the Backups tab to select the storage account I wish to have my backups saved to, the frequency at which they are saved, as well as a database which is linked to my Web Site as a Linked Resource in the previous tab.

image

So I’ll select my favourite storage account.

image

And include my ClearDB database which is linked to my site to be backed up as well.

image

Then I’m only one click away from knowing all my archived hard work is saved for me in my storage account.

image

After the backup is done, pay attention because this is important: I go back into the Scale tab and scale my site back down from Standard to Shared. This moves me back into the lower billing range that I am comfortable running my site in.

What does Microsoft Azure Web Sites Back Up?

In the image below you can see two files which identify a backup. The first is an XML file describing, at a high level, the site that was backed up, including the custom domain, the web site service domain, and the name of the database that was backed up. The second is a zip file containing a backup of your site, which I will outline in more detail below.

image

Here is a quick snapshot of the contents of the zip file: an fs folder, a temp folder, a meta file and a file named the same as your database.

image

What is in the Azure Web Site Backup zip folder

FS – if you haven’t already guessed it, FS stands for File System. This retains a snapshot of the file system of your web site at the time the backup was taken, including both the site and logFiles folders, so you have access to anything you would need.

Temp – My temp folder was unused.

Meta – This is an XML file which describes all aspects of your website including, but not limited to, Custom Domains, Configured SSL Certificates, App Settings, Connection Strings, Default Document settings, Handler Mappings (for custom FastCGI handlers), Remote Debugging and Web Sockets. I could go on, but I believe you get the picture: if it’s something you set in the portal for your web site, it’s backed up in this file.

Database Name – In my case, I had a MySQL database selected, so this file is a MySQL dump file. This can be used to completely restore my database from schema to data.
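As a sketch, restoring from such a dump file is a standard mysql client invocation; the host, user, database and file names below are placeholders:

```
mysql -h <host> -u <user> -p <database-name> < <database-name>.sql
```

The `-p` flag prompts for the password; the dump file replays both the schema and the data into the target database.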

Automating Environment Creation using Microsoft Azure PowerShell

During preparations for the Web Demo, presented by Mads Kristensen at Build 2014 [watch the day 2 keynote], it was necessary to stand up multiple environments to do development, testing and dry runs of the presentation. The quickest way to accomplish this was to script out the environment provisioning with PowerShell and the Microsoft Azure PowerShell and Cross Platform Command Line Tools.

Download the Tools

In order to automate your Microsoft Azure Environments, it’s helpful to download and install the following tools.

Microsoft Azure PowerShell Tools

Microsoft Azure Cross Platform Tools

Environment Automation Script

Considering the size of our environment, the number of stamps required, and the need to be agile to changes in the environments, it was necessary to build out an environment setup script for our Azure resources.

Our environment consists of:

  • 3 Azure Web Sites, 1 Staging Slot
  • 1 Azure Storage Account, 2 Storage Containers
  • 1 Traffic Manager Namespace, 2 Traffic Manager endpoints

Storage and Web Sites are easy to automate with PowerShell or the Cross Platform Command Line Tools. However, there isn’t currently a way to automate Traffic Manager without programming directly against the Microsoft Azure Service Management API.

To support the creation of multiple assets, we used hash tables: each entry stores a unique name for the Web Site or Storage account as the key and the region for creation as the value. This fulfills the minimum requirements for creating an Azure Web Site or an Azure Storage Account.

Leveraging a PowerShell cmdlet, the previous environment is deleted in the begin function, then the environment is rebuilt in the process function. Service creation is an asynchronous action, so we took advantage of Jobs in PowerShell to handle creation in background threads. If additional configuration is needed on a service, the Job is waited upon until completion before additional service calls are performed on the service.
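A minimal sketch of this hash table plus background Jobs pattern follows; the site and storage account names and regions are illustrative placeholders, and the cmdlets are from the Azure service-management PowerShell module of that era, so treat this as an outline rather than our actual script:

```powershell
# Sketch only — names and regions below are placeholders.
$websites = @{ "mysite-east" = "East US"; "mysite-west" = "West US" }
$storage  = @{ "mystorageacct" = "East US" }

# Create each Web Site in a background Job so provisioning runs in parallel
$jobs = foreach ($name in $websites.Keys) {
    Start-Job -ScriptBlock {
        param($n, $region)
        New-AzureWebsite -Name $n -Location $region
    } -ArgumentList $name, $websites[$name]
}

# Storage accounts are created inline
foreach ($name in $storage.Keys) {
    New-AzureStorageAccount -StorageAccountName $name -Location $storage[$name]
}

# Wait for the site Jobs to finish before any follow-up configuration calls
$jobs | Wait-Job | Receive-Job
```

The key design choice is letting the slow, asynchronous creations run as Jobs and only blocking with Wait-Job when a later call actually depends on the resource existing.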

Here is the script we called for each stamp of our environment:

Stamp Automation Script

The Environment Automation Script takes care of the heavy lifting, while the Stamp automation script is responsible for describing the resources required in the Environment.

First, we must register the environment automation script so that it can be called within the same process that is executing our stamp automation script. Then we describe the stamp in variables and pass it to the environment setup script.

Conclusion

We generated custom scripts to suit our needs, but this isn’t something that you would need to do. Introduced in the VS 2013 Update 2 release and the Azure SDK 2.3 tools, you can have PowerShell scripts generated for your Web Application directly within the Visual Studio File New Project dialog (for more details, read Using Windows PowerShell Scripts to Publish to Dev and Test Environments).

Building a Windows Azure Boot Camp (Training) Resource Kit

Before joining Microsoft, I used to run a local user group (monthly), a geeky social hour (weekly) and you’d often find me doing some (free-as-in-beer) public training sessions on Windows Azure. One of the hardest things about running an event is needing to rely on the internet connection of whatever space you could find to host your event. No matter how well you plan, unleashing hundreds of geeks on a WiFi connection at a venue is bound to cause some issues.

One important part of an organizer’s tool belt is providing an offline install option for those attendees who filled out their contact details with an auto-form tool and didn’t read your carefully crafted prerequisites list. Making one of these handy resources isn’t very straightforward, so I thought I would help out anyone looking to host an event.

If you’d like to host or attend an event, keep an eye out for the Global Windows Azure Boot Camp.

Recently, we’ve been making some changes to how we deliver the Windows Azure Training Kit, Web Camps Training Kit, Data Camps Training Kit & Enterprise Developer Training Kit. Delivery is now done through the Web Platform Installer, which provides an excellent way to configure IIS and download tools & frameworks. This will allow us to provide dependencies for the training kits alongside the content, making it easier to ensure your machine is configured to use the content. This blog post acts as an interim step until the dependencies ship alongside the kits, but it will also remain useful after the dependencies become part of the training kit install.

Make a WebPI Offline Package

First you will need to create an offline copy of the contents required for your boot camp. This is easily done using the WebPICmd.exe tool, which is included in the Web Platform Installer installation directory. The /Offline switch will download the installer files and required metadata for the given /Products to the specified /Path.
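As an illustration, an invocation along the following lines produces the offline package; the product ID and path shown here are examples, so check the WebPI product feed for the exact product IDs you need:

```
WebPICmd.exe /Offline /Products:WindowsAzureTrainingKit /Path:C:\WebPIOffline
```

Multiple product IDs can be supplied to /Products as a comma-separated list, so one run can stage everything your attendees will need.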

image

Offline Install Flavours

I’ve created a few offline installation scripts that will help attendees install the tools they need.

image

The total size of this package is approximately 2GB (a perfect use for those USB keys you collect from conferences).

Changing the Training Kit Install Directory

Warning: If you plan on updating the Install Directory of the Windows Azure Training Kit, be sure to copy the USB key to your local drive before making any modifications.

I’ve heard a lot of feedback around customizing the install directory of the Training Kit. This can be achieved by updating the offline product list found at feeds\latest\WebProductsList.xml after running OfflineMaster.cmd. We use the %HomeDrive% environment variable to select an install location, simply replace %HomeDrive% with a specific location to customize the install directory.
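A quick way to make that substitution is a small PowerShell snippet; the target path D:\TrainingKit below is just an example of a specific install location:

```powershell
# Replace %HomeDrive% with a concrete path in the offline product list
$feed = "feeds\latest\WebProductsList.xml"
(Get-Content $feed) -replace '%HomeDrive%', 'D:\TrainingKit' | Set-Content $feed
```

Run this from the root of the offline package after OfflineMaster.cmd has completed.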

Clearing the WebPI Cache

Sometimes it’s necessary to clear the WebPI cache, so I’ve made a simple PowerShell script to help clear the cache automatically.

Note: It’s only necessary to clear the WebPI cache if for some reason you are not getting refreshed content that you know is available.
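A sketch of such a cache-clearing script might look like the following; the cache location shown is an assumption based on a default WebPI install, so verify the path on your machine before deleting anything:

```powershell
# Assumed default WebPI cache location — verify before deleting
$cache = Join-Path $env:LocalAppData "Microsoft\Web Platform Installer"
if (Test-Path $cache) {
    Remove-Item -Path $cache -Recurse -Force
}
```

WebPI will repopulate the cache with fresh content the next time it runs.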