Connecting to Azure Redis Cache Service from Python

Recently I had someone ask me how to connect to the Microsoft Azure Redis Cache Service from Python. I figured it would be easy, considering how simple Python is to learn. It turned out to be a little trickier than expected, but still not too hard.

Below is the sample code; then I’ll explain what each part does.

import redis

# Replace [cache-name] and [access-key] with your cache's name and access key.
# Port 6380 is the SSL port, and ssl=True is required by Azure Redis Cache.
r = redis.StrictRedis(host='[cache-name].redis.cache.windows.net',
                      port=6380, db=0, password='[access-key]', ssl=True)
r.set('foo', 'bar')
result = r.get('foo')

print(result)

To start, you will need to install redis-py (or a Redis client of your choice) from your favourite package installer; I’m using pip.

pip install redis

There are three things you need to be aware of when connecting to Redis on Azure:

  1. SSL
  2. Password
  3. Port

Most of the time when you connect to a Redis server it will be on your local machine, which is fairly secure because there is no need for an outbound connection to the internet. When connecting to a cloud server, there are many more things that can go wrong in the security department, so Microsoft Azure Redis Cache puts a few measures in place to avoid security issues.

First is a secure connection (SSL): when you connect to Azure, you want to ensure that the data going across the wire is encrypted. Second, a password is used to authenticate access to the cache. Finally, the port number has been changed from the default (6379) to 6380 for the secure connection.

The key when using any Redis library is to ensure that it supports these three things. Even once you know the client supports them, it may still take some investigating to ensure they are properly enabled when attempting to connect.

When looking at the connection object in Python, you’ll notice that SSL is explicitly set to True. This is required, or you will receive an exception: ConnectionError: Error while reading from socket: (10054, ‘An existing connection was forcibly closed by the remote host’).

Happy Clouding!

Windows Boxes for Vagrant Courtesy of Modern.ie

The following post releases experimental bits for feedback purposes.

If you’re like me, a clean development environment is crucial to being effective. I used to carry around a portable hard drive with my golden image (starting-point VM) and a number of other environments I had already configured for the projects I was working on. These could serve a number of different purposes: development, testing (to address side-by-side browser issues), etc.

One of the sites that helped keep my environments simple was Modern.ie, as it provided a series of virtual machine images covering multiple versions of Windows with different versions of Internet Explorer installed. These images are available to users on a Mac, Linux or Windows machine by taking advantage of different virtualization technologies, including Hyper-V, Parallels, VirtualBox and VMware Player/Fusion.

I’m pleased to be announcing a new way to leverage the Modern.ie VMs for your testing purposes: Vagrant. If you aren’t familiar with it, Vagrant is a handy tool for your tool belt, used to create and configure lightweight, reproducible and portable development environments.

A special thank you to the Modern.ie team for their hard work making these VMs available to Vagrant users. Read the License Terms, which are offered in the link below and outlined on the Modern.ie site.

The Microsoft Software License Terms for the IE VMs are included in the release notes and supersede any conflicting Windows license terms included in the VMs.

By downloading and using this software, you agree to these license terms.

Known Issues

  • Currently only the VirtualBox provider is supported
  • OpenSSH is not installed at this time which disables Provisioning

We would like to hear your feedback; reach out to @IEDevChat on Twitter.

How to Avoid Big Cloud Computing Bills

A long time ago in a galaxy far far away… OK, maybe not, but 4 years ago I did write a blog post, Clearing the Skies around Windows Azure, which was aimed at helping people understand the pricing model of the Cloud. It seems not much has changed in the past 4 years: pricing is still the biggest concern when investigating a move to the Cloud. However, there have been a number of changes in Microsoft Azure to help address this concern.

Azure Spending Limits

If you have a Microsoft Developer Network (MSDN), BizSpark or Microsoft Partner Network (MPN) account, you are able to activate benefits for Azure credits. These accounts come with a form of protection against spending additional dollars on top of the credits you are given, in the form of a Spending Limit which is initially set to $0. A $0 spending limit means that you do not want to be charged anything over and above the credits you receive as part of the benefit.

This $0 spending limit is achieved by disabling the subscription when the benefit amount is reached, which in turn suspends access to resources (storage and SQL become read-only) and deletes resources that would continue to accrue charges under normal operation (Cloud Services, VMs, etc.). The subscription is re-activated at the beginning of the next billing cycle, provided more credits are available.

image

It is possible to change the spending limit as described in Change the Azure Spending Limit, which supports the following actions:

  • Remove spending limit indefinitely
  • Remove spending limit for the current billing period
  • Turn on spending limit in the next billing period <start date of billing period>
  • Keep my current spending limit option

If you remove the spending limit, the earliest you can turn it back on is at the start of the next billing cycle.

Azure Billing Alerts (Preview)

Whereas Azure Spending Limits provide a way to avoid spending additional money, Azure Billing Alerts (currently in preview at the time of writing this blog entry) provide a way to set an alert when you reach a spending threshold with your subscription’s billable resources.

How to Setup a Billing Alert on Microsoft Azure

  1. Log in to the Account Center using the Microsoft Account for your subscription
  2. Select the subscription you wish to add a Billing Alert for.
    image
  3. Select the Alerts menu item
    image
  4. Select Add Alert
    image
  5. Fill out the Alert form with the following details:
    1. Alert Name
    2. Alert For
      1. Billing Total
      2. Monetary Credits
    3. Amount Spent
    4. Email Recipient 1
    5. Email Recipient 2 (Optional)

    image

  6. Click Save.

Installing CakePHP in the Microsoft Azure Preview Portal

A while back, last year actually, I wrote a blog post on how you can Install CakePHP on Windows Azure Web Sites using the App Gallery. At Build 2014 we introduced a new Preview Portal, which offers much more to an application owner, including in-place billing information, application analytics and a whole new way to visualize your cloud experience.

In this post, I’ll show you how to create a new CakePHP application via the Preview Portal.

If you’re an experienced CakePHP developer, you might want to check out Sunitha Muthukrishna’s blog post on using CakePHP’s Bake Console Tool on Microsoft Azure Websites.

Install CakePHP on Azure Web Sites

From the Start Board, select the Azure Gallery.

image

This will open the App Gallery Blade, where you can select from a list of categories. Select Web, then CakePHP.

image

This will start the CakePHP installation; select Create. Thus begins your journey.

image

You’ll need to create a new resource group. Enter a name for your Resource Group, then click on the Website configuration.

image

You’ll need to select your Hosting Plan. For this demo, I created a free site.

image

Then configure the application by clicking on Web App Settings. Set the Security Salt and Cipher Seed.

image

Then select the datacenter location you’d like to deploy your application to.

image

Click OK to finish the Web Site Configuration and move on to create the Database.

image

Select Database.

image

Accept the Terms for the ClearDB database.

image

Select a different subscription, if required. Then click Create.

image

Your site has started to deploy and should be ready for you to start creating within seconds.

image

You can monitor your application, change settings, or set up source control from your new site.

image

Enjoy!

How to back up a Web Site running on Microsoft Azure Web Sites

Keeping regular backups is important for anything on the web (nay, in technology), especially for mission-critical applications, enterprise applications, or keeping your meme generator application from Build 2014 [not sure what I’m talking about? Watch the Day 2 Keynote].

In this example, I’m actually going to outline how I keep a backup of my blog, yes the one you’re currently reading right now. It is running on WordPress and represents a good portion of my journey into a career in technology. That means countless hours of my time are in it, and I continuously have the opportunity to re-read what I’ve done in the past after doing a quick Bing search on something I’m currently working on.

Take a Backup of a Web Site

In the Microsoft Azure Management Portal select the Web Site you wish to backup.

image

As you can see in the image below, I run my site as a Shared Web Site. This provides me with enough resources for people navigating my blog to get an excellent experience without being too heavy on my pocketbook.

image

The backup feature of Web Sites only works in Standard, so for now I’m going to scale my site to Standard. This is as simple as clicking on the Standard button, then clicking on Save in the command bar at the bottom of the screen.

image

Once I click on the Save button, I am prompted with a warning that scaling to Standard will increase my costs, but I’m not too worried as I’ll be scaling back down to Shared again shortly.

image

After the scaling task finishes, I’ll be able to use the form in the Backups tab to select the storage account I wish to have my backups saved to, the frequency at which they are saved, as well as a database which is linked to my Web Site as a Linked Resource in the previous tab.

image

So I’ll select my favourite storage account.

image

And include my ClearDB database, which is linked to my site, to be backed up as well.

image

Then I’m only one click away from knowing all my archived hard work is saved for me in my storage account.

image

After the backup is done (pay attention, because this is important), I go back into the Scale tab and scale my site back down from Standard to Shared. This moves me back down into the lower billing range that I am comfortable running my site in.

What does Microsoft Azure Web Sites Back Up?

In the image below you can see two files which identify a backup. The first is an XML file describing, at a high level, the site that was backed up, including the custom domain, the web site’s service domain, and the name of the database which was backed up. The second is a zip file containing a backup of your site, which I will outline in more detail below.

image

Here is a quick snapshot of the contents of the zip file: a fs folder, a temp folder, a meta file and a file named the same as your database.

image

What is in the Azure Web Site Backup zip folder

FS – if you haven’t already guessed it, FS stands for File System. This retains a snapshot of the file system of your web site at the time the backup was taken. This includes both the site and logFiles folders so you have access to anything you would need.

Temp – My temp folder was unused.

Meta – This is an XML file which describes all aspects of your website, including but not limited to custom domains, configured SSL certificates, app settings, connection strings, default document settings, handler mappings (for custom FastCGI handlers), remote debugging, and Web Sockets. I could go on, but I believe you get the picture: if it’s something you set in the portal for your web site, it’s backed up in this file.

Database Name – In my case, I had a MySQL database selected, so this file is a MySQL dump file. This can be used to completely restore my database from schema to data.

Automating Environment Creation using Microsoft Azure PowerShell

During preparations for the web demo presented by Mads Kristensen at Build 2014 [watch the day 2 keynote], it was necessary to stand up multiple environments for development, testing and dry runs of the presentation. The quickest way to accomplish this was to script out the environment provisioning with the Microsoft Azure PowerShell and Cross Platform Command Line Tools.

Download the Tools

In order to automate your Microsoft Azure Environments, it’s helpful to download and install the following tools.

Microsoft Azure PowerShell Tools

Microsoft Azure Cross Platform Tools

Environment Automation Script

Considering the size of our environment, the number of stamps required, and the need to adapt quickly to changes in the environments, it was necessary to build out an environment setup script for our Azure resources.

Our environment consists of:

  • 3 Azure Web Sites, 1 Staging Slot
  • 1 Azure Storage Account, 2 Storage Containers
  • 1 Traffic Manager Namespace, 2 Traffic Manager endpoints

Storage and Web Sites are easy to automate with PowerShell or the Cross Platform Command Line Tools. However, there isn’t currently a way to automate Traffic Manager without programming directly against the Microsoft Azure Service Management API.

To support the creation of multiple assets, hash tables are used to map a unique name for each Web Site or Storage account to the region it should be created in. This fulfills the minimum requirements for creating an Azure Web Site or an Azure Storage account.
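
For illustration only (the names and regions here are placeholders, not the ones from the actual demo), a stamp’s Web Sites and storage accounts could be described with hash tables like this:

$webSites = @{
    'builddemo-east' = 'East US'
    'builddemo-west' = 'West US'
}

$storageAccounts = @{
    'builddemostorage' = 'East US'
}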

Leveraging a PowerShell cmdlet, the previous environment is deleted in the begin function, then the environment is rebuilt in the process function. Service creation is an asynchronous action, so we took advantage of Jobs in PowerShell to handle creation in background threads. If additional configuration is needed on a service, the Job is waited upon until completion before additional service calls are performed on the service.
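
The original environment script isn’t reproduced here, but a minimal sketch of that pattern, assuming the Azure Service Management cmdlets of the time (Remove-AzureWebsite, New-AzureWebsite, New-AzureStorageAccount) and that the Azure module and a default subscription are already configured on the machine, might look like this:

function New-DemoEnvironment
{
    [CmdletBinding()]
    param (
        [hashtable]$WebSites,        # web site name -> region
        [hashtable]$StorageAccounts  # storage account name -> region
    )

    begin
    {
        # Tear down the previous environment so every run starts from a clean slate.
        foreach ($name in $WebSites.Keys) {
            Remove-AzureWebsite -Name $name -Force -ErrorAction SilentlyContinue
        }
        foreach ($name in $StorageAccounts.Keys) {
            Remove-AzureStorageAccount -StorageAccountName $name -ErrorAction SilentlyContinue
        }
    }

    process
    {
        # Creation is asynchronous, so kick each service off in a background Job.
        $jobs = @()
        foreach ($name in $WebSites.Keys) {
            $jobs += Start-Job -ScriptBlock {
                param($n, $r)
                New-AzureWebsite -Name $n -Location $r
            } -ArgumentList $name, $WebSites[$name]
        }
        foreach ($name in $StorageAccounts.Keys) {
            $jobs += Start-Job -ScriptBlock {
                param($n, $r)
                New-AzureStorageAccount -StorageAccountName $n -Location $r
            } -ArgumentList $name, $StorageAccounts[$name]
        }

        # Wait for the Jobs to complete before performing any follow-up configuration calls.
        $jobs | Wait-Job | Out-Null
    }
}

Start-Job runs each creation in its own background PowerShell process, which is what allows several services to be provisioned in parallel; Wait-Job blocks until they have all finished.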

The script we called for each stamp of our environment is described below.

Stamp Automation Script

The Environment Automation Script takes care of the heavy lifting, while the Stamp automation script is responsible for describing the resources required in the Environment.

First we must register the environment automation script so that it can be called within the same process that is executing our stamp automation script, then describe the stamp in variables and pass them to the environment setup script.
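
A stamp script following that flow might look like the sketch below (the file name, function name, and resource names are placeholders, not the originals); dot-sourcing the environment script is what makes its function callable in the current process:

# Load the environment automation script into the current session (dot-sourcing).
. .\New-DemoEnvironment.ps1

# Describe this stamp's resources.
$webSites = @{
    'builddemo-east'    = 'East US'
    'builddemo-west'    = 'West US'
    'builddemo-staging' = 'East US'
}

$storageAccounts = @{
    'builddemostorage' = 'East US'
}

# Hand the description off to the environment setup function.
New-DemoEnvironment -WebSites $webSites -StorageAccounts $storageAccounts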

Conclusion

We generated custom scripts to suit our needs, but this isn’t something you would need to do. With the Visual Studio 2013 Update 2 release and the Azure SDK 2.3 tools, you can have PowerShell scripts generated for your Web Application directly within the Visual Studio File > New Project dialogue (for more details, read Using Windows PowerShell Scripts to Publish to Dev and Test Environments).