Connecting to Azure Redis Cache Service from Python

Recently I had someone ask me how to connect to the Microsoft Azure Redis Cache Service from Python. I figured it would be easy considering how simple Python is to learn. It turned out to be a little trickier than expected, but still not too hard.

Below is the sample code; after that, I’ll explain what each part is doing.

import redis

# Connect to the Azure Redis Cache over SSL (port 6380), using the access key as the password
r = redis.StrictRedis(host='[cache-name].redis.cache.windows.net',
                      port=6380, db=0, password='[access-key]', ssl=True)

r.set('foo', 'bar')
result = r.get('foo')

print(result)

To start, you will need to install redis-py (or a Redis client of your choice) using your favourite package manager; I’m using pip.

pip install redis

There are three things you need to be aware of when connecting to Redis on Azure:

  1. SSL
  2. Password
  3. Port

Most of the time when you connect to a Redis server it will be on your local machine, which is fairly secure because there is no need for a connection out over the internet. When connecting to a cloud server there is far more that could go wrong in the security department, so the Microsoft Azure Redis Cache uses a few measures to avoid security issues.

First is a secure connection (SSL): when you connect to Azure you want to ensure that the data going across the wire is encrypted. Second, a password is used to authenticate access to the cache. Finally, the port number has been changed from the default because of the secure connection.

The key when using any Redis library is to ensure that it supports these three things; even once you know the client supports them, it may still take some investigating to ensure they are properly enabled when you attempt to connect.

When looking at the connection object in Python, you’ll notice that ssl is explicitly set to True. This is required; otherwise you will receive an exception: ConnectionError: Error while reading from socket: (10054, 'An existing connection was forcibly closed by the remote host').

Happy Clouding!

Windows Boxes for Vagrant Courtesy of Modern.ie

The following post releases experimental bits for feedback purposes.

If you’re like me, a clean development environment is crucial to being effective. I used to carry around a portable hard drive with my golden image (starting point VM) and a number of other environments I had already configured for the projects I was working on at the time. These served a number of different purposes: development, testing (to address side-by-side browser issues), and so on.

One of the sites that helped make my environments simple was Modern.ie, as it provided a series of Virtual Machine images spanning multiple versions of Windows, each with a different version of Internet Explorer installed. These images are available to users on a Mac, Linux or Windows machine by taking advantage of different virtualization technologies, including Hyper-V, Parallels, VirtualBox and VMware Player/Fusion.

I’m pleased to announce a new way to leverage the Modern.ie VMs for your testing purposes: Vagrant. If you aren’t familiar with it, Vagrant is a handy tool for your tool belt, used to create and configure lightweight, reproducible and portable development environments.

A special thank you to the Modern.ie team for their hard work in making these VMs available to Vagrant users. Please read the License Terms, which are linked below and outlined on the Modern.ie site.

The Microsoft Software License Terms for the IE VMs are included in the release notes and supersede any conflicting Windows license terms included in the VMs.

By downloading and using this software, you agree to these license terms.

Known Issues

  • Currently only the VirtualBox provider is supported
  • OpenSSH is not installed at this time, which disables provisioning
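
If you haven’t used Vagrant before, the basic workflow with one of these boxes looks roughly like the commands below. The box name and URL are hypothetical placeholders (use the ones published on Modern.ie), and note that, per the known issues above, only the VirtualBox provider will work:

# Add a Modern.ie box (the name and URL here are placeholders)
vagrant box add "modern.ie/win7-ie11" http://example.com/win7-ie11.box

# Create a Vagrantfile for the box and boot it with the VirtualBox provider
vagrant init "modern.ie/win7-ie11"
vagrant up --provider virtualbox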

We would like to hear your feedback; reach out to @IEDevChat on Twitter.

Installing CakePHP in the Microsoft Azure Preview Portal

A while back, last year actually, I wrote a blog post on how you can Install CakePHP on Windows Azure Web Sites using the App Gallery. At Build 2014 we introduced a new Preview Portal which offers much more to an application owner, including in-place billing information, application analytics and a whole new way to visualize your cloud experience.

In this post, I’ll show you how to create a new CakePHP application via the Preview Portal.

If you’re an experienced CakePHP developer, you might want to check out Sunitha Muthukrishna’s blog post on using CakePHP’s Bake Console Tool on Microsoft Azure Websites.

Install CakePHP on Azure Web Sites

From the Start Board, select the Azure Gallery.

image

This will open the App Gallery Blade, where you can select from a list of categories. Select Web, then CakePHP.

image

This will start the CakePHP installation; select Create. Thus begins your journey.

image

You’ll need to create a new resource group. Enter a name for your Resource Group, then click on the Website configuration.

image

You’ll need to select your Hosting Plan. For this demo, I created a free site.

image

Then configure the application by clicking on Web App Settings. Set the Security Salt and Cipher Seed.

image
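
For context, these two values feed CakePHP’s security configuration. In a CakePHP 2.x application they end up in app/Config/core.php, roughly like the sketch below; the values shown are placeholders, so generate your own random strings:

// app/Config/core.php (CakePHP 2.x) -- placeholder values, generate your own
Configure::write('Security.salt', 'replace-with-a-long-random-string');
Configure::write('Security.cipherSeed', '76859309657453542496749683645');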

Then select the datacenter location you’d like to deploy your application to.

image

Click OK to finish the Web Site Configuration and move on to create the Database.

image

Select Database.

image

Accept the Terms for the ClearDB database.

image

Select a different subscription, if required. Then click Create.

image

Your site has started to deploy and should be ready for you to start creating within seconds.

image

You can monitor your application, change settings, or set up source control from your new site.

image

Enjoy!

How to backup a Web Site running on Microsoft Azure Web Sites

Keeping regular backups is important for anything on the web, nay technology, especially for mission-critical applications, enterprise applications, or keeping your meme generator application from Build 2014 [not sure what I’m talking about? Watch the Day 2 Keynote].

In this example, I’m actually going to outline how I keep a backup of my blog, yes, the one you’re reading right now. It runs on WordPress and represents a good portion of my journey into a career in technology; that means countless hours of my time, which I regularly get to revisit whenever a quick Bing search on something I’m currently working on turns up one of my old posts.

Take a Backup of a Web Site

In the Microsoft Azure Management Portal select the Web Site you wish to backup.

image

As you can see in the image below, I run my site in a Shared Web Site. This provides me with enough resources for people navigating my blog to get an excellent experience without it being too heavy on my pocket book.

image

The backup feature of Web Sites only works in Standard, so for now, I’m going to scale my site to Standard. This is as simple as clicking on the Standard button, then clicking on Save in the command bar at the bottom of the screen.

image

Once I click on the Save button, I am prompted with a warning that scaling to Standard will increase my costs, but I’m not too worried, as I’ll be scaling back down to Shared again shortly.

image

After the scaling task finishes, I’ll be able to use the form under the Backups navigation to select the storage account I wish to save my backups to and the frequency at which they are saved, as well as any database which is linked to my Web Site as a Linked Resource in the previous tab.

image

So I’ll select my favourite storage account.

image

And include my ClearDB database which is linked to my site to be backed up as well.

image

Then I’m only one click away from knowing all my archived hard work is saved for me in my storage account.

image

After the backup is done (pay attention, because this is important), I go back into the Scale tab and scale my site back down from Standard to Shared. This moves me back down into the lower billing range that I am comfortable running my site in.

What does Microsoft Azure Web Sites back up?

In the image below you can see two files which identify a backup. The first is an XML file describing, at a high level, the site that was backed up, including the custom domain, the web site’s service domain and the name of the database which was backed up. The second is a zip file containing a backup of your site, which I will outline in more detail below.

image

Here is a quick snapshot of the contents of the zip file: a fs folder, a temp folder, a meta file and a file named the same as your database.

image

What is in the Azure Web Site Backup zip folder

FS – if you haven’t already guessed it, FS stands for File System. This retains a snapshot of the file system of your web site at the time the backup was taken. This includes both the site and logFiles folders so you have access to anything you would need.

Temp – My temp folder was unused.

Meta – This is an XML file which describes all aspects of your website, including but not limited to Custom Domains, configured SSL Certificates, App Settings, Connection Strings, Default Document settings, Handler Mappings (for custom FastCGI handlers), Remote Debugging and Web Sockets. I could go on, but I believe you get the picture: if it’s something you set in the portal for your web site, it’s backed up in this file.

Database Name – In my case, I had a MySQL database selected, so this file is a MySQL dump file. This can be used to completely restore my database from schema to data.
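
If you ever need to restore from that dump file, a standard MySQL import does the trick; the host, user and database names below are placeholders:

# Restore the MySQL dump extracted from the backup zip
mysql -h [mysql-host] -u [mysql-user] -p [database-name] < [database-dump-file]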

Automating Environment Creation using Microsoft Azure PowerShell

During preparations for the Web Demo, presented by Mads Kristensen at Build 2014 [watch the day 2 keynote], it was necessary to stand up multiple environments to do development, testing and dry runs of the presentation. The quickest way to accomplish this was to script out the environment provisioning with PowerShell and the Microsoft Azure PowerShell and Cross Platform Command Line Tools.

Download the Tools

In order to automate your Microsoft Azure Environments, it’s helpful to download and install the following tools.

Microsoft Azure PowerShell Tools

Microsoft Azure Cross Platform Tools

Environment Automation Script

Considering the size of our environment, the number of stamps required, and the need to adapt quickly to changes in the environments, it was necessary to build out an environment setup script for our Azure resources.

Our environment consists of:

  • 3 Azure Web Sites, 1 Staging Slot
  • 1 Azure Storage Account, 2 Storage Containers
  • 1 Traffic Manager Namespace, 2 Traffic Manager endpoints

Storage and Web Sites are easy to automate with PowerShell or the Cross Platform Command Line Tools. However, there isn’t currently a way to automate Traffic Manager without programming directly against the Microsoft Azure Service Management API.

To support the creation of multiple assets, hash tables are used: the key holds a unique name for the Web Site or Storage account, while the value stores the region to create it in. This fulfills the minimum requirements for creating an Azure Web Site or an Azure Storage Account.
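
As a rough sketch (the site and storage account names below are hypothetical, not the ones used for the demo), the hash tables and the minimal creation calls look something like this:

# Hypothetical stamp description: name => region
$webSites        = @{ 'builddemo-dev' = 'West US'; 'builddemo-test' = 'East US' }
$storageAccounts = @{ 'builddemostorage' = 'West US' }

# Minimum parameters required to create each resource
foreach ($name in $webSites.Keys) {
    New-AzureWebsite -Name $name -Location $webSites[$name]
}
foreach ($name in $storageAccounts.Keys) {
    New-AzureStorageAccount -StorageAccountName $name -Location $storageAccounts[$name]
}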

Leveraging a PowerShell cmdlet, the previous environment is deleted in the begin function, then the environment is rebuilt in the process function. Service creation is an asynchronous action, so we took advantage of Jobs in PowerShell to handle creation in background threads. If additional configuration is needed on a service, the Job is waited upon until completion before additional service calls are performed on the service.
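
A simplified version of that pattern, assuming the Azure module and subscription context are available inside each job’s session, might look like:

# Create each site in a background job
$jobs = foreach ($name in $webSites.Keys) {
    Start-Job -ScriptBlock {
        param($siteName, $location)
        New-AzureWebsite -Name $siteName -Location $location
    } -ArgumentList $name, $webSites[$name]
}

# Block until creation completes so follow-up configuration has a site to target
$jobs | Wait-Job | Receive-Job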

Here is the script we called for each stamp of our environment:

Stamp Automation Script

The Environment Automation Script takes care of the heavy lifting, while the Stamp automation script is responsible for describing the resources required in the Environment.

First we must register the environment automation script so that it can be called within the same process that is executing our stamp automation script, then describe the stamp in variables, and finally pass those variables to the environment setup script.
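
A minimal sketch of that pattern (the script and function names here are hypothetical, not the actual scripts used for the demo):

# Dot-source the environment automation script so its functions run in this process
. .\New-Environment.ps1

# Describe the stamp
$webSites        = @{ 'builddemo-dev' = 'West US' }
$storageAccounts = @{ 'builddemostorage' = 'West US' }

# Hand the description off to the environment setup function
New-Environment -WebSites $webSites -StorageAccounts $storageAccounts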

Conclusion

We generated custom scripts to suit our needs, but this isn’t something that you would need to do. With the VS 2013 Update 2 release and the Azure SDK 2.3 tools, you can have PowerShell scripts generated for your Web Application directly from Visual Studio’s File > New Project dialog (for more details, read Using Windows PowerShell Scripts to Publish to Dev and Test Environments).

Using Guzzle to Interact with the Windows Azure Management API

Guzzle is a very simple abstraction over cURL which provides a great HTTP client for working with web services. This provides a great way of interacting with the Windows Azure Management API with PHP. In this example, I’m going to show how you can enable and disable an SSL Certificate in a Windows Azure Web Site using the Windows Azure Management API.

The majority of Windows Azure Web Site tasks can be automated using either the Windows Azure PowerShell cmdlets or the Windows Azure Cross Platform Command Line tools. This includes the ability to upload an SSL Certificate for your Web Site; however, at this point there is no ability to bind the SSL Certificate to the Web Site itself. That’s where Guzzle and the Windows Azure Management API come into the picture.

You can do the following exercise from the Windows Azure Management Portal by following the instructions in the article Enable HTTPS for a Windows Azure web site. This entry is to demonstrate how you can achieve SSL Certificate binding as part of an automated environment script.

Automating SSL Certificate Upload to Azure Web Sites

Before we start interacting with the Management API let’s get the Windows Azure Web Site ready by adding the SSL Certificate to the Environment. This task is incredibly simple to accomplish using the Windows Azure Cross Platform tools.

SSL certificates can be uploaded only in Standard mode. Learn more about configuring custom domains.

azure site cert add [ssl-cert-path].pfx --key [cert-password] [web-site-name]

That’s it! The cert will now be added to the Windows Azure Web Site.

Create and Upload or Export a Windows Azure Management Certificate

You could very well create and upload a management certificate of your own by creating one with OpenSSL and uploading it via the Windows Azure Management Portal (Settings > Management Certificates). However, there is a much easier way to achieve this without having to generate your own certificates, by using the Windows Azure Cross Platform Tools.

azure account cert export [--subscription]

The optional subscription parameter can be either the subscription name or the id from the azure account list command output.
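
For example (the subscription name is a placeholder), the following exports a .pem file into the current directory, which we’ll use as the client certificate later on:

azure account cert export --subscription "My Azure Subscription"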

Create a PHP Application with Guzzle to interact with the Windows Azure Management API

This is where things get fun! Let’s start by creating a composer file to acquire Guzzle.
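
A minimal composer.json, assuming the Guzzle 3.x line that was current at the time of writing, looks something like this:

{
    "require": {
        "guzzle/guzzle": "~3.7"
    }
}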

Next we’ll want to take a look at the documentation for how to Enable or Disable SSL in Windows Azure Web Sites with the Management API. This gives us the REST endpoint, HTTP verb and XML/JSON payload needed to enable or disable the SSL Certificate.

Warning! At the time of writing, SSL Certificates are only available for upload/binding if the Web Site is in Standard mode. If the site isn’t in Standard mode, your requests will return with the status code HTTP 400 Bad Request.

Enable SSL JSON Payload

Disable SSL JSON Payload

PHP Source Code

To enable or disable SSL on a Web Site you will need to make a PUT request against the management API, passing in your SubscriptionId, the webspace of the site, the site name and the client certificate. In Guzzle, the request is constructed by the client; you can pass an array of values in for URI template replacement when you create the request by calling the put method.
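
Here is a rough sketch of that call using Guzzle 3. The subscription id, webspace, site name, hostname, certificate path, API version and payload shape are all illustrative placeholders; check the Enable or Disable SSL documentation referenced above for the exact fields:

<?php
require 'vendor/autoload.php';

use Guzzle\Http\Client;

// Placeholder values -- substitute your own
$subscriptionId = '[subscription-id]';
$webspace       = '[webspace-name]';
$siteName       = '[site-name]';

// Authenticate with the exported management certificate (.pem) via cURL options
$client = new Client('https://management.core.windows.net', array(
    'curl.options' => array(
        CURLOPT_SSLCERT => '/path/to/management-cert.pem',
    ),
));

// Illustrative payload only -- see the documentation for the exact schema
$payload = json_encode(array(
    'HostNameSslStates' => array(
        array('Name' => 'www.example.com', 'SslState' => 'SniBased', 'ToUpdate' => true),
    ),
));

// The array of values passed with the URI template is substituted into the placeholders
$request = $client->put(
    array('/{subscription}/services/WebSpaces/{webspace}/sites/{site}', array(
        'subscription' => $subscriptionId,
        'webspace'     => $webspace,
        'site'         => $siteName,
    )),
    array(
        'x-ms-version' => '2013-08-01',   // service management API version header (illustrative)
        'Content-Type' => 'application/json',
    ),
    $payload
);

echo $request->send()->getStatusCode();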

Conclusion

This is how simple it is to make calls to the Windows Azure Management API from PHP using Guzzle. This blog post covers the usage of JSON for the request payload, however, there is a full example available as a Gist if you’d like to see how this can be done with XML.