Secure Backups with Amazon S3 Storage

Amazon Web Services has revolutionized cloud computing in so many ways. The services it provides are numerous, and learning all of their capabilities can be mind-numbing. There are two services in particular where I spend most of my time: EC2 and S3. EC2 is the virtual computing side of AWS; this is where you can build and use virtual servers of all shapes, sizes and configurations. S3 is the AWS cloud storage solution. Together, these two services give you a lot of flexibility to build servers and back up their data easily to cloud storage.

A Wake-Up Call

I began learning how to set up a process for backing up virtual server data to S3 using Amazon's command line interface, or CLI. The process was actually simpler than I thought. It only took a few steps, and I was off and running. Then a business contact of mine sent me an article that sent chills down my spine. The article, Hacker puts 'full redundancy' code-hosting firm out of business (NetworkWorld – June 20, 2014), describes how a company was hacked and how the attackers used the AWS tools to completely destroy the company's servers and their backups. After reading the article, I understood how it was possible. It's easy to find articles explaining a "quick and dirty" way to set up access to S3 with wide-open permissions to make integration easy. It wasn't nearly as easy to find a way to lock things down so that incidents like the one described in the article don't occur. It was time to rethink my strategy.

My Security Requirements

Based on this article, I sat down and architected what I wanted my backup solution to do. Out of that design, the following requirements emerged:

  • I want to keep 30 days of backup files in S3 and then purge anything older than that.
  • The user account used for uploading these files should only be able to upload and download files to and from a specific S3 storage bucket.
  • In case the server is ever compromised, the user account should not be able to delete files from the backup bucket. (If a hacker could compromise the server AND delete all of the backups, then having a backup solution is a useless exercise.)
  • I want this user to have the ability to list the files that are in the S3 bucket. This simplifies the process for restoring a backup file to the server. (NOTE: This is a personal preference and many would argue that this is itself a security violation. If a hacker has compromised my system, then they already have the data. Seeing into the backup repository isn’t going to gain them much more, particularly in the context of my web servers. In a financial application, I might think differently about this approach.)
  • The user account should not have access to any other AWS functionality.

Armed with this information, I set out to research the numerous security mechanisms in AWS and build the secure backup solution that I needed. The solution isn't nearly as "quick and dirty" as the articles I had previously read. The remainder of this post outlines all of the steps needed to make the solution work as I designed it.

Creating an S3 Bucket for Backup Storage

The first step in this process was to define an S3 bucket specifically for holding my backup files.

  1. From the AWS console, click the S3 service icon.
  2. Once you are in the S3 Management Console, click the Create Bucket button at the top of the screen.
  3. Enter a bucket name for your backups. For the purpose of this exercise, we will name our bucket tkreiner-com-web-backups.
  4. Select a region to host this storage and then click Create.
  5. Back on the S3 Management Console, you should see the new bucket that you just created. If it isn’t already selected, click on the bucket name to select it.
  6. On the right side of the screen, you will see the properties for your bucket. Expand the section labeled Lifecycle. This is where we will define our 30-day retention policy.
  7. Click the Add Rule button to add a new lifecycle.
  8. Step 1 of the Lifecycle wizard asks what this rule will be applied to. Keep the default option to apply this to the entire bucket and click the Configure Rule button.
  9. In Step 2, we are asked what action we will take. In the dropdown list, select Permanently Delete Only and then set the number of days box below to 30. Click the Review button.
  10. In Step 3, you are asked to give a name for your rule and verify all of the details for the rule. After entering a name, click the Create and Activate Rule button.

We now have a bucket to store our backup files in, and we have a retention policy defined that keeps only 30 days' worth of backups.
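
If you prefer working from the command line, the same bucket and 30-day rule can also be created with the AWS CLI. The commands below are only a sketch, run from an account that already has S3 administrative rights; the bucket name and region are our examples and should be replaced with your own.

aws s3 mb s3://tkreiner-com-web-backups --region us-east-1

aws s3api put-bucket-lifecycle-configuration \
    --bucket tkreiner-com-web-backups \
    --lifecycle-configuration file://lifecycle.json

Here, lifecycle.json is a small file containing the retention rule, along these lines:

{
  "Rules": [
    {
      "ID": "expire-after-30-days",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Expiration": { "Days": 30 }
    }
  ]
}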

Create a Backup User in AWS

Here is the critical point in this process. Many people probably started out using AWS the same way I did. You register as a new AWS user and set up your username and password. First you create some new servers in EC2. Then you start exploring S3. You learn how to set up CLI credentials so that you can communicate directly from your EC2 servers to your S3 storage. You start copying files back and forth to S3 and life is wonderful!

Here's the problem: that first login you create in AWS is the root-level user of your environment. This is the super user of all super users. When you created your CLI credentials, you most likely associated them with your root account. With those credentials you can do ANYTHING and EVERYTHING to your AWS environment through the CLI commands. I mean it . . . EVERYTHING!!! Picture this: imagine someone using the CLI to create and start up hundreds of new virtual servers in EC2. It happened! Another article I read described a small company that didn't know it had been hacked until it received the following month's AWS invoice for approximately $30,000! This is scary stuff!

Back to the process. We need to set up a user in AWS that has no rights at all, and then grant that user only the rights it needs to conduct our backups. Here's how we create that user:

  1. In the AWS console, click on the drop down menu in the top right corner of the screen where your username is.
  2. Select the Security Credentials option from this menu.
  3. You will likely receive a prompt asking you how you want to proceed. Your options will be Continue to Security Credentials or Get Started with IAM Users. The first option pertains to the security of your root level account. We want to setup a non-root user, so click the Get Started with IAM Users button to go to the IAM Users configuration.
  4. In the IAM Management Console, click the Create New Users button.
  5. In the screen that appears, you will see that you can create a couple of users in one step. We only need to create the one user, so enter a username in the first field. For our example, we will create a user called mybackup.
  6. Below the user names, leave the box checked that is labeled Generate an access key for each user. The access key is needed to setup the CLI environment.
  7. Click the Create button at the bottom of the screen to create your new user.
  8. You will see a screen where you can download the user’s security credentials. This information will be needed later to setup the CLI environment. Click the Show User Security Credentials link and copy the Access Key ID and Secret Access Key for later use. When you are finished, click Close at the bottom of the screen.

Our new user is created and, by default, this user has no privileges in AWS. We will need to grant this user the privileges necessary to conduct a backup. The next section explains how to define that security.
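
As an aside, the same user and access key can also be created from an already-configured administrative CLI session. This is only a sketch using our example mybackup user name; the console steps above accomplish exactly the same thing.

aws iam create-user --user-name mybackup
aws iam create-access-key --user-name mybackup

The output of create-access-key contains the Access Key ID and Secret Access Key. Save them right away; the secret key cannot be retrieved again later.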

Define a New Security Policy

In AWS, all security is handled through the use of security policies. These policies can be written in a number of different ways. We will define a simple policy that allows a user to read and write files to and from our S3 bucket.

  1. From the IAM Management Console, click on the Policies link from the menu on the left side of the screen.
  2. Click the Create Policy button at the top of the policy list screen.
  3. We are going to use Amazon’s Policy Generator to help build our new policy. Click the Select button next to the Policy Generator option.
  4. We are first going to create the rule that allows us to list the S3 bucket contents. To start, set the Effect option to Allow.
  5. In the AWS Service dropdown list, select Amazon S3.
  6. In the Actions field, place a check next to the ListBucket action.
  7. In the Amazon Resource Name (ARN) field, enter arn:aws:s3:::tkreiner-com-web-backups (NOTE: This uses the example name we chose above. Be sure to replace tkreiner-com-web-backups with the name of your S3 bucket.)
  8. Click the Add Statement button to add this security to our new policy.
  9. Next, we are going to create the rule that allows the user to read and write files to our bucket. To start, set the Effect option to Allow.
  10. In the AWS Service dropdown list, select Amazon S3.
  11. In the Actions field, place a check next to the GetObject (read a file) and PutObject (write a file) actions.
  12. In the Amazon Resource Name (ARN) field, enter arn:aws:s3:::tkreiner-com-web-backups/* (NOTE: Be sure to add the final “/*” to the end of your bucket name. This tells AWS that the policy applies to any file inside the S3 bucket.)
  13. Click the Add Statement button to add this security to our new policy.
  14. With all of our rules defined, click the Next Step button at the bottom of the screen.
  15. At the Review Policy screen, you are asked to provide a name and a description for your policy. For this example, we will call our policy AllowS3Backup. Give your policy a name and description and click the Create Policy button.
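
For reference, the policy that the generator builds from these selections should look roughly like the following JSON document (the bucket name is our example, so substitute your own):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::tkreiner-com-web-backups"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::tkreiner-com-web-backups/*"
    }
  ]
}

Notice that the ListBucket statement applies to the bucket itself, while the GetObject and PutObject statement applies to the objects inside it. That distinction is why the two ARNs differ by the trailing /*.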

Grant Backup Policy to Backup User

Our security setup is almost complete. When we created our backup user, I noted that the user does not yet have permission to do anything. We need to attach this policy to our user account so that it has the rights to conduct the backup.

  1. While you are still in the list of policies, search for the policy you just created in the previous section and click on the policy name.
  2. In the Policy Detail screen, scroll down to the section titled Attached Entities.
  3. Click the Attach button.
  4. Place a checkmark next to your backup user and click the Attach Policy button.
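
If you prefer the CLI, the same attachment can be made with a single command. The account number below is a placeholder; substitute your own twelve-digit AWS account ID.

aws iam attach-user-policy --user-name mybackup --policy-arn arn:aws:iam::123456789012:policy/AllowS3Backup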

Where Are We At?

I said this process wasn’t easy. We have taken a lot of steps to get here, but where is “here”? Here’s a quick recap:

  • We created a new storage area in S3.
  • We set a retention policy on that S3 storage to keep contents for only 30 days.
  • We defined a new user whose credentials will be used to write the backup files to S3.
  • We created a security policy to allow the user access to a specific S3 bucket and to list, read and write to that bucket.
  • We added this security policy to our backup user.

From a security setup standpoint, we are done! The rest of this article is a brief introduction to setting up the CLI interface and copying the files to S3.

Installing and Using AWS CLI

With all of the security work done, it is now time to set up our command line interface. If you are using an Amazon-imaged server, you may already have the CLI tools installed as part of the image. If the software is missing, see the Installing the AWS Command Line Interface page on Amazon’s site for installation details.

With the software installed, we need to configure it to use the credentials of our new backup user. In both Windows and Linux, from a command prompt, enter the following command:

aws configure

You will first be prompted to enter an Access Key ID and Secret Access Key. Enter the information that you captured in the last step of setting up your new user. Next, you will be asked for a default region and an output format. Simply press Enter at both of these prompts.
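
The exchange will look something like this; the key values shown are Amazon’s documented placeholder examples, not real credentials.

aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]:
Default output format [None]: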

Your CLI environment should now be ready. Let’s run through some tests.

List S3 Bucket

Let’s first see if we can see the contents of our S3 bucket. From the command prompt, enter the following:

aws s3 ls s3://tkreiner-com-web-backups

Again, remember to replace tkreiner-com-web-backups with the name of the S3 bucket that you created. When you run this command, you shouldn’t see any files, but you also shouldn’t receive any errors. So far . . . so good.

Copy Backup File to S3 Bucket

Now we should try to copy a file to our new S3 bucket. Let’s assume that you have your backup data written to a TAR or ZIP file. In this example, I will use a file called mybackup.tar. To copy the file to your S3 repository, you will use a command like the following:

aws s3 cp mybackup.tar s3://tkreiner-com-web-backups

You should see the file get copied to your backup bucket. Once the upload is complete, use the ls command from the previous section to list the contents of the bucket and verify that your backup copied correctly.

Retrieve Backup File from S3 Bucket

Let’s try and pull that same backup file back down to our computer. We will use a command that is similar to the command for uploading a file. The command will look something like:

aws s3 cp s3://tkreiner-com-web-backups/mybackup.tar .

Again, you should be able to see the command download the file to your current directory. When the command completes, review the files in your directory and you should see your file.

Delete???

Remember that one of our requirements was that the backup user can’t delete files from the backup bucket. We should test and make sure that is the case. Let’s try to delete the mybackup.tar file using the following command:

aws s3 rm s3://tkreiner-com-web-backups/mybackup.tar

You should receive an error telling you that you don’t have sufficient permission to delete files.

Success!!!

If all of the commands above ran without an issue, then all of your configuration efforts have been a success! You can now begin setting up your backup scripts and jobs and start securely copying your files to Amazon’s S3 storage.
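
As a starting point, a nightly backup job can be as simple as a short shell script like the sketch below. The paths, archive name and bucket name are examples only, so adjust them for your own environment; the script assumes the CLI is configured with the mybackup user’s credentials.

#!/bin/bash
# Nightly backup sketch: archive the web root and push it to S3.

BUCKET="s3://tkreiner-com-web-backups"
STAMP=$(date +%Y-%m-%d)
ARCHIVE="/tmp/web-backup-${STAMP}.tar.gz"

# Create the archive (example source directory).
tar -czf "${ARCHIVE}" /var/www/html

# Upload to the backup bucket; the 30-day lifecycle rule handles cleanup.
aws s3 cp "${ARCHIVE}" "${BUCKET}/"

# Remove the local copy once the upload succeeds.
rm -f "${ARCHIVE}"

Schedule the script with cron and you have a hands-off backup that rotates itself automatically thanks to the lifecycle rule.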

Going Further

This article serves as a guideline for setting up security for transferring files back and forth to S3. There are lots of ways to configure security policies. For example, a policy can limit which IP addresses requests are allowed to come from, as sketched below. The possibilities are endless. If you want to learn more, there is extensive documentation on the AWS Documentation pages with many examples to learn from.
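
For instance, adding a Condition block to a policy statement restricts it to requests coming from a specific address range. The statement below is only an illustration, and the CIDR block is a documentation example rather than a real address range.

{
  "Effect": "Allow",
  "Action": ["s3:GetObject", "s3:PutObject"],
  "Resource": "arn:aws:s3:::tkreiner-com-web-backups/*",
  "Condition": {
    "IpAddress": { "aws:SourceIp": "203.0.113.0/24" }
  }
}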

Photographing Horse Shows – Part 1

Over the last three years, I have had the opportunity to photograph horses and their riders competing in shows at the Prince George’s Equestrian Center in Upper Marlboro, Maryland. Much of that experience has been in the covered outdoor arena pictured above. Photographing fast horses in a shaded environment against a bright background is not an easy task.

I have learned a lot about photography through this experience, and I wanted to share some of what I have learned with others. I decided to start a multi-part series on the subject. Each post will present a different technique or lesson that I have learned at these shows.

The Problem

As I mentioned, trying to photograph a horse in shade with a brightly lit background is a challenge for a camera. If you let the camera try to meter the picture, you are likely to get a picture like this:

Underexposed rider (Nikon D750, f/7.1, 1/800s, ISO 1250)

In photography terminology, this is an issue of dynamic range. Although your eye can see the trees and grass in the distance as well as the horse and rider in the foreground without any problem, the camera has a hard time handling such a broad range of light. When it meters, it meters for the brighter light that fills the majority of the picture, which ends up leaving the rider and horse in darkness.

Meter The Ground

One technique that I have used to help in this situation is to meter the ground just in front of the place where you expect the rider to be. Let’s say that you want to get a picture of the horse jumping over a fence. Here is what you do:

  1. Frame up your picture with the jump fence filling the frame as you need. With a zoom lens, this means getting your zoom set to frame the picture as you need.
  2. Tilt your camera down towards the ground. Try to fill the center of your camera with the ground in front of the jump. By doing this, you are taking the bright background out of the frame so that the camera can focus on the lighting that is around the jump. Your frame will look something like the following:
    Metering the ground near the fence (Nikon D750, f/4, 1/800s, ISO 2200)

  3. While the camera is pointed at the ground, use your Exposure Lock button to lock the exposure settings into your camera. On my Nikon D750, I simply press the AE-L button on the back of the camera. Check your camera’s manual for how to do this on your model.
  4. With the exposure settings now locked into the camera, reframe your shot and wait for the moment that the horse jumps over the fence. Shoot the best picture ever!

    Properly exposed rider (Nikon D750, f/4, 1/800s, ISO 2500)
  5. Add a little post-production and you get…
    Processed rider

Learn More About Exposure Lock

Want to learn more about how exposure lock works on a camera? There is a great article on Photography Life’s website titled Nikon AE-L / AF-L Button.