CCSK Domain 4 – Compliance and Audit Management

This part of the CCSK domains is about compliance management and audits. It goes through, in some detail, the aspects one should think about for a compliance program when running services in the cloud. The key issues to pay attention to are:

  • Regulatory implications when selecting a cloud supplier with respect to cross-border legal issues
  • Assignment of compliance responsibilities
  • Provider capabilities for demonstrating compliance

Pay special attention to: 

  • The role of provider audits and how they affect customer audit scope
  • Which services are within which compliance scope with the cloud provider. This can be challenging, especially given the pace of innovation; AWS, for example, adds new features and services at a very high pace, and new offerings are not necessarily covered by existing audit reports yet.

Compliance 

The key change to compliance when moving from an on-premises environment to the cloud is the introduction of a shared responsibility model. Cloud consumers must typically rely more on third-party audit reports to understand compliance arrangements and gaps than they would in a traditional IT governance setting.

Many cloud providers certify for a variety of standards and compliance frameworks to satisfy customer demand in various industries. Typical standards, frameworks and audit reports that may be available include:

  • PCI DSS
  • SOC1, SOC2
  • HIPAA
  • CSA CCM
  • GDPR
  • ISO 27001

Provider audits need to be understood within their limitations: 

  • They certify that the provider is compliant, not any service a customer runs on that provider's infrastructure.
  • The provider's infrastructure and operations are then outside the customer's own audit scope; the customer relies on pass-through audits instead.

To prove compliance for a service built on cloud infrastructure, it is necessary both that the internal parts of the application/service comply with the regulations, and that no non-compliant cloud services or components are used. This means paying attention to audit scopes is important when designing cloud architectures.

There are also jurisdictional issues involved. A cloud service will typically let you store and process data across a global infrastructure. Where you are allowed to do this depends on the compliance framework, and you as cloud consumer have to make the right choices in the management plane.
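As a small, hedged illustration (assuming Python with the boto3 SDK and an AWS account using S3, which the compliance guidance itself does not prescribe), you can for example verify where data is actually stored by listing the region of each bucket:

# Sketch: list the region of every S3 bucket in the account (assumes boto3
# and configured AWS credentials). us-east-1 is reported as None by the API.
import boto3

s3 = boto3.client("s3")
for bucket in s3.list_buckets()["Buckets"]:
    region = s3.get_bucket_location(Bucket=bucket["Name"])["LocationConstraint"]
    print(bucket["Name"], region or "us-east-1")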

Audit Management

Audit management for information security is about verifying the fulfillment of defined information security practices. The goal is to evaluate the effectiveness of security management and controls, and this extends to cloud environments.

Attestations are legal statements from a third party, which can be used as a statement of audit findings. This is a key tool when working with cloud providers. 

Changes to audit management in cloud environments

On-site audits of multi-tenant cloud environments by individual customers are seen as a security risk and are typically not permitted. Instead, consumers have to rely on attestations and pass-through audits.

Cloud providers should assist consumers in achieving their compliance goals. Because of this they should publish certifications and attestations to consumers for use in audit management. Providers should also be clear about the scope of the various audit reports and attestations they can share. 

Some types of customer technical assessments, such as vulnerability scans, can be limited in contracts and require up-front approval. This is a change to audit management compared to on-prem infrastructure, although it seems most major cloud providers allow certain penetration testing activities without prior approval today. As an example, AWS has published a vulnerability and penetration testing policy for customers here: https://aws.amazon.com/security/penetration-testing/

In addition to audit reports, artifacts such as logs and documentation are needed as compliance proof. In most cases the consumer needs to configure the right level of logging detail themselves in order to collect the right kind of evidence. This typically includes audit logs, activity reporting, system configuration details and change management records.
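As a hedged example of what evidence collection can look like in practice (assuming Python with the boto3 SDK and CloudTrail enabled in the AWS account, neither of which is a given in every setup), management-plane activity can be exported programmatically and archived:

# Sketch: export the last 24 hours of CloudTrail management events as JSON
# evidence (assumes boto3, AWS credentials and CloudTrail enabled in the region).
import datetime
import json

import boto3

cloudtrail = boto3.client("cloudtrail")
end = datetime.datetime.utcnow()
start = end - datetime.timedelta(days=1)

events = []
for page in cloudtrail.get_paginator("lookup_events").paginate(StartTime=start, EndTime=end):
    events.extend(page["Events"])

with open("cloudtrail-evidence.json", "w") as f:
    json.dump(events, f, default=str)  # default=str handles datetime fields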

CSA Recommendations for compliance and audit management in the cloud

  1. Compliance, audit and assurance should be continuous. They should not be seen as point-in-time activities, but should demonstrate that compliance is maintained over time.
  2. Cloud providers should communicate audit results, certifications and attestations, including details on scope and on which features are covered in which locations and jurisdictions. They should give guidance to customers on how to build compliant services in their cloud, and be clear about specific customer responsibilities.
  3. Cloud customers should work to understand their own compliance requirements before making choices about cloud providers, services and architectures. They should also make sure they understand the scope of the compliance proof available from the cloud vendor, and which artifacts can be produced to support the management of compliance in the cloud. Consumers should also keep a register of the cloud providers and services they use; CSA recommends using the Cloud Controls Matrix (CCM) to support this activity. A minimal sketch of such a register follows below.
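Such a register does not need to be fancy to be useful. As a minimal sketch (the fields and values below are illustrative, not taken from the CCM itself), it could be as simple as a small data structure that can be queried when preparing for an audit:

# Sketch of a simple register of cloud services in use (illustrative fields).
# Each entry records where the service runs and what compliance proof covers it.
cloud_register = [
    {
        "provider": "AWS",
        "service": "Elastic Beanstalk",
        "regions": ["eu-west-1"],
        "compliance_proof": ["ISO 27001", "SOC 2"],
        "customer_responsibilities": "application security, logging",
    },
    {
        "provider": "AWS",
        "service": "RDS (MySQL)",
        "regions": ["eu-west-1"],
        "compliance_proof": ["ISO 27001", "PCI DSS"],
        "customer_responsibilities": "data classification, access control",
    },
]

# Example query: which services claim SOC 2 coverage?
soc2_services = [entry["service"] for entry in cloud_register
                 if "SOC 2" in entry["compliance_proof"]]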

Making Django, Elastic Beanstalk and AWS RDS play well together

A couple of days ago I decided I should learn a bit more hands-on AWS stuff. So I created a free tier AWS account and looked around. I decided I'd take a common use case: deploy a web application to Elastic Beanstalk and add a domain and SSL.

Setting up tools

Step 1: reading documentation. AWS has a lot of documentation, and it is mostly written in a friendly manner with easy-to-follow instructions. Based on the documentation I opted for using the command line Elastic Beanstalk tool. To use this you need Python and pip. You can install it with the command

pip install awsebcli --upgrade

If you are having a permissions problem with doing this, you can throw in a "--user" flag at the end of that command. This will install the tool you need to create and manage EB environments from your command line. Since it is a Python utility it works on Windows, as well as Mac and Linux. Installing this did not pose any hiccups. You can read more about how to set this tool up and updating your system path here: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/eb-cli3-install.html.

Before using it you need to set it up. Issue the command

eb init

This will give you a prompt asking for a number of things, like region to set up in, etc.

Learning point: if you want to set up a database, such as MySQL, inside the EB environment, you should use the database option when issuing the next command. To set up your environment, use

eb create

(Documentation for eb create: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/eb3-create.html)

If you want a database in your environment, add the --db flag with the desired options; you cannot create the database in the EB Console (the web-based interface) afterwards, at least not for the micro instances that are allowed in the free tier. According to someone on Stack Overflow, this is a bug in AWS that you can wait for them to fix, or you can use the command line option (supposedly that works, but it is not what I did).

If you create a database in your EB environment, your DB will be terminated too if you terminate that environment. You may not want that, so you can consider setting up an external database and connecting to it outside of EB. That is what I did, and there’s more about that a little further down this post.

Creating a Django app

To have something to deploy I created a Django app. This is an asocial network; you can post things with links and hashtags but you can’t follow other users or anything like that. It has user management and uses the default Django admin system and authentication system (session based). I called it woodscreaming and you can view it here: woodscreaming.com.

Setting up a virtual environment

First, to avoid mixing up dependencies and to be able to create a requirements file that works, create a virtual environment. For this I like to use the tool virtualenv (works on all platforms, and can be installed with pip if you don't have it):

virtualenv --python=python venv

“venv” is the name of your virtual environment. Everything you install when the environment is active will be contained in that environment, and you have all dependencies under control (think of it like a semi-container-solution). To activate the environment on Linux/Mac:

source venv/bin/activate

On Windows:

venv\Scripts\activate

When you have all the dependencies your app needs in place, run

pip freeze > requirements.txt

This creates a requirements.txt file that EB will use to install your app in the cloud.
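For reference, a minimal requirements.txt for this kind of project might look roughly like the following (the versions are illustrative, and mysqlclient only becomes relevant once the MySQL-backed RDS database described further down is in use):

Django==2.2.13
mysqlclient==1.4.6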

Adding EB configuration files to the Django project

To make things work, you also need to add some EB specific configuration files to your Django project. Create a folder named .ebextensions in your project’s root folder. In this folder you will need to add a django.config file with the following contents:

option_settings:
  aws:elasticbeanstalk:container:python:
    WSGIPath: projectname/wsgi.py

Of course, you need to change the word projectname to the name of your project. This tells EB where to find your WSGI file, which is the entry point a web server uses to run your application; WSGI is a Python standard interface between web servers and applications.

You should also tell EB to run migrations so that your data models work with your database. Adding a file (I called it db-migrate.config) to the .ebextensions folder fixes this. Here's what you need to add to that file (again, replace discproject with your own project name):

container_commands:
  01_migrate:
    command: "django-admin.py migrate"
    leader_only: true
option_settings:
  aws:elasticbeanstalk:application:environment:
    DJANGO_SETTINGS_MODULE: discproject.settings

The command line client will also create a folder called .elasticbeanstalk and populate it with a YAML file called config.yml that stores the settings the EB CLI uses for your application and environments (you don't need to edit this file yourself).
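For reference, the generated config.yml typically looks something like this (the values are placeholders, and the exact keys depend on your answers to eb init):

branch-defaults:
  master:
    environment: my-eb-environment
global:
  application_name: my-application
  default_platform: Python 3.6
  default_region: eu-west-1
  profile: eb-cli
  sc: git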

That’s it to begin with – some changes need to be made when adding an RDS database and setting up http to https forwarding.

Deploying to EB

Deploying to EB is very easy: you simply deactivate your virtual environment by issuing the command "deactivate", and then you run

eb deploy

It now zips your source, uploads it to AWS, installs it and provisions the resources your environment needs. It takes a while, and then it is done. You can then see your web app online by issuing the command

eb open

The app will get its own URL automatically, of the format “something.aws-region.elasticbeanstalk.com”. It does not get an SSL certificate (https) automatically – you will need to set up a custom domain for that (more about that later). Anyway, opening it up shows the web app running in the cloud and I am able to use it.
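One Django detail that may trip you up here, depending on your settings (this is an assumption on my part, not something specific to EB): when DEBUG is off, the generated domain has to be listed in ALLOWED_HOSTS in settings.py, otherwise Django will reject the requests:

# settings.py: allow requests for the EB-generated domain and, later on,
# the custom domain (the hostnames below are placeholders).
ALLOWED_HOSTS = [
    "something.aws-region.elasticbeanstalk.com",
    "woodscreaming.com",
]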

Dev database vs prod database

By default, django-admin.py sets up a project that uses an SQLite database: a single-file SQL database that is popular for persistent storage in mobile apps and embedded applications. When you deploy, your development environment's database is deployed too, and each redeploy overwrites it. SQLite is also not great for concurrent operations, and obviously overwriting all user data on each deploy is not going to work. There are ways around this if you want to stick with SQLite, but it is normally not the best choice for a web app database (although it is great for development).

Next we look at how we can create a database in the cloud and use that with our production environment, while using the SQLite one in local development.

Adding an RDS database

Attempt 1: Using the EB Console

In the EB console (the web interface), if you go to "Configuration", there is a card for "Database" with an option to "Modify". There you can set up your desired database instance and select Apply. The problem is that it doesn't work: the deployment fails due to a permission error. I'm sure it is possible to fix, but I didn't bother fiddling enough with it to do that. And as mentioned above: if you terminate the environment, you will also terminate the database.

Attempt 2: Setting up an RDS database external to EB

This worked. Following the AWS documentation on how to set it up was quick and easy (a scripted sketch of the same steps follows the list below):

  • Go to RDS, create a new instance. Select the type of database engine, EC2 instance type etc.
  • Select db name, username, password (remember to write those down – I use secure notes in LastPass for things like this). Set the DB instance to be “public” to allow queries from outside your VPC to reach it.
  • Add the RDS security group to your EB EC2 instance. This is important – if you do not do this, it is not possible to query the database from EB.
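For completeness, here is a rough sketch of the same instance creation scripted with boto3 (all values are placeholders; the console walkthrough above is what I actually used):

# Sketch: create a small, publicly accessible MySQL RDS instance with boto3
# (placeholder names and credentials; adjust to your own environment).
import boto3

rds = boto3.client("rds")
rds.create_db_instance(
    DBInstanceIdentifier="my-eb-db",
    DBName="ebdb",
    Engine="mysql",
    DBInstanceClass="db.t2.micro",
    MasterUsername="dbadmin",
    MasterUserPassword="change-me",
    AllocatedStorage=20,
    PubliclyAccessible=True,
)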

To add that security group in EB you need to go to the EB console, head to configuration and then select the card for instances. Select “modify” and then head to the security groups table – add the RDS one (it is automatically generated and named something like rds-default-1) and click apply. Because the database is external to EB you also need to add environment variables for the connection. To do this, head to the console again and select “modify” on the software card. Add the following environment variables:

  • RDS_DB_NAME
  • RDS_HOSTNAME
  • RDS_PORT
  • RDS_USERNAME
  • RDS_PASSWORD

The values are found in your RDS instance overview (head to the RDS console, select your instance, and you find the variables a bit down on the page). Now, you also need to tell your Python app to read and use these. Add this to your Django settings file:

if 'RDS_HOSTNAME' in os.environ:
    # Running on Elastic Beanstalk: use the external RDS MySQL database.
    # This relies on the "import os" that is already at the top of settings.py.
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': os.environ['RDS_DB_NAME'],
            'USER': os.environ['RDS_USERNAME'],
            'PASSWORD': os.environ['RDS_PASSWORD'],
            'HOST': os.environ['RDS_HOSTNAME'],
            'PORT': os.environ['RDS_PORT'],
        }
    }
else:
    # Local development: fall back to the project's SQLite database file.
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            'NAME': os.path.join(BASE_DIR, 'dbp.sqlite3'),
        }
    }

After doing this, the EB environment health was showing as "green" and all good. But my web app did not show up, and the log showed database connection errors. The solution to that was: read the docs. You also need to modify the inbound rules of the RDS instance's security group to allow connections from the EB environment's security group. Details here: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/AWSHowTo.RDS.html. After doing this, it works!
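If you want to sanity-check the connection from your own machine before involving EB at all (this assumes the instance is public, your IP is allowed by the security group, and you have the mysqlclient package installed locally), a quick test could look like this:

# Sketch: verify that the RDS instance accepts connections (placeholder values).
import MySQLdb  # provided by the mysqlclient package

conn = MySQLdb.connect(
    host="your-instance-endpoint.eu-west-1.rds.amazonaws.com",
    user="dbadmin",
    passwd="change-me",
    db="ebdb",
    port=3306,
)
print(conn.get_server_info())
conn.close()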

Adding a Django superuser to the RDS database

You could SSH into the EC2 instance running the Django app and use the manage.py utility, but that kind of defeats the point of having a PaaS that is supposed to let you configure things without SSH-ing into everything.

To add a Django superuser you should instead add a new Django management command to your project. Here's a good description of how to do that: https://github.com/codingforentrepreneurs/Guides/blob/master/all/elastic_beanstalk_django.md. You can add the command to your db-migrate.config file in the .ebextensions folder; a sketch of such a command follows below.
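The gist of that guide is a custom management command. A sketch of what such a command could look like is shown below (the app name, file path and credentials are illustrative, and in practice you would read the credentials from environment variables):

# yourapp/management/commands/createsu.py: sketch of a management command
# that creates a superuser non-interactively (placeholder credentials).
from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Create a default superuser if it does not already exist"

    def handle(self, *args, **options):
        User = get_user_model()
        if not User.objects.filter(username="admin").exists():
            User.objects.create_superuser("admin", "admin@example.com", "change-me")

In db-migrate.config it can then be run the same way as the migrate command, for example as another container_commands entry with leader_only set to true.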

Configuring DNS with Route 53

Now, having the default URL is no fun, and you can't add SSL to it. So we need to set up DNS. I chose to buy a domain name from Amazon and then set up DNS with Route 53. Setting that up for an EB environment is super easy: you create an A record as an alias to your EB environment URL.

Adding an SSL certificate that terminates on the load balancer

Now that we have a working domain name and the DNS records we need, we can add an SSL certificate. The easiest way to provision the certificate is to use Amazon's certificate management service (ACM). You provision one for your domain, and you verify ownership by adding a CNAME record to your DNS hosted zone in Route 53.

The next thing you need to do to make things work is add the certificate to your Elastic Beanstalk environment. Depending on your threat model and your needs, you can choose the simple route of terminating https on the load balancer (good enough for most cases), or you can set up AWS to also use secure protocols in internal traffic (behind the load balancer). I chose to terminate traffic on the load balancer.

The AWS docs explain how to do this by adding a secure listener on the load balancer: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/configuring-https-elb.html.
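For a classic load balancer, the configuration in that doc boils down to another option_settings block, which can also live in an .ebextensions config file; roughly like this, with a placeholder certificate ARN:

option_settings:
  aws:elb:listener:443:
    ListenerProtocol: HTTPS
    SSLCertificateId: arn:aws:acm:eu-west-1:123456789012:certificate/your-certificate-id
    InstancePort: 80
    InstanceProtocol: HTTP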

Forwarding http to https

There are several ways to forward HTTP traffic to HTTPS. The easiest is to set up forwarding on the Apache server. Since we are not using SSH to fiddle with the server directly, we do this by adding a configuration file to the .ebextensions folder in the Django project and then redeploying. Adding a file https.config with the following contents does the job:

files:
    "/etc/httpd/conf.d/ssl_rewrite.conf":
        mode: "000644"
        owner: root
        group: root
        content: |
            RewriteEngine On
            <If "-n '%{HTTP:X-Forwarded-Proto}' && %{HTTP:X-Forwarded-Proto} != 'https'">
            RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R,L]
            </If>

Summary

This post is a walk-through of getting the essentials done to serve a web application from Elastic Beanstalk:

  • Creating an environment and deploying an app
  • Using config files to manage server processes and configuration
  • Setting up an external RDS database and connecting to it using environment variables
  • Configuring a custom domain name and setting up DNS
  • Adding SSL termination on the load balancer
  • Adding an HTTP to HTTPS rewrite rule to Apache on the web server using a config file