Moving an application from an on-premises infrastructure into the public cloud generates a number of benefits -- especially in information security. Cloud providers, such as AWS, assume responsibility for securing the cloud infrastructure, but that doesn't mean there's nothing for administrators to do.
Cloud consumers bear the burden of securing applications and data. This distribution of security effort is known as a shared responsibility approach.
One of the benefits of cloud computing is that Amazon manages the security of the infrastructure, such as network devices, servers, storage systems and physical infrastructure. AWS provides all aspects of physical security, such as controlling physical access to data centers and monitoring network infrastructure. As a general rule, if a service or device is at or below the level of a hypervisor, then AWS will manage all aspects of its security.
AWS customers using services such as Elastic Compute Cloud (EC2) servers and Amazon Simple Storage Service (S3) are responsible for securing applications, operating systems and identities, as well as authentication and authorizations. The exception to this shared responsibility approach is that AWS provides additional security for platform as a service (PaaS) offerings, such as DynamoDB and Relational Database Service. In the case of PaaS, AWS secures the underlying database, while users still maintain access controls on database structures.
AWS manages a significant portion of overall security. Still, there is much for Amazon customers to consider, starting with operating systems.
AWS customers have the same type of responsibility for OS security in the cloud as on premises. System administrators should enforce good practices, such as limiting the types of applications and libraries available on servers. Production instances should not have compilers installed, and network traffic should be blocked on unneeded ports, for example. The Center for Internet Security offers free guidelines on hardening operating systems.
For operating systems, think about how to configure available applications and services, and close down ports. Consider using a hardened operating system, such as CIS Ubuntu.
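One part of closing down unneeded services is verifying which ports actually accept connections. As a minimal sketch, the check below probes a host with a plain TCP connect; the function name and port list are illustrative, not part of any AWS tooling, and a real audit would also use tools such as a network scanner.

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: audit the loopback interface for a few common service ports.
for port in (22, 80, 3306):
    state = "OPEN" if is_port_open("127.0.0.1", port) else "closed"
    print(f"port {port}: {state}")
```

Running a check like this after hardening an image helps confirm that only the intended services are listening.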
It's also important to encrypt data at rest and in motion. S3 objects can be encrypted automatically using server-side encryption: client applications write unencrypted data to an S3 bucket, where it is encrypted at rest. When the data is retrieved, it is decrypted and returned to the calling application.
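From the client's point of view, enabling server-side encryption is a single request parameter. The sketch below shows the shape of such an upload request; the bucket and key names are hypothetical, and the parameter names follow boto3's `put_object` call.

```python
# Hypothetical parameters for an encrypted S3 upload. With server-side
# encryption, the client sends plaintext and S3 encrypts it at rest;
# on GET, S3 decrypts the object and returns plaintext to the caller.
put_params = {
    "Bucket": "example-bucket",        # hypothetical bucket name
    "Key": "reports/quarterly.csv",    # hypothetical object key
    "Body": b"plaintext data",
    "ServerSideEncryption": "AES256",  # SSE-S3: AWS manages the keys
}

# With boto3, this dict would be passed as:
#   boto3.client("s3").put_object(**put_params)
print(put_params["ServerSideEncryption"])
```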
Data also can be encrypted in DynamoDB, but users will need to use client-side encryption. In this model, the data is encrypted before it is saved to the data store and decrypted by the client when the data is retrieved.
AWS manages encryption keys when server-side encryption is used; the client application needs to manage encryption keys when client-side encryption is used.
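The flow of client-side encryption can be sketched as follows. This is a deliberately insecure toy cipher (a SHA-256-based keystream), included only to show where encryption and key management happen relative to the data store; a real deployment would use a vetted library such as the AWS Encryption SDK, never code like this.

```python
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream (NOT cryptographically secure) for illustration."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    ks = _keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))

# The client holds the key; only ciphertext ever reaches the data store.
key = os.urandom(32)
item = encrypt(key, b"account balance: 1200")
# ... item is written to DynamoDB, later read back ...
assert decrypt(key, item) == b"account balance: 1200"
```

The point to notice is the division of labor: with client-side encryption, the data store never sees plaintext or keys, so key management falls entirely on the client application.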
Using security groups
Another good practice is segmenting traffic on the network. Virtual private clouds (VPCs) are used to define a logical network for a set of servers, load balancers and related resources. Think of VPCs as virtual data centers. A corporate customer can, for example, allow only network traffic that originates from IP addresses in its on-premises network. Within a VPC, traffic can be further segmented using security groups and network access control lists (NACLs).
Security groups are stateful firewalls that control access to EC2 instances. A security group consists of rules that specify which protocols -- such as HTTP, HTTPS and SSH -- may communicate with an instance, and any restrictions on the sources of that traffic. A single security group can be applied to multiple instances, making it a convenient way to apply a common set of firewall rules to multiple servers.
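A security group's rule set might look like the following sketch for a web server. The group ID and CIDR ranges are hypothetical; the rule structure follows the `IpPermissions` format used by boto3's `authorize_security_group_ingress` call.

```python
# Hypothetical ingress rules for a web server security group.
web_ingress = [
    {"IpProtocol": "tcp", "FromPort": 80, "ToPort": 80,
     "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},       # HTTP from anywhere
    {"IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
     "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},       # HTTPS from anywhere
    {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
     "IpRanges": [{"CidrIp": "203.0.113.0/24"}]},  # SSH from office network only
]

# With boto3, these rules would be applied as:
#   boto3.client("ec2").authorize_security_group_ingress(
#       GroupId="sg-0123456789abcdef0",  # hypothetical group ID
#       IpPermissions=web_ingress)
```

Because the group is stateful, response traffic for these connections is allowed automatically; no matching egress rule is required.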
NACLs are stateless firewalls that control traffic at the subnet level, providing fine-grained control over protocols, ports and addresses. Because they are stateless, return traffic is not allowed automatically; it must be permitted by an explicit rule. NACLs are used in conjunction with security groups to implement network security policies.
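The statelessness shows up directly in the rules. In the sketch below -- a hypothetical entry pair for a public subnet, using the field names from boto3's `create_network_acl_entry` call -- inbound HTTPS needs a matching outbound rule covering ephemeral ports, or responses would be dropped.

```python
# Hypothetical NACL entries for a public subnet. Each dict corresponds
# to one create_network_acl_entry call in boto3.
nacl_entries = [
    {"RuleNumber": 100, "Egress": False, "Protocol": "6",  # 6 = TCP
     "RuleAction": "allow", "CidrBlock": "0.0.0.0/0",
     "PortRange": {"From": 443, "To": 443}},               # inbound HTTPS
    {"RuleNumber": 100, "Egress": True, "Protocol": "6",
     "RuleAction": "allow", "CidrBlock": "0.0.0.0/0",
     "PortRange": {"From": 1024, "To": 65535}},            # return traffic
]
```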
CloudWatch monitors performance and measures key metrics on instances, storage systems and platform services. Although it is not primarily a security tool, it can help identify anomalous events on infrastructure, such as an unusually large download from a database server.
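One way to turn CloudWatch into an early-warning signal for an event like that is a metric alarm on outbound traffic. The parameters below are a sketch -- the alarm name, instance ID and threshold are hypothetical -- using the field names of boto3's `put_metric_alarm` call.

```python
# Hypothetical CloudWatch alarm that fires when an instance sends more
# than roughly 5 GB out in five minutes, e.g., an unusually large
# download from a database server.
alarm_params = {
    "AlarmName": "db-server-large-egress",   # hypothetical alarm name
    "Namespace": "AWS/EC2",
    "MetricName": "NetworkOut",
    "Dimensions": [{"Name": "InstanceId",
                    "Value": "i-0123456789abcdef0"}],  # hypothetical ID
    "Statistic": "Sum",
    "Period": 300,                           # seconds per evaluation window
    "EvaluationPeriods": 1,
    "Threshold": 5 * 1024 ** 3,              # bytes per period
    "ComparisonOperator": "GreaterThanThreshold",
}

# With boto3, the alarm would be created as:
#   boto3.client("cloudwatch").put_metric_alarm(**alarm_params)
```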
CloudTrail is a logging service that captures details of calls made to Amazon APIs. This allows cloud administrators to monitor significant events, such as starting or shutting down instances, as well as other changes, such as adding users to the AWS Identity and Access Management repository.
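CloudTrail delivers its logs as JSON documents with a top-level "Records" array of API events, which makes them easy to filter with a script. The excerpt below is a hypothetical, abbreviated log -- real records carry many more fields -- and the watch list is an example, not a recommendation.

```python
import json

# Hypothetical, abbreviated CloudTrail log file.
log = json.loads("""
{"Records": [
  {"eventName": "RunInstances", "eventSource": "ec2.amazonaws.com",
   "userIdentity": {"userName": "alice"}},
  {"eventName": "CreateUser", "eventSource": "iam.amazonaws.com",
   "userIdentity": {"userName": "bob"}}
]}
""")

# Flag events worth reviewing, such as instance launches and new IAM users.
watched = {"RunInstances", "TerminateInstances", "CreateUser"}
hits = [(r["eventName"], r["userIdentity"]["userName"])
        for r in log["Records"] if r["eventName"] in watched]
print(hits)
```

A few lines like these, run against logs delivered to an S3 bucket, give administrators a simple audit trail of who started instances or created users.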
Minimize your risks
Organizations that need additional security applications could look to the AWS Marketplace for Amazon partners that offer enhanced security tools. Alert Logic, for example, provides Web application firewall and vulnerability scanning tools, while Sophos offers a unified threat-management application.
As part of the shared responsibility approach, AWS customers should plan for disaster recovery. AWS products are designed to be durable and available, but outages occur. If you need high availability at all times, consider application architectures that span multiple regions.
Another way to minimize the risk of configuration errors is to employ DevOps procedures to automate infrastructure deployment. Cloud administrators should review automation scripts to minimize the risk of undetected configuration errors. Consider putting AWS Config to work to catch misconfigured infrastructure.
The AWS shared responsibility approach relieves businesses and other users of cloud services from many security concerns associated with running a data center, but there are still many facets of information security that remain in the hands of customers.