Containerized applications make it easier for IT teams to build better software. A container typically packages a complete application together with all of its dependencies, which eliminates environmental inconsistencies. So, instead of spending hours building up servers and environments for development, test and production, developers can speed the process by managing containers. But there's still some complexity involved with manually deploying applications to containers.
Even though it's fast and simple to deploy and run containerized applications, developers typically need additional resources to scale and manage containers to meet the load and availability requirements of a production application. For example, a developer can run services in one or more containers on a laptop during development. When it comes time to run containerized applications in production, however, developers must follow good architectural practices, such as eliminating single points of failure and allowing the application to scale to accommodate increases in demand. Depending on the size of the application, this may require tens or hundreds of containers.
Manually deploying and scaling containers for production is a painful process; container clustering and orchestration services can help. Developers can choose to build and oversee their own container management infrastructure or offload that work to a managed service.
The native AWS option
The Amazon EC2 Container Service (ECS) enables developers to build, operate and manage container clusters on AWS; it also allows developers to run Docker containers across a managed cluster of Amazon Elastic Compute Cloud (EC2) instances. Developers have root access to container instances, which are EC2 instances that launch and run containers, and they can use APIs to interact with ECS. Through these APIs, developers can start and stop container-based applications, gain insight into the state and health of the cluster, and schedule the placement of Docker containers across the cluster to accommodate application demand and availability.
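As a sketch of that workflow, the same ECS API operations are exposed through the AWS CLI. The commands below require an AWS account with configured credentials, and the cluster, service and task definition names are hypothetical examples, not defaults:

```shell
# Create an ECS cluster; EC2 container instances register into it.
aws ecs create-cluster --cluster-name demo-cluster

# Register a task definition describing the containers to run
# (taskdef.json is a hypothetical local file).
aws ecs register-task-definition --cli-input-json file://taskdef.json

# Start a long-running service with two copies of the task;
# ECS schedules the containers across the cluster.
aws ecs create-service \
    --cluster demo-cluster \
    --service-name web \
    --task-definition web-app:1 \
    --desired-count 2

# Inspect the state and health of the cluster.
aws ecs describe-clusters --clusters demo-cluster
```

The same operations are available through the AWS SDKs and the management console; the CLI form simply makes the start/stop, scheduling and inspection steps explicit.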
The big benefit of ECS is that developers don't have to install container management software -- it is provided as a service. This gives IT teams time to focus on developing applications and building container images that can be uploaded to a container registry and made available for deployment.
ECS also integrates with existing AWS tools, including Elastic Load Balancing (ELB), Virtual Private Cloud and Identity and Access Management (IAM). The service is free to use; customers only pay for the underlying infrastructure, such as EC2 instances, Elastic Block Store volumes and ELB.
Alternatives to ECS
In addition to using ECS as a managed service, developers can choose to install and manage their own clustering and orchestration services. Open source options like Kubernetes and Mesos have thousands of contributors who provide an array of features and functionality. But implementing and managing these tools can add complexity and operational overhead. For some organizations, that trade-off is worth it if they need to support complex scenarios and provide customization.
With Docker's swarm mode, a native clustering and orchestration capability built into Docker Engine 1.12, developers can create a swarm -- a cluster of self-healing Docker engines designed to host distributed containerized applications. As with Kubernetes or Mesos, developers can create a swarm on bare metal or on virtual machines, such as EC2 instances. Developers can add manager and worker nodes to a swarm with a single command.
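A minimal sketch of that workflow on Docker 1.12 or later; the IP address is a placeholder, and the join token is printed by the first command rather than invented here:

```shell
# On the first node: initialize the swarm; this node becomes a manager.
docker swarm init --advertise-addr 10.0.0.1

# On each additional node: join as a worker, using the token that
# "docker swarm init" printed (placeholder shown here).
docker swarm join --token <worker-token> 10.0.0.1:2377

# On a manager: run a replicated, self-healing service across the swarm.
docker service create --name web --replicas 3 -p 80:80 nginx
```

If a node running one of the replicas fails, the swarm managers reschedule the missing containers on healthy nodes, which is what makes the cluster self-healing.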
ECS evolves with managing containers
Many large-scale businesses currently run containers in production on AWS. Some make use of ECS, while others choose to manage their own clustering services to avoid vendor lock-in. But AWS recently added some compelling functionality, such as Application Load Balancing, to assist those who want to offload the burden of managing container infrastructure.
Application Load Balancing, which supports dynamic ports, eliminates port conflicts when running multiple containers. This feature also allows multiple services to share a single load balancer.
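In a task definition, dynamic port mapping is requested by setting the host port to 0, so ECS assigns an ephemeral host port to each container and registers it with the load balancer's target group. This fragment is a hypothetical example of that setting:

```json
"portMappings": [
    {
        "containerPort": 80,
        "hostPort": 0,
        "protocol": "tcp"
    }
]
```

Because each task gets its own host port, multiple copies of the same container can run on a single instance without port conflicts.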
Developers can also assign an IAM role to an ECS task. The ECS task is based on a task definition, which can include multiple containers to support a single service or an entire application. Applications running in a container can assume the roles assigned via IAM. This eliminates the need to provide AWS credentials in the application code.
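A sketch of a task definition that assigns an IAM role to a task; the account ID, role name and image are hypothetical:

```json
{
    "family": "web-app",
    "taskRoleArn": "arn:aws:iam::123456789012:role/webAppTaskRole",
    "containerDefinitions": [
        {
            "name": "web",
            "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/web-app:latest",
            "memory": 256,
            "essential": true
        }
    ]
}
```

At runtime, the AWS SDKs inside the container pick up temporary credentials for the task role automatically, so no access keys need to be baked into the image or the application code.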
ECS also supports Service Auto Scaling, which enables IT teams to scale ECS services in the same manner as EC2 instances. Scaling policies can use CloudWatch alarms to scale the number of running tasks out or in as application demand requires.
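As an illustrative sketch, Service Auto Scaling is configured through the Application Auto Scaling API: the service's desired count is registered as a scalable target, then a scaling policy tied to a CloudWatch alarm is attached. Cluster and service names, capacity limits and the policy file are hypothetical:

```shell
# Allow the service's desired count to scale between 2 and 10 tasks.
aws application-autoscaling register-scalable-target \
    --service-namespace ecs \
    --resource-id service/demo-cluster/web \
    --scalable-dimension ecs:service:DesiredCount \
    --min-capacity 2 \
    --max-capacity 10

# Attach a step scaling policy; a CloudWatch alarm (for example, on
# service CPU utilization) triggers it to add or remove tasks.
aws application-autoscaling put-scaling-policy \
    --policy-name web-scale-out \
    --service-namespace ecs \
    --resource-id service/demo-cluster/web \
    --scalable-dimension ecs:service:DesiredCount \
    --policy-type StepScaling \
    --step-scaling-policy-configuration file://scale-out.json
```

The pattern mirrors EC2 Auto Scaling, except the unit of scaling is the ECS task rather than the instance.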