IT teams release software changes these days at a breakneck pace. To achieve this level of efficiency, IT teams follow Agile software development practices and embrace the DevOps mindset. One key principle behind this efficiency is getting fast feedback when new changes are integrated into an application. To accelerate this process, many organizations use containers in conjunction with a cloud platform, such as Docker and AWS, to rapidly release multiple application updates on a daily basis.
Amazon Elastic Compute Cloud (EC2) instances are great for running custom application code, but they take time to launch, and developers have to wait for the operating system to boot. Containers, on the other hand, are portable and lightweight, and they don't carry the baggage that comes with a typical VM. Instead, containers use process isolation on an existing host to claim a portion of its CPU, memory and storage resources. IT teams can keep a set of clustered hosts running on Amazon EC2, which allows them to spin up containers quickly.
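Docker exposes this resource carving directly: flags on the docker run command cap how much of the host a container may claim. A minimal sketch, assuming an image named my-app that serves a web app (the image name and limits are illustrative):

```shell
# Launch a container limited to a slice of host resources:
# --cpu-shares weights this container's CPU access relative to others,
# -m caps its memory at 256 MB.
docker run -d --cpu-shares 512 -m 256m --name web my-app:latest
```

Because the host OS is already booted, the container starts in seconds rather than the minutes an EC2 instance launch can take.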
Docker is the de facto open source technology for containers, and running Docker and AWS in tandem can be a strong fit for continuous delivery. The Amazon EC2 Container Service (ECS) allows developers to run Docker containers on a managed cluster of EC2 instances. They can also run Docker on a single EC2 instance, if needed.
Continuous delivery with Docker and AWS
In addition to Amazon ECS, AWS offers several features to help IT teams build a container-based continuous delivery pipeline. Here's a rundown of how to set this up on AWS.
- Local development: When working locally, developers can use Docker Toolbox on Mac OS X or Windows, which lets them test applications in the same container environment that will run in production. To get started, developers create a Dockerfile that defines the container's settings, such as the base image to use and which software packages to install. They can use the docker run command to launch the container locally on a development machine to ensure the application works as expected. For multi-tier apps that require multiple containers, developers can use Docker Compose, which spins up multiple containers defined in a docker-compose.yml configuration file.
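As a sketch of these two files, assume a hypothetical Node.js web app with a Redis cache; the base image, port and service names are illustrative:

```dockerfile
# Dockerfile: base image, dependencies and startup command
FROM node:6
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
EXPOSE 3000
CMD ["npm", "start"]
```

```yaml
# docker-compose.yml: the web tier plus a Redis tier
version: '2'
services:
  web:
    build: .
    ports:
      - "3000:3000"
    depends_on:
      - redis
  redis:
    image: redis:3
```

With these in place, docker-compose up brings up both containers on the development machine.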
- Source control: Once developers are satisfied with the application, they can commit code to a version control repository. At this point, an orchestration tool is necessary to poll the source control repository for changes and kick off the rest of the pipeline. An easy way to do this is to use AWS CodePipeline, which can be configured to poll AWS CodeCommit for changes. Developers can also use a repository hosted on GitHub. In addition to the application code, store the Dockerfile and docker-compose.yml file in this repository.
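Committing the container definitions alongside the code might look like the following; the repository URL, region and file paths are illustrative:

```shell
# Version the container definitions with the application code
git add Dockerfile docker-compose.yml src/
git commit -m "Add container definitions for the app"

# Push to a CodeCommit repository that CodePipeline polls
git remote add origin https://git-codecommit.us-east-1.amazonaws.com/v1/repos/my-app
git push origin master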
- Build: When a developer commits code to source control, CodePipeline triggers a build stage. There are a number of build servers compatible with CodePipeline; the Jenkins build server is a popular choice, but other third-party tools are available. During the build stage, the application source code is compiled and a docker build runs to create container images. These images must be accessible from either a public or private container registry, so the build stage pushes them to Docker Hub or to the EC2 Container Registry (ECR).
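The build-and-push step the Jenkins job runs could be sketched as follows; the account ID, region and repository name are illustrative:

```shell
# Build the image from the Dockerfile in the repository
docker build -t my-app:1.0 .

# Authenticate Docker against EC2 Container Registry
eval $(aws ecr get-login --region us-east-1)

# Tag and push the image so the deploy stage can pull it
docker tag my-app:1.0 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:1.0
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:1.0
```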
- Test: Another common practice for using containers on AWS is to create one or more test stages in CodePipeline. For example, a test stage could run unit tests against the application code before any containers launch. After everything is up and running, another stage can run integration tests to ensure the application is functional, accessible and serving up the right content.
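A unit test in this stage exercises the code before any container exists. A minimal sketch in Python, where health_payload is a hypothetical application function behind the app's /health endpoint:

```python
# Hypothetical application function plus a unit test that a
# CodePipeline test stage could run before launching containers.

def health_payload(version):
    """Build the payload the app serves from its /health endpoint."""
    return {"status": "ok", "version": version}

def test_health_payload():
    payload = health_payload("1.0")
    assert payload["status"] == "ok"
    assert payload["version"] == "1.0"

if __name__ == "__main__":
    test_health_payload()
    print("unit tests passed")
```

The later integration stage would instead hit the running containers over HTTP, which is why it runs only after deployment to a test environment.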
- Deploy: After the container images pass all stages, developers can push changes into production, and the containers will launch from the images pushed to the container registry during the build stage. Deployments to production can happen automatically, which could be considered continuous deployment, or the latest version can simply be marked deployable until the team is ready to release it. IT teams can set up the deployment stage of a continuous delivery pipeline with Docker and AWS in a number of ways, including interfacing directly with ECS or going through AWS Elastic Beanstalk, which also supports single- and multi-container Docker environments.
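For the Elastic Beanstalk route, a single-container deployment is described by a Dockerrun.aws.json file that points at the image pushed during the build stage. A minimal sketch; the image URL and port are illustrative:

```json
{
  "AWSEBDockerrunVersion": "1",
  "Image": {
    "Name": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:1.0",
    "Update": "true"
  },
  "Ports": [
    { "ContainerPort": "3000" }
  ]
}
```

Updating the image tag in this file and redeploying the environment rolls the new container version into production.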