- Mike Pfeiffer, CloudSkills.io
With the current rate of innovation in cloud computing, the industry is loaded with acronyms and buzzwords that, on the surface, might be misleading or just plain confusing. You may have heard about people building applications using serverless computing platforms or designing software that runs on a microservices architecture. Even though these ideas sound like hype, the reality is that they're changing the way businesses build, deploy and operate applications.
A serverless computing architecture is a way for developers to build applications without having to think about servers. It's simply a layer of abstraction that enables developers to focus on writing code while ignoring the concept of servers and traditional infrastructure.
In 2014, Amazon released AWS Lambda, a service that enables developers to create cloud-based functions that run on an existing fleet of managed instances. AWS later released its API Gateway service, which can be used to provision a public endpoint to invoke Lambda functions over HTTP. Together, AWS Lambda and API Gateway help organizations build web, mobile and internet of things back ends that are inherently scalable and require no servers.
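In practice, a Lambda function is just a handler that receives an event and returns a response. Here is a minimal Python sketch of such a function, assuming API Gateway's proxy-integration event format; the greeting logic is purely illustrative:

```python
import json

def lambda_handler(event, context):
    """Entry point that Lambda invokes for each request routed through API Gateway."""
    # With proxy integration, API Gateway passes query string parameters in the event.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")

    # Return a response object in the shape API Gateway expects.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Because the function is stateless, the provider can invoke it on any instance in its managed fleet without coordination between requests.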
AWS is considered a leader in the serverless computing architecture category, but it isn't the only provider. Microsoft, Google, IBM and others have released similar offerings, broadly known as functions as a service (FaaS). Developers can work with any of those providers to create and run functions that enable a serverless application architecture.
Improve scalability and efficiency
Regardless of the FaaS platform, serverless computing architectures get a lot of attention. The following are a few key features of a FaaS platform.
Reduced complexity. Building highly available application architectures is easier because of public cloud platforms like AWS and Azure. It's fairly straightforward to spin up an autoscaling group of VMs behind a load balancer. You can even stretch application architectures across multiple regions to enable geographic redundancy. With a serverless computing architecture, the infrastructure goes away. Developers can focus completely on writing the code for the functions that power their applications. The cloud provider manages the servers that invoke those functions, so they are highly available.
Built-in scalability. One of the trickiest parts of building applications for web scale in the cloud is tuning the autoscaling configuration for virtual servers. You have to find the right balance to make sure you can scale out based on a spike in traffic and then scale back when things subside. It sounds easy, but every application has its own behavior, and these settings must be dialed in to optimize costs and provide the best performance. With serverless computing, this is another task that gets offloaded to the provider, and the customer is free to focus on the application. For example, the AWS Lambda service runs on a fully managed fleet of Elastic Compute Cloud instances. The service scales your application based on the code being executed, and developers or operations engineers no longer need to manage VMs or autoscaling groups.
Elimination of idle resources. Another big benefit of serverless computing is that you're charged only for the time it takes to run your code. With traditional servers, some resources are left unused, even with autoscaling applications. But with serverless, you pay only if someone actually uses your application. You don't have to pay an hourly charge to run a VM that may or may not do any work. AWS offers sub-second metering with its Lambda service, billing code execution in 100-millisecond increments. Other FaaS providers offer similar pricing.
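To see how this pricing model plays out, here is a rough Python sketch of the billing arithmetic. The per-request and per-GB-second rates below are illustrative defaults, not current published prices:

```python
def lambda_cost(invocations, avg_duration_ms, memory_mb,
                price_per_request=0.0000002,      # illustrative: $0.20 per 1M requests
                price_per_gb_second=0.00001667):  # illustrative GB-second rate
    """Estimate a function's bill; duration is rounded up to 100 ms increments."""
    # Round each invocation's duration up to the next 100 ms billing increment.
    billed_ms = -(-avg_duration_ms // 100) * 100
    gb_seconds = invocations * (billed_ms / 1000) * (memory_mb / 1024)
    return invocations * price_per_request + gb_seconds * price_per_gb_second
```

The key point is the first term of the sum: if `invocations` is zero, the cost is zero, which is exactly the "no idle resources" property.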
Inside serverless computing architecture
To understand how a serverless application is built, let's take a look at a common web application using an AWS serverless architecture.
As you can see in Figure 1, the application uses several AWS services. Let's break down the architecture.
API tier. In this scenario, AWS Lambda and API Gateway power the web app back end. Developers can write discrete, stateless Lambda functions to handle create, read, update and delete operations for a variety of resources the application supports. The front-end code calls Lambda functions through the API Gateway to do the heavy lifting for the application.
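As a sketch, one such stateless handler might route operations by HTTP method. The `items` resource and the stub operations below are hypothetical stand-ins for real application logic:

```python
import json

def items_handler(event, context):
    """Hypothetical stateless handler routing CRUD operations for an 'items' resource."""
    method = event.get("httpMethod", "GET")
    # Map each HTTP method sent by API Gateway to a discrete operation.
    operations = {
        "POST": ("created", 201),
        "GET": ("fetched", 200),
        "PUT": ("updated", 200),
        "DELETE": ("deleted", 204),
    }
    if method not in operations:
        return {"statusCode": 405, "body": json.dumps({"error": "method not allowed"})}
    action, status = operations[method]
    return {"statusCode": status, "body": json.dumps({"result": action})}
```

Alternatively, each operation can be its own Lambda function, with API Gateway routing methods to functions; either way the handlers stay small and stateless.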
Database tier. Persistent data can be stored in other managed services like Amazon DynamoDB, which is a NoSQL database service, or Amazon Relational Database Service. The AWS Lambda functions can communicate directly with these services to retain data.
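For illustration, here's a Python sketch of a helper a Lambda function might use to persist data to DynamoDB. The table object is passed in, so anything exposing a compatible `put_item` method works; the `save_user` name and its attributes are invented for this example:

```python
def save_user(table, user_id, email):
    """Persist a user record to a DynamoDB table.

    `table` is expected to behave like a boto3 DynamoDB Table resource,
    e.g. table = boto3.resource("dynamodb").Table("users")  # hypothetical table name
    """
    item = {"user_id": user_id, "email": email}
    table.put_item(Item=item)  # DynamoDB Table resources accept an Item keyword
    return item
```

Passing the table in (rather than constructing it inside the function) also keeps the data-access code testable outside AWS.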
Remember, this is just one example of a serverless computing architecture. Mobile and internet of things back ends, along with real-time stream processing, are other use cases for serverless computing.
Amazon and Netflix have used the microservices architectural style for years. The idea with microservices is to break one large application into a collection of task-oriented services.
Business apps are typically built as a single, monolithic unit. These often include the front end of the application that serves up HTML to end users, the back-end server-side code that handles the heavy lifting and a database tier for storing and retrieving data. Monolithic application architectures worked great for years. But as applications grow to support a large number of users, updates become more difficult. Because components of a monolithic application are tightly coupled, even a slight change to the codebase could require a completely new version of the app.
As an alternative, a number of smaller microservices can collectively power an entire application. Because microservices typically have few responsibilities, they focus on doing a single job or performing one task that supports the overall application.
Benefits of microservices
So why should a business break up its monolithic applications to use a microservices architecture? There are a few solid reasons.
Fault isolation. When each component of the application operates as a separate service, it can fail without crashing the entire system. Smaller teams responsible for their respective microservices can iterate quickly and make changes to their codebase without seeing bugs cascade into the entire application. This reduces overall downtime for the application and improves productivity for smaller teams building and supporting a specific microservice.
Loose coupling. Each microservice is completely independent and can run its own technology stack. As long as other services within the application can communicate with the microservice using a nonproprietary HTTP API, the underlying implementation of the microservice can change at any time. This enables teams to implement the technology that makes the most sense and removes the need to commit the entire application to a single technology stack.
Reduced barrier to entry. Smaller microservices are less complex and thus easier to understand. This makes it easier to add new developers to a team that's working with them.
Unlike FaaS, which currently makes use of public cloud services, microservices architectures can run on premises or in the cloud.
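The loose-coupling idea can be sketched in a few lines of Python using only the standard library: one service exposes an HTTP API, and a consumer talks to it strictly through that contract, never its internals. The "inventory" service and its stock data are hypothetical:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class InventoryHandler(BaseHTTPRequestHandler):
    """A stand-in 'inventory' microservice. Callers see only its HTTP API, so the
    implementation behind it (language, framework, data store) can change freely."""

    STOCK = {"widget": 3}  # hypothetical in-memory data store

    def do_GET(self):
        sku = self.path.strip("/")
        body = json.dumps({"sku": sku, "in_stock": self.STOCK.get(sku, 0)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

def start_inventory_service():
    """Run the service on an ephemeral localhost port and return its base URL."""
    server = HTTPServer(("127.0.0.1", 0), InventoryHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return f"http://127.0.0.1:{server.server_port}"

def check_stock(base_url, sku):
    """Another service consuming inventory strictly through its HTTP contract."""
    with urlopen(f"{base_url}/{sku}") as resp:
        return json.loads(resp.read())
```

Nothing in `check_stock` depends on how the inventory service is built; swapping its implementation requires no change to its consumers as long as the HTTP contract holds.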
While it is common to build microservices using containers, an emerging practice is to build a microservices architecture with serverless functions. Figure 2 illustrates this approach using services from AWS.
Figure 2 shows three individual microservices -- each built using a set of AWS Lambda functions that handle the operations for the service. Each set of these functions sits behind its own API gateway, which other microservices and components can use. You could, for instance, have a static website in Amazon S3 using front-end code to call one or more of the microservice APIs within the overall architecture.
Before you commit
There's no doubt that the patterns and practices discussed here have big advantages. Still, there are important things to consider before diving into serverless microservices.
Developer tooling. One of the biggest gripes at this stage is developer tooling. IT teams working on large-scale serverless app developments need to manage dependencies with a handful of tools. Expect this to improve drastically over the next few years.
Service limits. Each provider will have its own limits on how long functions can execute and how much capacity will be available for application code and dependencies. This is something you should watch when looking into serverless applications.
Operational maturity. Deploying and supporting a microservices architecture is not for the faint of heart. Sure, Amazon, Microsoft and Netflix have used these patterns with great success. But not every team has similar technical skills. If you're considering a serverless microservices approach, be ready to hire talented workers and to provide training so that existing staff will be productive.