AWS Lambda is a popular choice for developers building a serverless architecture, but Google and Microsoft have narrowed the gap, prompting IT teams to look beyond the AWS cloud.
An AWS serverless architecture comes with unique challenges and concerns around lock-in, security and tools. Developers struggle to visualize Lambda functions, which are more difficult to assess than typical virtual machines. Still, many enterprises opt to build an AWS serverless architecture so they can focus on application logic instead of server management.
Peter Sbarski, VP of engineering at A Cloud Guru, head of Serverlessconf and author of the book Serverless Architectures on AWS, spoke with SearchAWS about AWS serverless architecture concerns and why serverless technology continues to grow.
AWS Lambda was a forerunner for serverless computing, but competitors are catching up. How do the major serverless vendors stack up in terms of adoption?
Peter Sbarski: Lambda, at the moment, is the leader, without a doubt. But other vendors are starting to catch up and they are investing serious resources. … Azure Functions is really coming up; they are doing a lot of great work to solve some of the pain points for people who use a serverless approach. Obviously, IBM OpenWhisk is doing quite well. And, finally, Google is coming up. [Google Cloud Functions] still may be in beta … but anybody can actually use it now and try it out. So, yeah, absolutely, it's a rapidly growing and an evolving environment right now for these technologies.
You built the serverless calculator on GitHub to help developers estimate costs. What are some common mistakes developers make that result in unnecessary serverless costs?
Sbarski: If you are doing sustained processing, if you are executing functions all the time at a high rate -- nonstop -- you may end up actually paying more than you would if you had a server that you were running. [But] say you have an e-commerce site, and you have these irregular spikes when people come to your site, buy some products, maybe have a Black Friday that really spikes up. You might find that with these irregular workloads, going with a serverless approach is a lot cheaper than having to run and manage a server, for example. This is from a purely cost perspective.
I developed this serverless calculator so developers can estimate how many executions they may have a month and how long the function may run, and get an indicative cost. It's obviously very important to keep other development principles in mind. You wouldn't want to execute your function needlessly. You need to make sure that if you have a function executing on a regular schedule, you don't execute it too often.
Let's pretend that your function takes 101 milliseconds to execute on average; if that happens, you'll be charged for 200 milliseconds because AWS will round up. If you can reduce the function execution by 2 milliseconds, if you can drop it down to 99 milliseconds, then you'll be charged for the 100 milliseconds that the function executes. So, you have effectively reduced your bill by half.
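Sbarski's rounding point can be sketched in a few lines. This is a rough estimator, not the official calculator; it assumes the classic Lambda pricing model where each invocation is billed in 100-millisecond increments, and the rate constants shown are illustrative placeholders:

```python
import math

def lambda_cost(executions_per_month, avg_duration_ms, memory_mb,
                price_per_gb_second=0.00001667, price_per_request=0.0000002):
    """Rough monthly Lambda cost, assuming each invocation is rounded
    up to the nearest 100 ms (illustrative rates, not current pricing)."""
    billed_ms = math.ceil(avg_duration_ms / 100) * 100
    gb_seconds = executions_per_month * (billed_ms / 1000) * (memory_mb / 1024)
    return gb_seconds * price_per_gb_second + executions_per_month * price_per_request

# A 101 ms function is billed as 200 ms; trimming it to 99 ms
# halves the compute portion of the bill.
slow = lambda_cost(1_000_000, avg_duration_ms=101, memory_mb=512)
fast = lambda_cost(1_000_000, avg_duration_ms=99, memory_mb=512)
```

The per-request fee is unchanged either way, which is why only the compute charge, not the whole bill, is cut in half.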
How should developers approach security with AWS serverless architectures compared to what they do with traditional architectures?
Sbarski: Make sure to sanitize any input that goes into your Lambda function. So, somebody issues the request from the website or your mobile app, that request goes to the [Amazon] API Gateway [and] API Gateway invokes your Lambda function. Your Lambda function has to check what this input is. You must not just blindly execute whatever you've been given. You may have to sanitize or check what you've been given. You have to apply common sense and think about those concerns.
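The validate-before-executing advice translates directly into handler code. Below is a minimal hypothetical sketch of a Python Lambda handler behind API Gateway; the `user_id` field and its alphanumeric rule are invented for illustration, not from the interview:

```python
import json

def handler(event, context):
    """Hypothetical Lambda handler invoked by API Gateway:
    validate the request body before doing any work with it."""
    try:
        body = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "malformed JSON"})}

    user_id = body.get("user_id")
    # Whitelist-style check: accept only the shape we expect;
    # never pass raw input straight into queries or shell commands.
    if not isinstance(user_id, str) or not user_id.isalnum():
        return {"statusCode": 400, "body": json.dumps({"error": "invalid user_id"})}

    return {"statusCode": 200, "body": json.dumps({"user_id": user_id})}
```

The point is the pattern, not the specific field: every Lambda function that receives outside input should reject anything it doesn't recognize before acting on it.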
I think the security situation overall becomes easier to manage with a serverless approach. In some ways, I think it's going to be harder to compromise serverless systems because now you have Amazon or another vendor looking after the fleet of servers that's executing your code. On the other hand, the surface area for attack is a lot bigger -- especially if you use multiple managed services. We use Lambda and Firebase and Auth0 for authentication and authorization, so, suddenly, that surface area is a lot larger. You have to think about how to secure those services and how to do it properly.
Are lock-in concerns around Lambda overblown? Or is lock-in more of a concern with applications running on Elastic Compute Cloud instances?
Sbarski: I see lock-in a little bit differently than most people. The danger of lock-in is not in Lambda. … You can wrap [Lambda] functions in the Express framework and run them off your own server if you want to. The lock-in comes from the other services that you may end up using with Lambda.
Let's say you have an S3 (Simple Storage Service) bucket that invokes Lambda whenever a file is placed into this bucket. If you have to move away from AWS to somewhere else, will you be able to recreate the same system there? Will you be able to have block storage or some kind of file storage and have that invoke your function? What if you use other services? We use Elastic Transcoder. We use DynamoDB as well. We use a whole bunch of different services in AWS. So, I think [relocating] those services becomes a little trickier and more complex, not functions.
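One common way to act on Sbarski's point is to keep business logic free of cloud-specific types, with only a thin adapter bound to the Lambda trigger. This is a hypothetical sketch (the `resize_request` logic and event shape are assumed, with the event shape modeled on S3 notification records), not a pattern from the interview:

```python
# Core logic knows nothing about AWS, so it could be re-hosted behind
# Express, Flask, or another cloud's storage trigger without changes.
def resize_request(filename: str) -> dict:
    """Hypothetical business logic: decide how to handle an uploaded file."""
    if not filename.lower().endswith((".jpg", ".png")):
        return {"action": "skip", "file": filename}
    return {"action": "resize", "file": filename}

def s3_handler(event, context):
    """Thin Lambda adapter: unpack S3 event records, delegate to core logic.
    Only this shim would need rewriting after a move off AWS."""
    results = []
    for record in event.get("Records", []):
        key = record["s3"]["object"]["key"]
        results.append(resize_request(key))
    return results
```

With this split, the hard part of a migration is what Sbarski describes: replacing the surrounding managed services and their triggers, not the function body itself.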
How is serverless tooling evolving within the AWS Marketplace and with other vendors?
Sbarski: Initially, there wasn't really anything out there. Serverless Framework was one of the first tools. Serverless Framework allows you to define your serverless architecture and deploy it. It's a very effective framework for organizing serverless applications. There was also Apex, Claudia and a few others. Now, there is the Serverless Application Model from AWS. … It allows you to define your serverless application and deploy it.
I would still say that Serverless Framework, at the moment, is by far the leader in this space. They are moving and maturing really quickly. And [the Serverless Framework is] also going across cloud; so now they are supporting Azure Functions, and they're supporting IBM OpenWhisk. Effectively, in one place, you can define a system that spans multiple clouds.
What's at the top of your serverless wish list?
Sbarski: I think tooling would definitely be up there. I think offline, local debugging would be at the very top as well. I would love to be able to locally, on my computer, run my entire serverless application in some way to be able to test, run and debug [offline]. I can run individual functions without a big problem, but the issue becomes running the whole application -- especially if you use other services. You can run a simple function, but how do you also simulate other services, like [Simple Notification Service] and S3 and Dynamo and Firebase? That becomes very hard to simulate all those services and all the interactions. If I could do that locally on my computer and run it and see it, that would be a big thing.