
Lambda support for OpenFaaS among wish-list items

Lambda got a long-awaited upgrade with SQS support, but AWS has more work to do. Here are a few more integrations that could improve Lambda and expand its uses.

Based on user feedback, Amazon continues to evolve its Lambda serverless platform. One recent example of this is Lambda's integration with Simple Queue Service -- a move that, for some enterprises, was long-awaited. But what's next?

There are still a few pain points with Lambda, such as limited options for programming languages, runtime limits and a lack of direct integration with third-party event sources. Meanwhile, Amazon continues to update its container service, Fargate, enabling users, for example, to trigger Fargate tasks from CloudWatch Events. So, it seems only natural for Docker and Lambda to begin to merge. And, as the open source framework OpenFaaS becomes increasingly popular with developers who aren't all-in on AWS, there's a chance we'll see more Lambda support for that function as a service (FaaS) model.

Containers and FaaS are natural allies

Lambda already uses containers under the hood, and some software engineers have even hacked Docker to run inside a Lambda function.

Docker, or at least the Dockerfile format, makes sense for Lambda functions in the long run for a number of reasons:

  1. Dockerfiles define both an install step and a run command.
  2. Docker lets developers run any language they want.
  3. Docker is already widely used and underlies OpenFaaS.

To optimize function startup time, Amazon could run the Dockerfiles' install step when new functions are uploaded and then reuse containers for executions. This would help address the issue of Lambda cold starts, as the install step would only happen when Lambda needs to scale more containers, rather than at the first execution.
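Lambda's existing container reuse already shows this split in miniature: code outside the handler runs once per container and is shared across warm invocations, while the handler runs on every call. Below is a minimal Python sketch of that pattern; the expensive lookup table is just a stand-in for real initialization work.

    import time

    # Runs once per container, when Lambda first loads the module. This is
    # the rough equivalent of a Dockerfile's install step: pay for heavy
    # setup on the cold start, not on every invocation.
    START = time.time()
    EXPENSIVE_LOOKUP = {n: n * n for n in range(1_000_000)}  # stand-in for real init

    def handler(event, context):
        # Runs on every invocation: the equivalent of the run step. Warm
        # containers reuse EXPENSIVE_LOOKUP instead of rebuilding it.
        key = event.get("key", 0)
        return {
            "value": EXPENSIVE_LOOKUP.get(key),
            "container_age_seconds": round(time.time() - START, 2),
        }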

Since Docker also lets developers customize their environments, the integration of Docker and Lambda would eliminate the need for Amazon to optimize the serverless platform for each new programming language fad. It would enable Lambda support for basic shell scripts and let Lambda keep up with popular languages, like Go and Node.js. In addition, this integration could vastly expand the developer ecosystem and further reduce lock-in concerns.

Third-party event sources

In addition to Docker integration, some developers would like to see Amazon extend its Lambda support for third-party event sources.

For example, Twitter and Facebook bots are a popular use case for Lambda functions. While Microsoft Azure Functions supports workflows that use Twitter and Facebook as event sources, Lambda developers have to use a workaround. To create a Twitter bot with Lambda, developers need to put API Gateway in front of a function, set up separate access tokens in Twitter and configure Twitter to call that HTTP endpoint.
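As a rough illustration of that workaround, here is a minimal Python handler sitting behind an API Gateway proxy integration. The header name, environment variable and tweet_create_events field are illustrative assumptions rather than a fixed Twitter contract; the point is how much plumbing a native event source would remove.

    import json
    import os

    def handler(event, context):
        # With the Lambda proxy integration, API Gateway passes the raw HTTP
        # request in `event`; Twitter's webhook would POST to this endpoint.
        body = json.loads(event.get("body") or "{}")

        # Hypothetical shared-secret check configured in the function's
        # environment; with a native event source this plumbing disappears.
        expected = os.environ.get("WEBHOOK_TOKEN", "")
        received = (event.get("headers") or {}).get("x-webhook-token", "")
        if expected and received != expected:
            return {"statusCode": 403, "body": "invalid token"}

        # Illustrative payload field; react to mentions here, for example by
        # calling the Twitter API or queuing work for another function.
        mentions = body.get("tweet_create_events", [])

        # Proxy integrations must return a well-formed HTTP response.
        return {"statusCode": 200, "body": json.dumps({"received": len(mentions)})}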

Instead, Amazon could work with Twitter and Facebook to add those event sources directly. Just as it did with the Serverless Application Repository and AWS Marketplace, Amazon could open up event sources to third parties and let them set their own prices. And, given AWS' leading position in the cloud market, the move could also benefit companies like Twitter and Facebook and help them gain developer support.

Longer runtimes

Despite the recent increase from five to 15 minutes, runtime limits are another common issue for Lambda developers. Even with the extended 15-minute limit, API Gateway connections still time out at 30 seconds, so a Lambda function attached to an API Gateway endpoint can't run for more than 30 seconds before the client gets an error.

Fargate is currently the best option for those who need to deploy long-running functions. With Fargate support for cron-style executions, developers can easily migrate long-running scripts to a serverless model. That said, Fargate is serverless in the sense that you only pay for what you use, run workloads on demand and never have to worry about EC2 instances, but it's still not quite FaaS, since each task still has to bootstrap its container and OS.
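For instance, a scheduled Fargate task can be wired up with a few boto3 calls against CloudWatch Events. The ARNs, subnet and rule name below are placeholders for your own resources; this is a sketch of the pattern, not a drop-in script.

    import boto3

    events = boto3.client("events")

    # A cron-style schedule rule that fires once an hour.
    events.put_rule(
        Name="hourly-report",
        ScheduleExpression="rate(1 hour)",
        State="ENABLED",
    )

    # Point the rule at a Fargate task. All ARNs and the subnet are placeholders.
    events.put_targets(
        Rule="hourly-report",
        Targets=[{
            "Id": "run-report-task",
            "Arn": "arn:aws:ecs:us-east-1:123456789012:cluster/example-cluster",
            "RoleArn": "arn:aws:iam::123456789012:role/ecsEventsRole",
            "EcsParameters": {
                "TaskDefinitionArn": "arn:aws:ecs:us-east-1:123456789012:task-definition/report:1",
                "TaskCount": 1,
                "LaunchType": "FARGATE",
                "NetworkConfiguration": {
                    "awsvpcConfiguration": {
                        "Subnets": ["subnet-0123456789abcdef0"],
                        "AssignPublicIp": "ENABLED",
                    },
                },
            },
        }],
    )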

In addition to triggering a Fargate task, a separate workaround for the runtime limit is to kick off another invocation of a Lambda function when the remaining time runs low. Developers can currently use the context.getRemainingTimeInMillis() method to accomplish this. However, they need to save state somewhere persistent and reload that state at the start of the next invocation.
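A minimal Python sketch of that re-invocation pattern follows. It uses get_remaining_time_in_millis(), the Python runtime's counterpart to the Node.js method, and passes state forward in the invocation payload rather than writing it to persistent storage, which only works for small state. The finished() and process_next_item() helpers are hypothetical placeholders for real work, and the 10-second margin is an arbitrary choice.

    import json

    import boto3

    lambda_client = boto3.client("lambda")

    def finished(state):
        return state["cursor"] >= 100  # placeholder completion check

    def process_next_item(state):
        return {"cursor": state["cursor"] + 1}  # placeholder unit of work

    def handler(event, context):
        state = event.get("state", {"cursor": 0})

        while not finished(state):
            state = process_next_item(state)

            # When the clock runs low, hand the current state to a fresh
            # asynchronous invocation of this same function and exit.
            if context.get_remaining_time_in_millis() < 10_000:
                lambda_client.invoke(
                    FunctionName=context.function_name,
                    InvocationType="Event",
                    Payload=json.dumps({"state": state}),
                )
                return {"status": "continued", "state": state}

        return {"status": "done", "state": state}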

A much simpler way to support long-running workloads would be Lambda functions that can execute for hours, or even days, instead of just 15 minutes.

Yes, limits should always be in place, mostly to prevent runaway scripts that cause massive charges or backlogs, but developers should be able to extend the limit to hours or days. AWS Greengrass, for example, enables developers to run Lambda-style functions on their own hardware, which does not impose the same 15-minute limitation. Since that framework for longer-running functions already exists, it seems natural for AWS to open up that option to its cloud-hosted model.

A matter of time

It's probably a stretch to expect OpenFaaS support for Lambda this year, but we could see longer runtimes and extended language support. Amazon recently introduced PowerShell support for Lambda, a move that represents a new class of languages -- shell scripting -- and shows Amazon's commitment to broadening Lambda use cases.

And, as Fargate evolves and approaches a true serverless model, expect AWS to continue to advance its FaaS platform as well and add Lambda support for more traditional workloads.
