What is serverless, and is it the future?

Introduction

In the five years since the major cloud providers launched them, serverless hosting and Function as a Service (FaaS) platforms have become increasingly popular. For those unfamiliar, serverless computing is an execution model in which the cloud provider runs the servers and dynamically allocates machine resources based on actual consumption. In contrast to the traditional model, in which server capacity is purchased up front, serverless computing resources exist only at execution time. All of the major cloud providers now offer one or more FaaS platforms, some of the most popular being AWS Lambda, Google Cloud Functions, and Microsoft Azure Functions. Is going serverless the right move for your organization? We’ll review why serverless has become so popular, how to implement serverless computing, and how to get the most out of your serverless platform.
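
To make the execution model concrete, here is a minimal sketch of what a function looks like on a FaaS platform, using an AWS Lambda handler written in Python; the handler name and event fields are illustrative rather than taken from any particular application.

```python
# handler.py -- a minimal AWS Lambda function.
# The platform calls handler(event, context) for each invocation;
# no server is provisioned or managed by the application team.
import json


def handler(event, context):
    # `event` carries the trigger payload (HTTP request, queue message, etc.).
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```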

Why Serverless

Why consider serverless for your business? Incorporating serverless computing into your application architecture allows engineers to focus on business functionality and spend less time on infrastructure. This is a huge win for development teams that would otherwise be maintaining antiquated hardware or manually managing systems, and it ultimately translates into cost and time savings, two of the most obvious benefits of moving away from traditional infrastructure. Applications that are rarely used can be significantly cheaper to run on a FaaS platform, since you are only billed while they are actively running. FaaS platforms also provide high availability, flexibility, efficiency, and scalability that is effectively unlimited for most workloads.

The largest benefit of serverless and FaaS is the corresponding change to software development life-cycle processes. Tooling for serverless deployments makes it easy to package and deploy applications securely within existing CI/CD pipelines. After an up-front investment to configure serverless hosting, development effort can focus almost entirely on application code. When new environments or deployments are required, there is little to no overhead to create new functions in your FaaS platform of choice. Some serverless tooling even lets you switch from one cloud provider to another with only minimal changes to your serverless deployment configuration. This greatly reduces the risk of vendor lock-in and allows for business flexibility.

Serverless Implementation

So, you’ve decided that you want to try serverless computing, but you’re not sure where to start? Lucky for you, much of the major tooling created to support serverless infrastructure is easy to use, well supported, well documented, and production-ready. Some of the most popular serverless libraries and systems include AWS SAM, Pulumi, Serverless Framework, Stackery, Apex, Terraform, and Zappa. Each of these tools provides a clean, declarative way to create serverless computing resources, and each has its pros and cons; what is right for one project may not be the best choice for another. For your use case, make sure that your chosen tooling meets your expectations with regard to supported technology stack, available FaaS platform integrations, and feature support.
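
As an illustration of this configuration-as-code approach, the sketch below uses Pulumi’s Python SDK (one of the tools listed above) to declare a single AWS Lambda function. The resource names, the ./app directory containing handler.py, and the chosen runtime are assumptions made for the example.

```python
# __main__.py -- a minimal Pulumi program declaring a serverless function.
# Assumes an ./app directory containing handler.py with a `handler` function.
import json

import pulumi
import pulumi_aws as aws

# Execution role the function assumes at runtime.
role = aws.iam.Role(
    "fn-role",
    assume_role_policy=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "sts:AssumeRole",
            "Principal": {"Service": "lambda.amazonaws.com"},
        }],
    }),
)

# Allow the function to write its logs to CloudWatch.
aws.iam.RolePolicyAttachment(
    "fn-logs",
    role=role.name,
    policy_arn="arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
)

# The function itself: ./app is packaged and deployed on each `pulumi up`.
fn = aws.lambda_.Function(
    "hello-fn",
    runtime="python3.12",
    handler="handler.handler",
    role=role.arn,
    code=pulumi.FileArchive("./app"),
)

pulumi.export("function_name", fn.name)
```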

Although there is an initial investment to learn serverless, this cost is greatly reduced on subsequent serverless projects. Due to the configuration-as-code nature of serverless deployments, much of the setup can be reused between applications, further reducing cost. It is also important to note that serverless is not a silver bullet: not every application design can run in a serverless manner, nor would anyone advise that every piece of infrastructure should be serverless. Incorporating serverless into your existing infrastructure can start as simply as choosing one application to pilot on a FaaS platform. A project that is infrequently used, runs quickly, and/or needs to operate at scale is a prime candidate for serverless conversion and would provide immediate benefit.

Getting the Most Out of Serverless

Properly utilizing your new serverless infrastructure will ensure that you get the most out of your investment. FaaS platforms and serverless tooling provide many features, and there are common patterns you can follow to successfully manage a newly serverless application. Two useful capabilities provided by most FaaS platforms are asynchronous task execution and event-driven function execution. Asynchronous task execution provides a mechanism by which serverless instances can be spawned to handle background work. As an example, rather than sending a user registration email in a blocking fashion and lengthening server response time, a new serverless instance can be created to send the email. Event-driven function execution is an elegant solution to a classic developer problem: how to trigger code based on actions in other services, for instance running a report generation script when new input files are received. Most FaaS platforms provide configurable ways to trigger serverless functions when events occur in other platforms and services.
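
The two patterns might look roughly like the following on AWS Lambda with boto3; the function name send-registration-email and the S3 trigger wiring are hypothetical and would be configured in your FaaS platform or deployment tooling.

```python
# Asynchronous task execution: hand work off to a Lambda function and return
# immediately. The function name "send-registration-email" is hypothetical.
import json

import boto3

lambda_client = boto3.client("lambda")


def register_user(email: str) -> None:
    # InvocationType="Event" queues the invocation and returns without waiting,
    # so the web request is not blocked while the email is sent.
    lambda_client.invoke(
        FunctionName="send-registration-email",
        InvocationType="Event",
        Payload=json.dumps({"email": email}),
    )


# Event-driven execution: a handler wired (via the platform's trigger
# configuration) to run whenever a new object is created in an S3 bucket.
def generate_report(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"New input file s3://{bucket}/{key}; generating report...")
```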

In addition to event-driven functions, developers can schedule jobs that execute particular functions. This is very similar to scheduled jobs on traditional servers, except that there is no underlying server monitoring and supervising the jobs: the application is only instantiated when the schedule triggers execution, and jobs run in separate execution environments, isolated from one another. This disconnected nature of serverless, along with the fact that there is no underlying hardware on which to investigate issues, means that proper logging and debugging are integral to success. In a serverless environment it is of the utmost importance to have robust logging that is aggregated remotely. This is easily achievable on most platforms by forwarding all application logs from serverless instances to a remote service such as Splunk, Sumo Logic, or Loggly. Similarly, debugging information and error tracking are important to capture with services such as Sentry, Datadog, or New Relic. By following these logging and debugging best practices, serverless applications can be easier to maintain than traditional servers.
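
A scheduled job with remote-friendly logging and error tracking might be sketched as follows; the schedule itself (for example, an EventBridge rule on AWS), the SENTRY_DSN environment variable, and the job name are assumptions for the example.

```python
# A scheduled job as a Lambda handler. The schedule (e.g. an EventBridge rule)
# is configured in the FaaS platform; the code only defines what runs.
import json
import logging
import os

import sentry_sdk
from sentry_sdk.integrations.aws_lambda import AwsLambdaIntegration

# Error tracking: report unhandled exceptions to Sentry (the DSN is assumed to
# be supplied through an environment variable).
sentry_sdk.init(
    dsn=os.environ.get("SENTRY_DSN"),
    integrations=[AwsLambdaIntegration()],
)

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def nightly_cleanup(event, context):
    # Structured log lines land in the platform's log stream (CloudWatch Logs
    # on AWS) and can be forwarded to Splunk, Sumo Logic, Loggly, etc.
    logger.info(json.dumps({"job": "nightly_cleanup", "status": "started"}))
    # ... perform the actual cleanup work here ...
    logger.info(json.dumps({"job": "nightly_cleanup", "status": "finished"}))
```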

Is Serverless the Future?

It is undeniable that serverless and FaaS platforms have a place in the future of infrastructure and operations; more recently, providers have even started offering serverless alternatives to traditional managed databases. Still, serverless is not a one-size-fits-all solution and should not be treated as such. Serverless functions fill a void that has existed since the start of cloud computing, providing a way to quickly build and deploy applications of limited scope. In traditional microservice or service-oriented architectures, serverless functions can be integrated directly into the existing ecosystem with minimal overhead and maintenance cost. Serverless computing also offers another way to decompose monolithic applications, and the resulting functions can be easier to manage than container-based services. Going forward, there are sure to be many more use cases for serverless computing as it continues to grow in market share and popularity.
