Characteristics of a Serverless Application
Now that you understand something about the philosophy around serverless, what are some of the characteristics of a serverless application? Though you may get varying answers as to what serverless is, the following traits and characteristics are generally agreed upon across the industry.
DECREASED OPERATIONAL RESPONSIBILITIES
When you decide to implement FaaS, the only thing you should have to worry about is the code running in your function. All of the server patching, updating, maintaining, and upgrading is no longer your responsibility. This goes back to the core of what cloud computing, and by extension serverless, attempts to offer: a way to spend less time managing infrastructure and spend more time building features and delivering business value.
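To make "the only thing you should have to worry about is the code running in your function" concrete, here is a minimal sketch of a FaaS function using the AWS Lambda Python handler signature (an event payload in, a response out). The greeting logic is a hypothetical stand-in for your own business logic:

```python
import json

def handler(event, context):
    # 'event' carries the request payload; 'context' holds runtime metadata.
    # Everything below this line is your responsibility; the servers,
    # patching, and scaling around it are the provider's.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

The entire deployable unit is this one function; there is no web server, process manager, or operating system for you to configure.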
HEAVY USE OF MANAGED SERVICES
Managed services typically assume responsibility for providing a defined set of features. They are serverless in the sense that they scale seamlessly, don't require you to operate servers or manage uptime, and, most importantly, require little to no code to use.
Benefits of a Serverless Architecture
These days there are many ways to architect an application. The decisions that are made early on will impact not only the application life cycle, but also the development teams and ultimately the company or organization. In this book, I advocate for building your applications using serverless technologies and methodologies and lay out some ways in which you can do this. But what are the advantages of building your application like this, and why is serverless becoming so popular?
One of the primary advantages of going serverless is out-of-the-box scalability. When building your application, you don’t have to worry about what would happen if the application becomes wildly popular and you onboard a large number of new users quickly—the cloud provider will handle this for you.
With the traditional approach, you often paid for computing resources whether or not they were utilized. This meant that if you wanted to make sure your application would scale, you needed to prepare for the largest workload you thought you might see regardless of whether you actually reached that point. This approach meant you were paying for unused resources for the majority of the life of your application.
With serverless technologies, you pay only for what you use. With FaaS, you're billed based on the number of requests for your functions, the time it takes for your function code to execute, and the memory reserved for each function. With managed services like Amazon Rekognition, you are charged only for the images and minutes of video processed; again, you pay only for what you use.
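The FaaS billing model described above can be sketched as a simple formula: cost scales with request count, execution duration, and reserved memory. The rates and parameter names below are illustrative placeholders, not any provider's actual pricing:

```python
def faas_cost(requests, avg_duration_ms, memory_gb,
              price_per_million_requests=0.20,
              price_per_gb_second=0.0000166667):
    # Per-request charge: a flat rate per million invocations.
    request_cost = requests / 1_000_000 * price_per_million_requests
    # Compute charge: duration (seconds) x reserved memory (GB),
    # summed across all invocations, billed per GB-second.
    gb_seconds = requests * (avg_duration_ms / 1000) * memory_gb
    compute_cost = gb_seconds * price_per_gb_second
    return request_cost + compute_cost
```

For example, one million invocations averaging 100 ms at 128 MB of reserved memory costs well under a dollar at these sample rates, and, crucially, zero invocations cost exactly zero.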
The bill from your cloud provider is only one part of the total cost of your cloud infrastructure; there are also the salaries of your operations staff. That cost decreases if you need fewer operations resources.
In addition, building applications in this way usually facilitates a faster time to market, decreasing overall development time and, therefore, development costs.
With fewer features to build from scratch, developer velocity increases. Being able to quickly spin up the types of features that are typical for most applications frees you to focus on writing the core functionality and business logic for the features you want to deliver.
When shipping a new feature, you often assess the risk (time and money involved with building the feature) against the possible return on investment (ROI). As the risk involved in trying out new things decreases, you are free to test out ideas that in the past may not have seen the light of day.
A/B testing (also known as bucket testing or split testing) is a way to compare multiple versions of an application to determine which one performs best. Because of the increase in developer velocity, serverless applications usually enable you to A/B test different ideas much more quickly and easily.
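One common building block of A/B testing is deterministic bucket assignment: hash a stable user identifier so each user consistently sees the same variant across sessions. A minimal sketch, with a hypothetical two-variant split:

```python
import hashlib

def assign_bucket(user_id, variants=("A", "B")):
    # Hash the user ID so assignment is stable across sessions
    # and roughly uniform across variants.
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    index = int(digest, 16) % len(variants)
    return variants[index]
```

In a serverless setup, a check like this could run inside the function handling each request, with the chosen variant logged alongside the outcome metric you want to compare.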