
So what exactly is serverless computing?

[Image: a large server room in a data-centre]

What does serverless mean?

Wikipedia defines Serverless as “a cloud computing execution model in which the cloud provider dynamically manages the allocation of machine resources”. It’s sometimes referred to as Functions-as-a-Service (FaaS), and the term was popularised by tools such as the Serverless Framework.

So how does it work if there’s no server?

Ok, all is not quite as it seems. There is still a server – in fact, lots of servers. There is no magic, but the beauty is that you don’t need to worry about them. You don’t pay for them directly or have to maintain them – all that pain is shifted to the cloud provider. It’s similar to shared web hosting: you don’t have a server to yourself but instead share one with several other people, and it’s up to the provider to decide how many clients each server can support. On the flip-side, you don’t get to decide what software is installed on the servers; you’re stuck with the functionality provided. The provider has gone out of their way to ensure you can’t even see the server.

What’s the point? Why should I use it?

Well, it means that you won’t have to worry about operating systems, patching, rebooting and most of your day-to-day maintenance.

Serverless architectures work well as a set of microservices, or as a glue between layers of an application.  They tie in well with messaging frameworks and can often be triggered by multiple means such as queues, files, HTTP calls, and database triggers.
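As a rough illustration, here’s what an HTTP-triggered function might look like using the Azure Functions Python programming model. This is only a sketch – the route, parameter name and greeting are invented for the example, and a real function also needs a small binding configuration (function.json) that isn’t shown:

```python
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # The platform invokes this function whenever the HTTP trigger fires;
    # the same body of code could instead be bound to a queue message,
    # a new blob, or a database change.
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!", status_code=200)
```

The same pattern applies on AWS Lambda or Google Cloud Functions; only the entry-point signature and the binding configuration change.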

Serverless platforms normally support multiple languages, such as .NET, JavaScript, and Python.  Serverless can also improve your time to market by speeding up deployment.

It often comes with built-in monitoring, such as Azure Application Insights.

The main benefit, however, is scale – both small- and large-scale applications can benefit, as a serverless architecture scales out automatically to cope with demand.  It prevents having a huge, expensive server sitting idle, or a tiny server being overwhelmed.  You can scale up and down without having to reboot, or having to do anything at all.  In fact, it’s something you don’t need to think about – the function will run on as much hardware as it needs, when it needs it.

It means that developers can concentrate on writing and deploying code (often automatically) without having to provision and configure servers, which lets you ship features more quickly.

How much is it?

The great thing about Serverless is that you only pay for what you use.  If your service is only called once per day, or even once per week, you only pay for that one call – not for all the time you’re not using it.

Serverless typically has a tiny cost per call – for example, Azure gives 1 million executions per month for free, and additional executions cost around £0.15 per million, which is negligible for most workloads.
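To make that concrete, here’s a back-of-the-envelope calculation using the figures above. It covers execution charges only – real bills also include a charge for memory and execution time, and prices change:

```python
FREE_EXECUTIONS = 1_000_000       # free grant per month (figure from above)
PRICE_PER_MILLION_GBP = 0.15      # approximate cost per million further executions

def monthly_execution_cost(calls: int) -> float:
    """Execution charges only; ignores the per-GB-second compute charge."""
    billable = max(0, calls - FREE_EXECUTIONS)
    return billable / 1_000_000 * PRICE_PER_MILLION_GBP

print(monthly_execution_cost(5_000_000))  # 4 million billable calls -> £0.60
```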

Although there’s no fixed monthly outlay, you are restricted in how long each execution can run, so you can’t have one execution that lasts a month!

What’s the alternative?

A server (duh!).  The traditional way of running a service or business requires physical servers, either in your building or in someone else’s data-centre.  Those servers run 24/7 and have significant costs in the energy used to power and cool them.  There’s a capital expense to buy them in the first place, and no guarantee you’ll get your capacity estimates right, so you’ll often under- or over-provision.  All of this costs money whether you use the server or not.  To top it off, within three years the hardware is out of date.

If you stick with a physical (on-premise or cloud) server, you get to decide exactly what you have installed on it, but you also have to worry about keeping both the operating system and any software up to date and secure.

Where can I use it?

All three major cloud providers offer it: Microsoft Azure offers “Azure Functions”, Amazon AWS offers “Lambda”, and Google Cloud offers “Google Cloud Functions”.

Why wouldn’t I use it?

Serverless is fairly new, and as a result a lot of the offerings are proprietary, so vendor lock-in can be an issue.  The other services that your functions will connect to (for example messaging or cloud databases) are often specific to the cloud vendor.  All serverless vendors offer HTTP triggers, which you can use with anything, but you may end up painted into a corner if you stick with the default services on offer.  There are improvements being made in this space, however.  A lot of the Azure code is open source on GitHub, and projects like OpenFaaS aim to make functions portable across different vendors’ clouds, and even able to run locally on your own hardware.

One downside is that there’s no local state or storage – each call is stateless and independent of the previous one.  This is a good architecture to follow; however, it may be difficult to port existing applications that were developed to depend on files on disk.  To move to a serverless architecture you may need to partially or completely re-write your application to de-couple different components into well-defined tiers.

The lack of state can be overcome by gluing together different functions or microservices using APIs to connect to databases, queues or blob storage.
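For example, instead of writing results to the local disk (which won’t survive between invocations), a function can push them to blob storage.  Here’s a minimal sketch using Azure’s blob storage SDK – the container name, blob naming scheme and connection-string setting are all assumptions for illustration:

```python
import os
from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

def save_result(order_id: str, payload: bytes) -> None:
    # Persist the result externally so any later, independent invocation
    # can pick it up - nothing is kept on the function's own filesystem.
    service = BlobServiceClient.from_connection_string(
        os.environ["STORAGE_CONNECTION_STRING"]  # assumed app setting
    )
    blob = service.get_blob_client(container="results", blob=f"{order_id}.json")
    blob.upload_blob(payload, overwrite=True)
```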

Cold start times can be a problem: depending on how ‘big’ your functions are and how many third-party libraries have to be loaded, it could take several hundred milliseconds for your function to spin up.  Your function is typically also prevented from running for too long – some vendors cut you off after a certain time, others charge by the second for however long your function runs.  This encourages small microservices, and thus more of them.  If each service does its own well-defined step, it can finish quickly and hand off the next task to a queue.  That also lets the calling function return quickly and avoids tying up valuable network resources such as sockets by keeping connections open too long, making it easier to scale your architecture horizontally.
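A sketch of that hand-off pattern, using Azure’s storage queue SDK purely as an example – the queue name, connection-string setting and the shape of the order are assumptions:

```python
import json
import os
from azure.storage.queue import QueueClient  # pip install azure-storage-queue

def handle_order(order: dict) -> None:
    # Do this function's single, well-defined step...
    validated = {**order, "validated": True}

    # ...then drop the next step onto a queue and return straight away,
    # rather than calling the next service synchronously and holding a
    # connection (and a socket) open while it works.
    queue = QueueClient.from_connection_string(
        os.environ["STORAGE_CONNECTION_STRING"],  # assumed app setting
        queue_name="orders-to-fulfil",
    )
    queue.send_message(json.dumps(validated))
```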

Another downside that’s not just relevant to FaaS but to clouds in general concerns who you’re sharing the infrastructure with.  Your code is not running on your own server, so you’re likely sharing that server with dozens or hundreds of other people.  That means you’re reliant on the vendor keeping your code and data separate and secure from everyone else’s.  Perhaps more importantly, you cannot guarantee what sort of performance you’ll get.  If another customer on your server suddenly starts using large amounts of CPU or memory, your application could be affected.  You’re reliant on the cloud vendor making sure they’ve got their provisioning right – which, of course, is exactly the problem you were trying to avoid worrying about.

Isn’t this just Platform as a Service?

Platform as a Service (also known as PaaS) runs all the time, whereas functions only run when called.  Typically, if the function can be spun up in less than 100 ms, it can be considered serverless.

In both cases, the cloud or PaaS vendor takes care of the underlying hardware, and you’re just given a platform or set of APIs to code against.

PaaS also normally forces you to consider scale at some level. You will typically be forced to choose a pricing tier, and pay a fixed amount per month to use ‘up to’ that amount of resource. Therefore, if your API suddenly goes viral (I’m not sure APIs go viral, but you get the idea), you will be capped at your current tier and will have to manually scale your subscription to deal with the new load. Because serverless just counts executions and charges you based on that, it will scale as far as you need.  On the flip-side, if your service suddenly gets millions of hits without you realising, or if someone starts calling your service in a while(true) loop, it could get very expensive very quickly as there’s no cap.

What about Docker?

I think the IT community is in general agreement that we no longer want to think about buying, racking, powering, cooling and patching our own servers, but there are now several alternatives to doing that.  One that pre-dates serverless is containerization, which is still growing in popularity.

Docker allows you to package a bundle of functionality, plus everything it needs to run, in a portable, agreed-upon format, while orchestrators such as Docker Swarm and Kubernetes take care of running those containers across your servers.  This makes deployment and versioning much easier, and upgrades are a case of destroying your container and deploying a new one.  It means that your servers all stay identical and you don’t end up with the special-snowflake server we’ve all seen, where no-one can remember exactly how it was built and why on earth it behaves differently to that server over there.  Each container is immutable.

Although containers, PaaS and FaaS all head in the right direction, serverless is again the only one where you don’t have to think about scale at all.  Docker and Kubernetes, and the cloud platforms that support them, still charge for the containers you keep running, and it’s a manual process to decide whether you need to add more or remove some to meet the demands of your customers at a sensible price point.

Can I fire my SysAdmin now?

In short, no – although if you’ve hired loads of them and they’re all still attached to the idea of wiring up server rooms, their time may be limited. Although there’s no hardware, there’s still lots to monitor to ensure everything is running smoothly. I believe there’s still a role for all the things a developer doesn’t like doing: monitoring logs, dealing with out-of-hours calls at 3 am, that sort of thing 🙂

Another major issue that isn’t going away any time soon is security.  Wherever your code is running, it still needs to be secured against attack.  This means familiarity with firewalls, VPNs, encryption and authorization/authentication. By lowering the barrier to entry and speeding up deployment, serverless can lead to shortcuts being taken and security being overlooked. A second pair of eyes will always be a good idea to challenge you to keep your company’s, and your customers’, data secure.
