The popular interpretation of Serverless is Functions-as-a-Service (FaaS): developers upload code that runs inside compute containers that are triggered by a variety of events, are ephemeral, and are fully managed by the cloud platform. FaaS obviates the need to provision, scale, or manage the availability of your own servers. The most popular FaaS offering is AWS Lambda, but Microsoft Azure Functions and Google Cloud Functions offer similar services.
Usually, in order to “invoke” a Serverless Function we use “triggers”. A trigger can be either a cron (time-based) trigger or an event-based trigger such as “a new row has been added to my database” or “a new file has been created in my filesystem”; the function performs an action in response to the event/trigger. Ultimately, the trigger issues a direct HTTP call to the Function URL.
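As a minimal sketch of this flow, here is a FaaS handler in Python, assuming AWS Lambda’s `(event, context)` handler signature. The event shape below is a simplified, hypothetical S3-style “a new file has been created” notification, used only for illustration:

```python
# Minimal sketch of a FaaS handler, assuming AWS Lambda's Python
# (event, context) signature. The event shape mimics a simplified
# S3-style "new file created" notification.
def handler(event, context):
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # React to the event -- here we just describe the action taken.
        results.append(f"processed s3://{bucket}/{key}")
    return {"status": "ok", "processed": results}

# Local simulation of the platform invoking the function on a trigger:
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "uploads"}, "object": {"key": "report.csv"}}}
    ]
}
print(handler(sample_event, context=None))
```

The platform, not your code, decides when `handler` runs: the trigger fires, the platform spins up (or reuses) a container, and passes the event in.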
A Regular Serverless Paradigm:
So What Does “Serverless at the Edge” Mean?
“The ability to set triggers and execute Functions in Edge locations.”
Imagine you deploy your code to one region of the Cloud and are then able to run it in any of the tens or hundreds of Cloud Edge locations around the globe. Imagine the reduction in network latency for your end users. Serverless at the Edge allows code to be executed at global Edge locations such as CDN PoPs (points of presence). Typical use cases include:
- HTTP request/response manipulation (e.g., change HTTP headers or cookies, or perform URL rewriting or redirection, without ever reaching your application servers)
- Dynamic content generation (custom error pages, signup pages, dynamic HTML)
- Latency optimization (return a 401 at the edge when authentication is missing, run custom validations and transformations)
While you could use a CDN to deliver pages faster, customized processing still required forwarding requests back to compute resources on centralized servers, which slows down the end-user experience. With Serverless at the Edge, you can run Functions in response to CDN events and customize the content delivered through your CDN at reduced network latency, with execution at the Cloud’s edge locations.
While Serverless at the Edge is a promising, cutting-edge technology, it still comes with limitations and constraints that developers and DevOps teams must work within in order to utilize it.
Today’s Lambda@Edge Limits
- Cold starts (new deployments must propagate to every edge location)
- Low RAM (128MB)
- Low space (1MB for code)
- Node.js only
- No external HTTP requests (as of today)
A Few More Trends in Edge-Serverless Computing
Peter Levine, Partner at Andreessen Horowitz, recently claimed that the “Cloud is Dead”, or that the end of cloud computing is right around the corner. Is it? Well, we can definitely see that we face a whole new era of decentralized computing, where drones, wearables, self-driving cars, IoT devices, smart homes, smart cities, and robots form the new “decentralized cloud” paradigm, just as we had clients and servers 15 years ago. We will soon see this “decentralized cloud” paradigm spread, rather than the centralized paradigm the Cloud introduced to us not too long ago.
The rapid growth of the Internet of Things (IoT) drastically increases the amount of data generated at the edge of the network; examples include sensor data and events generated by devices and gateways. While there are many ways to build a custom solution for data processing spanning the cloud and the edge, it would be beneficial to standardize on a programmable platform that makes it easy to develop, deploy, and operate custom data pipelines, preferably taking advantage of the emerging ‘Serverless’ event-driven paradigm.
Amazon Greengrass and Local Compute
Not long ago, AWS introduced a new sub-paradigm of Serverless (or FaaS) computing called “Greengrass”, which brings a new concept called “Local Compute”. It enables running Serverless Functions offline, on devices in the field. This is another great feature that extends the meaning of a “decentralized cloud”. Most of the use cases for this feature are in the IoT space.
Companies need “Local Compute” for the following reasons:
- Round Trip Latency
- Intermittent connectivity
- Expensive bandwidth
IoT devices have always been relatively low-powered, both in terms of CPU and local storage, which is why these devices are so reliant on the cloud. Still, occasionally we may want to do the computing right on the device, or to keep working when connectivity is down.
Andy Jassy, CEO of Amazon Web Services, said: “It’s easy to take advantage of the cloud to supplement the power of these devices, but there are going to be times where you don’t want to make the round-trip to the cloud, What we have heard repeated now from both companies that are using AWS’ IoT offering and device management — what they really want is to have on these devices is the same flexibility and program model to do compute as they have on AWS.”
Greengrass builds on top of AWS IoT and AWS Lambda, Amazon’s “Serverless” compute service. It will allow developers to write Lambda code (in Python) that can run right on the IoT device. The Greengrass Core runs these Lambda functions locally, but can also talk to the AWS cloud and allows IT admins to manage these devices and the code that runs on them.
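As a rough sketch of the Local Compute idea, here is a Greengrass-style Python function that filters sensor readings on the device and forwards only anomalies to the cloud, so round trips and bandwidth are spent only when needed. The threshold, topic name, and `publish_to_cloud` stub are all hypothetical; real Greengrass code would publish via the Greengrass SDK’s IoT data client instead of the stub used here:

```python
# Rough sketch of "Local Compute": a Greengrass-style Python function
# that processes sensor readings on the device and forwards only
# anomalies to the cloud. publish_to_cloud is a stub standing in for
# a real Greengrass SDK publish call, so the sketch stays self-contained.
THRESHOLD_C = 80.0  # hypothetical over-temperature threshold

forwarded = []  # stands in for messages actually published to the cloud

def publish_to_cloud(topic, payload):
    forwarded.append((topic, payload))

def local_handler(event, context):
    readings = event.get("readings", [])
    anomalies = [r for r in readings if r["temp_c"] > THRESHOLD_C]
    # Expensive bandwidth / intermittent connectivity: only anomalies
    # leave the device; normal readings are handled entirely locally.
    for r in anomalies:
        publish_to_cloud("sensors/alerts", r)
    return {"seen": len(readings), "forwarded": len(anomalies)}
```

If connectivity drops, the same handler keeps running locally; the device simply queues or drops the cloud publishes until the link returns.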
A Final Word
Serverless trends are evolving fast and being adopted rapidly by companies. AWS has stated that more than 50% of Cloud-based workloads will migrate to the Serverless compute paradigm within the next three years. In addition, Serverless is being adopted 10x faster than containers, showing that this ‘hype’ is real. Serverless at the Edge just makes it more interesting and powerful. In a world full of drones, smart cars, smart homes, and smart cities, “Local Compute” will be an essential part of every decentralized IoT device.