Set up complete CI and CD pipelines for your serverless applications using DevOps principles
Key Features
Book Description
Serverless applications are becoming very popular among developers and are generating a buzz in the tech market. Many organizations struggle with the effective implementation of DevOps with serverless applications. DevOps for Serverless Applications takes you through different DevOps-related scenarios to give you a solid foundation in serverless deployment.
You will start by understanding the concepts of serverless architecture and development, and why they are important. Then, you will get to grips with the DevOps ideology and gain an understanding of how it fits into the Serverless Framework. You'll cover deployment framework building and deployment with CI and CD pipelines for serverless applications. You will also explore log management and issue reporting in the serverless environment. In the concluding chapters, you will learn important security tips and best practices for secure pipeline management.
By the end of this book, you will be in a position to effectively build a complete CI and CD delivery pipeline with log management for serverless applications.
What you will learn
Who this book is for
DevOps for Serverless Applications is for DevOps engineers, architects, or anyone interested in understanding the DevOps ideology in the serverless world. You will learn to use DevOps with serverless and apply continuous integration, continuous delivery, testing, logging, and monitoring with serverless.
Copyright © 2018 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
Commissioning Editor: Gebin George
Acquisition Editor: Rahul Nair
Content Development Editor: Nithin George Varghese
Technical Editor: Mohit Hassija
Copy Editor: Safis Editing
Project Coordinator: Drashti Panchal
Proofreader: Safis Editing
Indexer: Priyanka Dhadke
Graphics: Tom Scaria
Production Coordinator: Aparna Bhagat
First published: September 2018
Production reference: 1280918
Published by Packt Publishing Ltd., Livery Place, 35 Livery Street, Birmingham B3 2PB, UK.
ISBN 978-1-78862-344-5
www.packtpub.com
To my lovely grandmother, Nemu, and my sweet mom, Padmavati. There are no words to describe their love and sacrifice.
Mapt is an online digital library that gives you full access to over 5,000 books and videos, as well as industry-leading tools to help you plan your personal development and advance your career. For more information, please visit our website.
Spend less time learning and more time coding with practical eBooks and Videos from over 4,000 industry professionals
Improve your learning with Skill Plans built especially for you
Get a free eBook or video every month
Mapt is fully searchable
Copy and paste, print, and bookmark content
Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.packt.com and as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at [email protected] for more details.
At www.packt.com, you can also read a collection of free technical articles, sign up for a range of free newsletters, and receive exclusive discounts and offers on Packt books and eBooks.
Shashikant Bangera is a DevOps architect with 18 years of IT experience. He has vast experience with DevOps tools across the platform, with core expertise in CI, CD, the cloud, and automation. He has helped his customers adopt DevOps; architected and implemented enterprise DevOps for domains such as banking, e-commerce, and retail; and also contributed to many open source platforms. He has designed an automated on-demand environment with a set of open source tools and also contributed to the open source arena with an environment booking tool, which is available on GitHub.
Ifemakin Olasupo is a certified systems and cloud expert with over a decade of IT experience. His background is in infrastructure engineering and he has a strong passion for automation. He has worked on and delivered cutting-edge projects for various multinational institutions and currently consults as the lead DevOps engineer for HM Revenue and Customs. He enjoys building event-driven automations and helping organizations build bespoke platforms for application delivery.
If you're interested in becoming an author for Packt, please visit authors.packtpub.com and apply today. We have worked with thousands of developers and tech professionals, just like you, to help them share their insight with the global tech community. You can make a general application, apply for a specific hot topic that we are recruiting an author for, or submit your own idea.
Title Page
Copyright and Credits
DevOps for Serverless Applications
Dedication
Packt Upsell
Why subscribe?
packt.com
Contributors
About the author
About the reviewer
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
Introducing Serverless
Introduction to serverless
Core concept
Backend as a Service
Function as a Service
AWS Lambda
Azure Functions
Google Functions
OpenWhisk
Other serverless architectures
Serverless benefits
Faster time to market
Highly scalable
Low cost
Latency and geolocation improvement
Serverless drawbacks
Increased complexity
Lack of tooling
Complexity with architecture
Drawback in implementation
DevOps with serverless
Summary
Understanding Serverless Frameworks
ClaudiaJS
Command-line library
API builder library 
Bot builder library
Apex
Zappa
Serverless Framework
Framework features 
Services and deployment
Functions and events
Variables and plugins
Resources
Setting up AWS access keys 
Installation of Serverless Framework
Lambda service and function deployment
Invoking locally
Deploying and invoking locally 
Summary
Applying DevOps to AWS Lambda Applications
Manual deployment of a Lambda function
AWS Lambda with DevOps
Serverless frameworks with AWS CodePipeline
Continuous integration and continuous deployment with Lambda
Setting up Jenkins for a serverless application
Automated testing for Lambda functions
Unit testing a deployed application
AWS Lambda pipeline
Prerequisites
Deployment methods
Canary deployment
Setting up a simple environment 
Setting up canary deployment 
Making sure the deployment works fine
Deploying CodeDeploy hooks 
Blue-green deployment
Integration of CloudWatch with ELK
Summary
DevOps with Azure Functions
Building a simple Azure function
Implementing continuous integration and continuous delivery with Azure Functions
Continuous integration with Azure Functions
Prerequisites
Setting up environment variables
Continuous deployment to Azure Functions
Setting up a continuous deployment Azure deployment tool
Blue-green deployment in Azure Functions
The deployment dashboard
Monitoring and logging
Accessing logs through Kudu
Logging information via table storage
Monitoring the Azure Function
Integrating with New Relic
Best practice
Source code management
Folder structure
Testing and static code analysis
Deployment and release
Summary
Integrating DevOps with IBM OpenWhisk
OpenWhisk
OpenWhisk setup and configuration
Triggers
Actions
Rules
Sequences
Building an OpenWhisk application
Creating a hello world OpenWhisk action
OpenWhisk with Serverless Framework
A simple OpenWhisk application
Continuous integration and continuous delivery with OpenWhisk
Setting up the toolchain and repository integration
Configuring the deployment
Continuous integration and continuous delivery with Serverless Framework
Continuous delivery pipeline to OpenWhisk
Deployment patterns with OpenWhisk
Canary deployment
Blue–green deployment
Dynamic dashboard
Activity Summary
Activity Timeline
Activity Log
OpenWhisk action logging management
Setting up ELK
OpenWhisk actions
OpenWhisk log forwarder
Summary
DevOps with Google Functions
CI and CD pipelines with Google Functions
Prerequisites for Cloud Functions
Cloud Functions through the GCP console
Cloud Function using a gcloud command line
Building and testing locally
CI and CD with testing
Source code management
Continuous integration and testing
Continuous delivery with Google Functions
Google environments
Monitoring and logging
Best practice
Summary
Adding DevOps Flavor to Kubeless
What is Kubeless?
Kubeless architecture
How to set up Kubeless
Setting up continuous integration and deployment
Creation of the service
Deploying the function
Invoking the function
Serverless logs
Continuous integration with Jenkins
Monitoring Kubeless
Kubeless pros and cons
Summary 
Best Practices and the Future of DevOps with Serverless
Important aspects of DevOps
Collaboration and tools strategy
Agile development
Version control everything
Capture every request 
Automate test and source code analysis
Continuous feedback
Time to market and the cycle time
Log metrics
Best practices for Serverless
One function, one task
Functions call other functions
Minimize libraries
With HTTP –  one function per route
Database connection to RDBMS
Use messages and queues
Data in motion
Measure the scale
DevOps best practices and troubleshooting for AWS Lambda
Source code versioning
Build
Test
Unit testing
Integration testing 
Performance testing
Monitoring
Deployment
Logging
Security 
An IAM role per function
No long-lived credentials
Do not persist secrets
Lambda in VPC
AWS Lambda best practices
Keep the handler independent of business logic 
Keep warm containers alive 
Dependency control
Shorter timeout for dependencies
Exception handling
Recursive coding
High availability
Runtime language
Code for cold and warm containers
Cost optimizing 
Best practices for Azure functions
Avoid large and long-running functions
Cross-function communication
Functions should be stateless
Functions should be defensive
The same function app should not have code for test and production
Use asynchronous code, but avoid blocking calls
Configure host behaviors to better handle concurrency
Best practices for Google Functions
Code idempotent functions
Signal the completion of function calls
Do not start background activities
Always delete temporary files
Local development
Error reporting
Use SendGrid to send emails
Use dependencies wisely
Use global variables to reuse objects in future invocations
Do lazy initialization of global variables
Summary
Use Cases and Add-Ons
AWS Lambda use cases and add-ons
AWS Lambda use cases
Serverless website
Video and image processing
Log processing and notification
Controlling the IoT
Backup and daily tasks
Continuous integration and continuous deployment
Monitoring
Static website
Warm-up
Azure Functions add-ons
Google Functions add-ons
Summary
DevOps Trends with Serverless Functions
The impact of Serverless on Ops
The direction of DevOps with serverless
Summary
Other Books You May Enjoy
Leave a review - let other readers know what you think
Serverless development gives developers the freedom to concentrate on development without worrying about the server-side aspects. This book aims to simplify your serverless deployment experience by showing you how to effectively apply DevOps principles to serverless applications. At each step along the way, you will be introduced to best practices for building a complete continuous integration and continuous delivery pipeline, with log management, for serverless applications.
DevOps for Serverless Applications is for DevOps engineers, architects, or anyone interested in understanding the DevOps ideology in the serverless world. You will learn how to use DevOps with serverless and apply continuous integration, continuous delivery, testing, logging, and monitoring with serverless development.
Chapter 1, Introducing Serverless, describes what serverless is in simple terms, covering its benefits and drawbacks. It talks about the different serverless service providers and introduces the serverless services they offer.
Chapter 2, Understanding Serverless Frameworks, talks about different deployment frameworks for serverless and also dives into the Serverless Framework, which is used in most of the chapters in the book.
Chapter 3, Applying DevOps to AWS Lambda Applications, deep dives into AWS Lambda with respect to DevOps. It walks through multiple hands-on tutorials on the continuous integration and continuous deployment pipeline with the Serverless Framework and Jenkins, as well as monitoring and logging. It also covers how to set up canary and blue/green deployments for AWS Lambda.
Chapter 4, DevOps with Azure Functions, talks about Azure Functions. It starts with how to create and deploy Azure Functions, then teaches how to set up a continuous integration and continuous deployment pipeline through Jenkins. It also covers monitoring and logging and discusses best practices for DevOps with Azure Functions.
Chapter 5, Integrating DevOps with IBM OpenWhisk, introduces OpenWhisk and covers setting up a deployment pipeline with the Serverless Framework and Jenkins. Monitoring and the dynamic dashboard of OpenWhisk on IBM Cloud are also covered.
Chapter 6, DevOps with Google Functions, starts with an introduction to Google Functions. It walks through multiple tutorials, from creating a function to setting up an automated deployment pipeline. It also covers monitoring and logging with Google Stackdriver.
Chapter 7, Adding DevOps Flavor to Kubeless, explains that Kubeless is an open source serverless framework that sits on top of Kubernetes. This chapter introduces Kubeless and explains how to set up continuous integration and continuous deployment for Kubeless.
Chapter 8, Best Practices and the Future of DevOps with Serverless, discusses the performance issues we all run into while developing applications and the best practices that come out of dealing with them. This chapter outlines best practices for serverless and DevOps so that we can develop and deploy with ease.
Chapter 9, Use Cases and Add-Ons, covers some popular use cases for serverless, along with essential tidbits that can improve development and deployment.
Chapter 10, DevOps Trends with Serverless Functions, covers how serverless will shape DevOps and how DevOps has to change track with the adoption of serverless.
The main focus of this book is to shed light on serverless and the different serverless service providers. It also teaches you how to set up DevOps for serverless functions in an automated way.
You should have an understanding of DevOps, continuous integration, and continuous deployment, along with some knowledge of popular DevOps tools such as Jenkins and the ELK stack (Elasticsearch, Logstash, and Kibana). Familiarity with different cloud service providers, such as AWS, Azure, and Google, is an added advantage.
Most of my tutorials were built on a MacBook with Docker containers, so I would recommend using some form of Linux for the tutorials.
You can download the example code files for this book from your account at www.packt.com. If you purchased this book elsewhere, you can visit http://www.packt.com/support and register to have the files emailed directly to you.
You can download the code files by following these steps:
1. Log in or register at www.packt.com.
2. Select the SUPPORT tab.
3. Click on Code Downloads & Errata.
4. Enter the name of the book in the Search box and follow the onscreen instructions.
Once the file is downloaded, please make sure that you unzip or extract the folder using the latest version of:
WinRAR/7-Zip for Windows
Zipeg/iZip/UnRarX for Mac
7-Zip/PeaZip for Linux
The code bundle for the book is also hosted on GitHub at https://github.com/PacktPublishing/DevOps-for-Serverless-Applications. In case there's an update to the code, it will be updated in the existing GitHub repository.
We also have other code bundles from our rich catalog of books and videos available at https://github.com/PacktPublishing/. Check them out!
We also provide a PDF file that has color images of the screenshots/diagrams used in this book. You can download it here: https://www.packtpub.com/sites/default/files/downloads/9781788623445_ColorImages.pdf.
There are a number of text conventions used throughout this book.
CodeInText: Indicates code words in text, database table names, folder names, filenames, file extensions, pathnames, dummy URLs, user input, and Twitter handles. Here is an example: "In the statement, we pass the parameter to the CLI and the stage value is populated in the serverless.yml file."
A block of code is set as follows:
# serverless.yml
service: myService
provider:
  name: aws
  runtime: nodejs6.10
  memorySize: 512 # will be inherited by all functions
Any command-line input or output is written as follows:
$ pip install zappa
Bold: Indicates a new term, an important word, or words that you see onscreen. For example, words in menus or dialog boxes appear in the text like this. Here is an example: "Click on Users on the left-hand sidebar, then click on the Add User button and add the username adm-serverless."
Feedback from our readers is always welcome.
General feedback: Email [email protected] and mention the book title in the subject of your message. If you have questions about any aspect of this book, please email us at [email protected].
Errata: Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you have found a mistake in this book, we would be grateful if you would report it to us. Please visit www.packt.com/submit-errata, select your book, click on the Errata Submission Form link, and enter the details.
Piracy: If you come across any illegal copies of our works in any form on the Internet, we would be grateful if you would provide us with the location address or website name. Please contact us at [email protected] with a link to the material.
If you are interested in becoming an author: If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, please visit authors.packtpub.com.
Please leave a review. Once you have read and used this book, why not leave a review on the site that you purchased it from? Potential readers can then see and use your unbiased opinion to make purchase decisions, we at Packt can understand what you think about our products, and our authors can see your feedback on their book. Thank you!
For more information about Packt, please visit packt.com.
This book will introduce us to the world of the serverless approach to information technology, looking at multiple cloud services and platforms, such as AWS, Azure, Google, OpenWhisk, and a few others. We will look in detail at each cloud service, as well as the different methods that are used to apply DevOps to them. We will look at the different use cases and learn the best practices for each of them.
The following topics will be covered in this introductory chapter:
Introduction to serverless
Core concept
Backend as a Service (BaaS)
Function as a Service (FaaS)
AWS Lambda
Azure Functions
Google Functions
OpenWhisk
Pros and cons of serverless
DevOps with serverless
When we hear the word serverless, the first thing that comes to mind is: oh, my code will magically run without any server! In a way, this is right: the serverless approach is a process where we deploy code into a cloud and it is executed automatically, without worrying about the underlying infrastructure, renting or buying servers, scaling, monitoring, or capacity planning. The service provider takes care of all of these things. Also, you won't believe how cheap it is and how much easier it is to manage. Now, you are probably thinking: how is that possible? To look at its workings in more detail, let's compare the serverless approach with something we do in our daily lives.
The serverless approach is a bit like dealing with our laundry. We all need to wash our clothes, and for this, we need to buy a washing machine. But this washing machine will be used for about 10 to 15 hours per week, and the rest of the time it will sit idle. Interestingly, we buy servers to host our applications in the same way, and most of the time those servers sit idle, waiting for requests. We have piles of servers that are hardly managed or decommissioned, and because they are not properly used or managed, resources such as the power supply, capacity, storage, and memory are wasted.
Also, while doing the laundry, the washing machine will allow only a certain load and volume. The same applies to servers: They too allow only a certain volume and load. The more the load or traffic, the slower the processing will be, or it may stop completely. Now, to take care of our extra load, we might decide to buy a bigger washing machine, which will allow a bigger volume of laundry and support a larger load. But again, this high-end machine will take the same resources if we have to wash a huge pile of clothes or just one piece of clothing, which is wasteful. The same is the case in our server analogy. When catering for higher traffic or requests, we could buy a high-end server. But we will end up using the same resources for 10 requests a day as we would for 10,000 requests, even with a high-end server.
Also, to use the washing machine, we have to separate our clothes before washing, select the program, and add the detergent and softener, and if these elements are not handled properly, we might ruin our clothes. Similarly, when using a server, we have to make sure we install the right software—as well as the right version of the software—make sure that it is secure enough, and always monitor whether the services are running.
Also, if you are renting an apartment, you might not have a washing machine, or perhaps you find launderettes cheaper and less worrisome when you wash your laundry in bulk. So launderettes, or coin-operated laundry machines, can be rented whenever you need to wash your clothes. Likewise, many companies, such as AWS, Azure, and Google, started out by renting out their servers. So we too can rent a server, and the provider will take care of the storage, memory, power, and basic setup.
Say that we've decided that a coin-operated washing machine at the local launderette is our best option. Now we just put the coin in and wash our clothes, but we still need to make sure we add detergent and fabric softener, and set the right program, otherwise we will end up ruining our clothes. Likewise, when we rent a server on the cloud, we might not bother dealing with the power, storage, and memory. But we still need to install the required software, monitor the application service, and upgrade the software version from time to time, as well as monitor the performance of the application.
Say that I found a new launderette, one that has a delivery service and charges me per item of clothing, so I can send clothes in bulk or one piece at a time. They will wash and iron my clothes too. Now, I don't need to worry about which detergent or fabric softener to use, or what washing program to use, and I also don't need to own an iron. But in the world of information technology, companies are still using the rental coin-laundry model. They still lease servers and manage them through Platform as a Service (PaaS), still manage application downtime, upgrade software versions, and monitor services.
But this can all be changed by adopting a serverless approach. Serverless computing will automatically provision and configure the server and then execute the code. As the traffic rises, it will scale automatically, apply the required resources, and scale down once the traffic eases down.
In earlier days, the term serverless referred to an application that depended on third-party applications or services to manage server-side logic. Examples were cloud-based databases, such as Google Firebase, and authentication services, such as Auth0 or AWS Cognito, which were referred to as Backend as a Service (BaaS). But serverless also means code that is developed to be event-triggered and that executes in stateless compute containers. This architecture is popularly known as Function as a Service (FaaS). Let's look at each type of service in a bit more detail.
BaaS was conceptualized by Auth0 and Google Firebase. Auth0 started as authentication as a service but later moved toward FaaS. So basically, BaaS is a third-party service through which we can implement required functionality, and it provides the server-side logic for the application.
The common approach used to be that web and mobile application developers coded their own authentication functionality, such as login, registration, and password management, and each of these services had its own API, which had to be incorporated into the application. This was complicated and time consuming for developers. BaaS providers made it easy by offering a unified API and SDK and bridging them with the frontend of the application, so that developers did not have to build their own backend for each of these services. In this way, time and money were saved.
Say, for example, that we want to build a portal that would require authentication to consume our services. We would need login, signup, and authentication systems in place, and we would also need to make it easy for the consumer to sign in with just a click of a button using their existing Google or Facebook or Twitter account. Developing these functionalities individually requires lots of time and effort.
But by using BaaS, we can easily integrate our portal to sign up and authenticate using a Google, Facebook, or Twitter account. Another BaaS service is Firebase, provided by Google. Firebase is a database service that is used by mobile apps, where database administration overhead is mitigated, and it provides authorization for different types of users. In a nutshell, this is how BaaS works. Let's look at the FaaS side of the serverless approach.
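Before doing so, here is a minimal, purely illustrative sketch of the BaaS-style sign-in described above, using the Firebase JavaScript SDK in the browser (v8-style API). The configuration values are placeholders, and none of this code comes from the tutorials later in this book:

// Illustrative browser-side sketch only; assumes the Firebase JS SDK (v8-style API)
// has been loaded via a <script> tag. All configuration values are placeholders.
firebase.initializeApp({
  apiKey: 'YOUR_API_KEY',                 // placeholder
  authDomain: 'your-app.firebaseapp.com', // placeholder
  projectId: 'your-app'                   // placeholder
});

// Sign the user in with their existing Google account. The user store, token
// handling, and session management are all handled by the BaaS provider.
const provider = new firebase.auth.GoogleAuthProvider();
firebase.auth().signInWithPopup(provider)
  .then(result => console.log('Signed in as', result.user.email))
  .catch(err => console.error('Sign-in failed', err));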
As mentioned at the start of the chapter, FaaS is essentially a small program or function that performs a small task triggered by an event, unlike a monolithic app, which does lots of things. So, in a FaaS architecture, we break our app into small, self-contained programs or functions instead of the monolithic app that runs on PaaS and performs multiple functions. For instance, each endpoint in the API could be a separate function, and we can run these functions on demand rather than running the app full time.
The common approach would be to have an API coded in a multi-layer architecture, something like a three-tier architecture where code is broken down into a presentation, business, and data layer. All the routes would trigger the same handler functions in the business layer, and data would be processed and sent to the data layer, which would be a database or file. The following diagram shows this three-tier architecture:
That might work fine for small numbers of simultaneous users, but how would we manage it when traffic grows exponentially? The application would suddenly become a computing nightmare. To resolve this problem, ideally, we would separate out the data layer, which contains the database, onto its own server. But the problem is still not solved, because the API routes and business logic remain within one application, so scaling is still a problem.
A serverless approach to the same problem is painless. Instead of having one server for application API endpoints and business logic, each part of the application is broken down into independent, auto-scalable functions. The developer writes a function, and the serverless provider wraps the function into a container that can be monitored, cloned, and distributed on any number of servers, as shown in the following diagram:
The benefit of breaking down an application into functions is that we can scale and deploy each function separately. For instance, if one endpoint in our API receives 90 percent of our traffic, or our image-processing code is eating up most of the computing time, that one function or bit of code can be distributed and scaled far more easily than the entire application.
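As a minimal, hypothetical sketch of what one of these independently deployable functions might look like, here is a Node.js handler in the AWS Lambda style, behind an API Gateway proxy integration and assuming a Node.js 8.10+ runtime for async/await. The function name and response shape are illustrative and are not taken from this book's examples:

// getUser.js - a hypothetical function that serves a single API route (GET /users/{id}).
// It runs only when that route is invoked and can be deployed and scaled on its own.
exports.handler = async (event) => {
  // API Gateway proxy events expose path parameters on event.pathParameters.
  const id = event.pathParameters && event.pathParameters.id;
  if (!id) {
    return { statusCode: 400, body: JSON.stringify({ error: 'id is required' }) };
  }
  // A real service would call a data store here; this stub just echoes a record.
  return {
    statusCode: 200,
    body: JSON.stringify({ id, name: 'example user' })
  };
};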
In a FaaS system, the functions are expected to start within milliseconds in order to allow the handling of individual requests. In PaaS systems, by contrast, there is typically an application thread that keeps running for long periods of time, and handles multiple requests. FaaS services are charged per execution time of the function, whilst PaaS services charge per running time of the thread in which the server application is running.
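As a rough back-of-envelope comparison (all rates below are assumed purely for illustration and are not quoted from any provider's price list), pay-per-execution can be dramatically cheaper than an always-on server for light or spiky workloads:

// Illustrative arithmetic only; the rates are assumptions, not real prices.
const gbSecondRate = 0.0000167;        // assumed $ per GB-second of function execution
const invocationsPerMonth = 100000;    // hypothetical monthly traffic
const avgDurationSeconds = 0.2;        // 200 ms per invocation
const memoryGb = 0.5;                  // a 512 MB function

const faasCost = invocationsPerMonth * avgDurationSeconds * memoryGb * gbSecondRate;

const vmHourlyRate = 0.05;             // assumed $ per hour for a small always-on VM
const hoursPerMonth = 730;
const vmCost = vmHourlyRate * hoursPerMonth;

console.log(`FaaS:      ~$${faasCost.toFixed(2)} per month`); // roughly $0.17
console.log(`Always-on: ~$${vmCost.toFixed(2)} per month`);   // roughly $36.50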
In the microservices architecture, applications are loosely coupled, fine grained, and lightweight. Microservices were born to break down the monolithic application into small services so that they can be developed, managed, and scaled independently. FaaS takes that a step further by breaking things down into even smaller units called functions.
The trend is pretty clear: the unit of work is getting smaller and smaller. We're moving from monoliths to microservices, and now to functions, as shown in the following diagram:
With the rise of containers, many cloud vendors saw that a serverless functions architecture would give developers greater flexibility to build their applications without worrying about the ops (operations) side. AWS was the first to launch such a service, under the name Lambda; other cloud providers then followed the trend, such as Microsoft Azure with Azure Functions and Google Cloud with Google Functions. This popularity also gave some vendors the opportunity to build open source versions. Popular examples are IBM's OpenWhisk, which is Apache licensed, Kubeless, which is built on top of Kubernetes, and OpenFaaS, which is built on top of Docker containers. Even Oracle joined the fray with Oracle Fn. Let's briefly look at each vendor in this chapter, learning how they work. We will then travel with them over the rest of the book, looking at their approach to DevOps.
Amazon Web Services (AWS) was the first to launch a FaaS, or serverless, service, called Lambda, in 2014, and it is currently the leader in this kind of serverless provision. AWS Lambda follows the event-driven approach: when an event is triggered, Lambda executes the code and performs the required functionality, and it can automatically scale as traffic rises, as well as automatically scale back down. Lambda functions run in response to events, such as changes to the data in an Amazon S3 bucket, an Amazon DynamoDB table change, or an HTTP request through the AWS API Gateway. In this way, Lambda helps to build triggers for multiple services, such as S3, DynamoDB, and the Kinesis stream data store.
So, Lambda lets developers worry only about the coding—the computing side, such as memory, CPU, network, and storage, is taken care of by Lambda automatically. It also automatically manages the patching, logging, and monitoring of functions. Architecturally, a Lambda function is invoked in a container, which is launched based on the configuration provided. These containers might be reused for subsequent function invocations. As demand dies down, the containers are decommissioned, but this is all managed internally by Lambda, so users do not have to worry about it, as they have no control over these containers. The languages supported by AWS Lambda functions are Node.js, Java, C#, and Python.
When building serverless applications, the core components are functions and event sources. An event source is an AWS service or custom application, and the Lambda function processes its events. The maximum execution time for each Lambda function is 300 seconds.
Let's look at an example of how AWS Lambda actually works. In a photo-sharing application, people upload their photos, and these photos need to have thumbnails so that they can be displayed on the user's profile page. In this scenario, we can use the Lambda function to create the thumbnails, so that the moment the photo gets uploaded in the AWS S3 bucket, S3, which supports the events source, can publish the object-created events and invoke the Lambda function. The Lambda function code reads the latest photo object from the S3 bucket, creates a thumbnail version, and saves it in another S3 bucket.
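As a skeletal, hypothetical sketch of what such a function could look like in Node.js (assuming a Node.js 8.10+ runtime): the thumbnail bucket naming, the resizeImage helper, and the use of the aws-sdk v2 client are assumptions made for illustration and are not the code built later in this book.

// thumbnail.js - hypothetical skeleton of an S3-triggered thumbnail function.
const AWS = require('aws-sdk');   // aws-sdk v2 is preinstalled in the Lambda Node.js runtime
const s3 = new AWS.S3();

// Placeholder resize step: a real function would use an image library such as sharp.
async function resizeImage(buffer) {
  return buffer; // stub: returns the original bytes unchanged
}

exports.handler = async (event) => {
  // S3 object-created notifications arrive in this shape.
  const record = event.Records[0];
  const sourceBucket = record.s3.bucket.name;
  const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));

  // Read the newly uploaded photo.
  const original = await s3.getObject({ Bucket: sourceBucket, Key: key }).promise();

  // Create the thumbnail (stubbed above) and write it to a second bucket
  // (the destination bucket name here is assumed for illustration).
  const thumbnail = await resizeImage(original.Body);
  await s3.putObject({
    Bucket: `${sourceBucket}-thumbnails`,
    Key: key,
    Body: thumbnail,
    ContentType: original.ContentType
  }).promise();
};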
In Chapter 3, Applying DevOps to AWS Lambda Applications, we will look at how we can create, run, and deploy Lambda functions in an automated way, and we will also monitor them and perform root-cause analysis through logging.
Azure Functions is Microsoft's venture into serverless architecture; it came onto the market in March 2016. Azure Functions allows functions to be coded in C#, F#, PHP, Node.js, Python, and Java, and it also supports Bash, batch, and PowerShell files. Azure Functions has seamless integration with Visual Studio Team System (VSTS), Bitbucket, and GitHub, which makes continuous integration and continuous deployment easier. Azure Functions supports various types of event triggers: timer-based events for scheduled tasks, and OneDrive and SharePoint events, which can be configured to trigger operations in functions. Real-time processing of data and files adds the ability to operate a serverless bot that uses Cortana as the information provider. Microsoft has also introduced Logic Apps, a tool with a workflow-orchestration engine that allows less technical users to build serverless applications. Azure Functions allows triggers to be created from other Azure cloud services and from HTTP requests. The maximum execution time is five minutes per function. Azure Functions provides two types of app service plan: Dynamic and Classic. App Service
