AWS Lambda 2016 – The Complete Guide (With Hands On Labs)
AWS Lambda is changing the way we build systems in the cloud. This compute service runs your code in response to events, automatically managing the compute resources for you, which makes it dead simple to create applications that respond quickly to new information. Lambda is the backbone of serverless computing on AWS.
This comprehensive course will teach you how to write, deploy, scale, and manage Lambda functions. Armed with this knowledge, you'll be able to architect solutions built from tiny, composable microservices that scale massively and respond in near real time.
In this course we'll take you through the entire Lambda journey: from setting up your local environment and writing your first Lambda function to deploying interesting and unique Lambda functions that will help you massively scale your operations.
The hands-on labs will show you how to write Lambda functions that:
- Run when files change in S3 (e.g. image thumbnail generation, metadata extraction, indexing)
- Run when tables are updated in DynamoDB (e.g. analytics/trend detection, auditing)
- Run when Kinesis messages are received (e.g. notification generation, message filtering)
So join us and become a cloud guru today.
An Introduction To Lambda
Welcome to the course! Meet Ryan from https://serverlesscode.com/ and see what topics we'll be covering for the rest of the course on Amazon's Lambda service.
Serverless is a hot topic (and hashtag) these days. Learn about the origin of the word and what it means for an app to be serverless and to fully outsource the compute needs of your application.
Learn about the difference it makes for you and your team to build a service on Lambda instead of Amazon's Elastic Compute Cloud (EC2).
In this lecture, we'll cover a little bit about the underlying parts of the Lambda service and help you understand the programming model. Knowing this, you'll understand how timeouts affect the application you're writing and what event sources are available to you.
Walk through a simple application that uses S3 to trigger a Lambda function. This is very similar to the S3 lab later in the course.
Also learn about your language options when writing apps for Lambda.
Creating Our First Lambda Function
We're going to start applying our knowledge in Lab 0, where we won't be setting up any event sources. Instead, we'll focus on learning how to use the AWS console to create and manage Lambda functions.
To do this lab (and all subsequent ones) you'll need to go to https://aws.amazon.com/ to create an account. The labs for this course all qualify for the free tier, so you won't be charged for the resources you use. For more information on creating and activating your AWS account, follow these instructions: https://aws.amazon.com/premiumsupport/knowledge-center/create-and-activate-aws-account/
This lecture isn't required for the course, but it will make it faster to get around in the console by adding quick links to the services this course uses to the toolbar.
Finally! Learn to set up your first Lambda function, and see a hello world example in Node.js version 4.3.
Start your first Lambda function using the AWS console. We'll add in our hello world code and create security credentials so our function can run.
Learn about different memory and performance parameters on Lambda functions and get ready to take your function for a spin.
Test your function from the AWS console in the browser, then check that the output is saved (and searchable!) in AWS CloudWatch.
After this lecture, you'll have a much better understanding of what AWS CloudWatch is used for and how to take full advantage of it to manage and debug your Lambda function.
Expanding Our Knowledge with Lambda & S3
In this lab, we'll learn how to write a Node.js application that processes Comma-Separated Values (CSV) files as they are uploaded to S3.
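As a sketch of the processing step, here's a small CSV parser of the kind this lab builds (the helper name and sample data are illustrative; in the real function the file body would first be fetched from S3 with the aws-sdk client, which is omitted here so the sketch runs locally):

```javascript
'use strict';

// Split a CSV body into an array of row objects keyed by the header line.
// Illustrative helper: the lab's real function would first fetch the file
// body from S3 (using the bucket and key found in event.Records[0].s3),
// then hand the body to a parser like this one.
function parseCsv(body) {
    var lines = body.trim().split('\n');
    var headers = lines[0].split(',');
    return lines.slice(1).map(function (line) {
        var fields = line.split(',');
        var row = {};
        headers.forEach(function (header, i) { row[header] = fields[i]; });
        return row;
    });
}

// Sample data standing in for a file body fetched from S3.
var sample = 'id,amount\n1,9.99\n2,4.50';
console.log(parseCsv(sample));
```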
For this lab we'll also be making use of the AWS command-line tool to upload code and view settings on our function. The CLI is extremely powerful, and makes it possible to script the deployment of Lambda functions, making deployments repeatable.
Before we can react to S3 events, we'll need an S3 bucket. Remember that bucket names are GLOBALLY unique, so you'll need to pick a name no one else has taken, otherwise you won't be able to continue.
Learn how to make an event trigger from the AWS console, connecting events in your S3 bucket to the Lambda function we'll be making later in the lab.
Create an IAM role to allow your Lambda function to read the files it'll be reacting to. This will help teach you a bit about the IAM Policy system and how to control access to resources in the cloud.
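As a sketch, a minimal read-only policy document of the kind attached to the function's role might look like the following (the bucket name is a placeholder; substitute the globally unique name you chose, and note your role will also need CloudWatch Logs permissions, which the console can add for you):

```javascript
'use strict';

// A minimal IAM policy document allowing the function to read objects from
// one bucket. "your-csv-bucket" is a placeholder -- use your own bucket name.
var policy = {
    Version: '2012-10-17',
    Statement: [{
        Effect: 'Allow',
        Action: ['s3:GetObject'],
        Resource: 'arn:aws:s3:::your-csv-bucket/*'
    }]
};

// IAM accepts the policy as a JSON string.
console.log(JSON.stringify(policy, null, 2));
```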
Use the AWS CLI to upload the CSV processing code with the "update-function-code" command.
Using the browser testing interface, test the function against the CSV file we uploaded earlier. This will verify that you've deployed the new code properly and let you view the output from your new function. This brings back the CloudWatch tool from earlier in the course.
AWS Lambda provides a few built-in ways to manage versions of your code and to point applications at the right function. Versions and qualifiers are vital for safely deploying new code and directing events to the right version of your function.
Recall earlier when we mentioned timeouts? We're going to talk about them more now that we have some experience under our belts, and give you a way to check how much time your function has left during its execution.
Lambda functions can be called in different contexts: asynchronously by S3, or synchronously by API Gateway or the console. Function outputs let you send values back to callers that invoke your function synchronously, just like a return statement in ordinary programming.
Reacting to Event Streams with Lambda & Kinesis
Before starting on the Lambda function for this lab, learn about Kinesis, AWS's streaming service. Kinesis lets you have many writers and many readers against a stream for high-scale applications. It's commonly used for collecting analytics such as clickstream or transaction data, and we're going to use it to collect the same transaction lists as in the S3 lab, but in a more real-time fashion.
Back in the AWS console, apply some of the vocab you learned in the last lecture to navigate around the Kinesis console and create a stream for us to attach our Lambda function to.
Kinesis messages are delivered to Lambda in a base64-encoded format, so to make a test event we must encode the input before putting it into the test event in the console. Once the event is set up, we'll test the new function in the console before hooking it up to the real stream.
Finally, we're going to use the AWS CLI to put events in our Kinesis stream and check that the function executed by viewing the CloudWatch logs.
Creating Data-Driven Apps with Lambda & DynamoDB
DynamoDB is a popular managed NoSQL store that's easy to get started with. In the previous labs we've been moving closer and closer to a "realtime" way of processing data as it comes in. Now we'll take new transactions as they're written to the database, calculate the sums, and store them back into the same record.
Learn how DynamoDB and Lambda integrate to make a powerful compute+storage combo for making your database smarter and adding asynchronous processing to your datastore.
From the AWS Console, create the event trigger for DynamoDB to send updates to the Lambda function we're creating and try out the custom code from the testing interface in the console.
With the function set up, we can use the AWS Console to write data into our DynamoDB table and check the reaction of our Lambda function in CloudWatch.
To avoid future charges for AWS resources, make sure you clean up everything we made in this course. If you'd rather use a checklist, here's the list of resources you'll want to delete:
- S3 Bucket
- Kinesis Stream
- DynamoDB Table
Don't forget to delete the CloudWatch log groups, which are named after the functions. We also created IAM roles, but IAM roles and policies are 100% free, so you'll never be charged for them.
Thanks for taking the course! Reach out to @ryan_sb on Twitter to tell me what you thought.