Learn AWS Lambda

aws lambda intro image

Learn to use AWS Lambda to create scalable micro-services that take less time to build and cost far less to run than "traditional" server-based apps.



What is Lambda?

Amazon Web Services (AWS) Lambda lets you run JavaScript (Node.js), Java & Python scripts/apps in Amazon's (virtually) infinitely-scalable cloud environment without having to provision VM instances or other "orchestration"; everything is dynamically auto-scaled, so whether you have 1 user or 1 billion you only pay for usage.

Self-Disruption

AWS are effectively disrupting their (own) existing business with Lambda. Instead of forcing us to pay for EC2 instances in fixed increments and have complex application monitoring/scaling, AWS have built a much simpler way of building & running micro-services.

Lambda also disrupts other Platform-as-a-Service ("PaaS") providers such as Heroku, Google App Engine, Azure or Modulus, where you pay for a specific amount of compute power & RAM but face a scaling delay and scale in fixed increments (instances).

Lambda Features

Ephemeral Storage

No access to a filesystem or memory persistence (e.g. on-instance Redis), so you cannot store data or the result of an operation locally.

Use an AWS Datastore Service

The lack of local persistence on Lambda is resolved by low-latency access to AWS S3 and other AWS datastores, e.g. ElastiCache (in-memory cache), DynamoDB (NoSQL SSD-based database) or RDS (relational database). However, there's an important (and potentially expensive) catch: PUT/POST/GET requests to all AWS data stores are NOT free! While per-run costs on Lambda are tiny, if you GET and PUT something to S3 on each execution cycle you could rack up the bill!

How?

  • General Intro (if you're completely new, watch the video!): http://aws.amazon.com/lambda/
  • How it Works: http://docs.aws.amazon.com/lambda/latest/dg/lambda-introduction.html
  • Getting Started Guide (Overview): http://docs.aws.amazon.com/lambda/latest/dg/welcome.html

Create and Test Your Own AWS Lambda Function

'HELLO WORLD!' Example (inline)

Here's a super simple walkthrough of a 'HELLO WORLD!' example to help get you started with AWS Lambda:

  1. If you haven't already done so, create a free AWS account here.

  2. Sign in to the AWS management console, select your region in the top right hand corner and then open the AWS Lambda console.

  3. Choose 'Get Started Now' and then select the 'hello-world' blueprint from the list of option tiles.

  4. On the 'Configure Function' page, edit the existing inline code to create your function. AWS Lambda expects us to export an object which has a property called handler. Here's our example:

Configure Function
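
Here's a minimal sketch of what that inline code looks like (following the 'hello-world' blueprint; the exact keys are illustrative):

exports.handler = function(event, context) {
    console.log('value1 =', event.key1); // log each key from the test event
    console.log('value2 =', event.key2);
    console.log('value3 =', event.key3);
    context.succeed(event.key1); // succeed with the first key as the result
};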

The value of that property is a function that takes two arguments, event and context. The event will be created by us, and the context consists of the runtime information which will be supplied by AWS Lambda. They both take the form of JSON objects.

  5. Beneath the function you then have to specify the handler and the role you wish to give it. (A role is an AWS identity with permission policies that determine what the identity can or cannot do in AWS. For more information on roles click here). We chose the 'lambda_basic_execution' role because our function is extremely simple:

Handler and Roles

In the 'Advanced Settings' section you can specify the amount of memory that each AWS Lambda instance should be allocated. Note: increasing the memory also increases the cost of your function runtime!

  6. Click 'next' to review your code and then if you are happy click 'create function'. You'll then be taken to your AWS Lambda functions where you should see the one you just created.

Test

  1. We can then test our function by clicking on the blue 'Test' button in the top left. Here we will be able to specify the payload of the event that gets passed to our function. There will be an existing event template but we changed the key value pairs as shown below:

event object
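
An event of that shape, with illustrative values, looks like this:

{
  "key1": "HELLO",
  "key2": "WORLD!",
  "key3": "example"
}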

Click the 'Save and test' button to test your function.

  2. Below your function you should now be able to see the results from your test. Here are the results from our example:

test results

You can reconfigure your test at any time. Click on the 'Actions' dropdown beside the 'Test' button and select the 'Configure test event' option.

'HELLO WORLD!' Example (.ZIP)

An alternative (and perhaps more commonly used) way of creating an AWS Lambda function is to write it in a text editor, zip it up and then upload it. Here's how:

  1. Follow the first two steps of the previous example if you haven't already created an AWS account.

  2. On the 'Select blueprint' page hit the 'skip' button because we'll be creating our own.

  3. On the 'Configure Function' page give your Lambda function a name and then select the 'Upload a .ZIP file' radio button. It will then prompt you to upload a file.

zip file upload

  4. Open up your text editor and write your Lambda function just as you would if you were writing it inline and then save it as a .js file:

function code

  5. Zip up this file by typing the following into the command line. The first filename is the zip file you want to create (call it whatever you like, ending in .zip), followed by the files you want to zip up. In our example you can see the name of the .js file we created earlier:

$ zip -r hello-world.zip hello-world.js

You should now be able to see a `.ZIP` file alongside your `.js` file.
**NOTE: If your function has any dependencies then you must include your `node_modules` folder within your .ZIP file. Simply add `node_modules` after the files you wish to zip up!**
  6. Go back to the 'Configure Function' page and click the 'Upload' button and select the .ZIP file you just created.

  7. Next select the Lambda function handler and role. The handler is the name of the .js file that contains your function followed by the name of the handler you are exporting. We've selected the basic execution role just like the previous example:

handler and role

  8. Just like the previous example, click 'next' and if you are happy with your function click 'Create function'. Your new function should now show up in your list of AWS Lambda functions:

function list

This function can now be tested in the same way as the inline example.

'HELLO WORLD!' Example (API Gateway)

Another really cool thing about AWS Lambda is that you can invoke a Lambda function through a web endpoint i.e. they can be triggered via HTTP calls. You can configure these endpoints straight from the AWS Lambda console:

  1. Open your AWS Lambda console and click on the function that you wish to create an endpoint for. (If you haven't created a Lambda function already you can do so by following one of the previous examples!)

  2. On this page select the 'API endpoints' tab and then click '+ Add API endpoint':

add api endpoint

  3. Configure your API endpoint settings:
  • API endpoint type : API Gateway

  • API name : whatever-you-like (we recommend having all lower case letters separated by a dash for readability)

  • Resource name: /YourLambdaFunctionName

  • Method : GET/POST/PUT/DELETE...

  • Deployment stage : Defines the *path through which an API deployment is accessible

  • Security : Defines how your function can be invoked

    *The path will be a URI ending in >> .../deploymentStage/ResourceName

api endpoint settings

We've set our 'Security' to be 'Open with access key' because this is an example of a more common use case. These keys can be configured by following the link in the blue box. Our example URI will end in >> .../prod/Concatenate

Click 'Submit' once you are happy with your Lambda function configuration.

  4. You should now be able to see your function listed at a URL that has been generated by AWS Lambda.

api url

  5. Now let's test our endpoint to see if our Lambda function is being invoked. Go to the AWS console and then click on 'API Gateway' under Application Services. This will take you to a list of your APIs where you should see the one we just created. Click on the title.

api list

  6. Click on the METHOD beneath your Lambda function, in our case it's 'POST'. Then click on the word 'TEST' with the lightning bolt underneath it:

POST method

  7. Enter the input values that your API will be expecting (this is the event object we have been using to test our functions previously) then click the blue 'Test' button on the right. Your response body should then return the results of your Lambda function:

test api

How to Access the Lambda Function via API Gateway

By default, access to your API endpoint (and therefore the Lambda function) is set to 'Private'. This means that if you attempt to visit the API Gateway endpoint you created in the previous section, it will not be accessible.

  1. If you aren't already viewing the API Gateway, select it from your AWS Console Menu: aws01-aws-dashboard-select-api-gateway

  2. Create an API Key in the Amazon API Gateway section of the AWS Console: aws02-api-key-create

  3. Create a New API Key: aws03-api-key-create0ew

  4. Name your key, Enable it and click Save button: aws03-api-key-create-new-specify

  5. Once you enable your API Key, a section will appear below the creation form that allows you to assign the new API Key to one of your APIs "Stage". Select the API & Stage (in our case the API is LambdaMicroservice and the stage is prod) then click the Add button: aws04-api-key-create-assign-to-stage You should now see that the API Key is Enabled for your prod stage: aws05-api-key-associated

  6. Copy the API key from this screen and save it to your notepad. aws05-copy-the-api-key

  7. Return to your AWS Console and select Lambda. This will display the list of your Lambda functions. Select the Concatenate Lambda function you created earlier. aws06-list-of-lambda-functions

  8. When you are viewing your Lambda Function, select the API Endpoints tab and copy the API endpoint URL: aws07-view-api-endpoints-and-copy-the-link

  9. With the endpoint URL and API Key copied you can now run a cURL Command in your terminal to access the endpoint:

curl --header "x-api-key: LhGU6jr5C19QrT8yexCNoaBYeYHy9iwa5ugZlRzm" https://r09u5uw11g.execute-api.eu-west-1.amazonaws.com/prod/Concatenate

aws-lambda-curl-with-api-key-works

Note: I slightly modified my Lambda function to return a timestamp so I know when the function gets executed:

exports.handler = function(event, context) {
    console.log('Received event:', JSON.stringify(event, null, 2));
    console.log('context:', JSON.stringify(context, null, 2));
    event.key1 = event.key1 || 'Hello'; // set default values
    event.key2 = event.key2 || 'World!';
    console.log('value1 =', event.key1);
    console.log('value2 =', event.key2);
    var date = new Date();
    var time = date.toString();
    context.succeed(event.key1 + ' ' + event.key2 + ' >> ' + time );
};

For even more steps on enabling API Keys on AWS API Gateway, see: http://docs.aws.amazon.com/apigateway/latest/developerguide/how-to-api-keys.html

Create an API with GET/POST Methods that uses Lambda functions to retrieve/update records from a DynamoDB table

  1. First we'll need to create a table in DynamoDB. Go to the DynamoDB console and then click the 'Create Table' button. Give your table a name (call it something relevant to the type of data your DynamoDB table will hold). We've called ours 'Users'. The 'Primary key' is made up of a 'Partition key' (hash key) and an optional 'Sort key'. (The partition key is used to partition data across hosts for scalability and availability):

create table

table name

For 'Table settings' just check the 'Use default settings' checkbox and then click the blue 'Create' button:

table setup

  2. Once the table is created, click on the 'Alarms' tab and then delete the basic alarms if they have been created:

alarms

Then click on the 'Capacity' tab and then specify the 'Read' and 'Write' capacity units as 3 each and then click 'Save':

capacity

  3. Next we will have to create a policy that allows your Lambda functions to access CloudWatch logs as well as the table you just created. Go to the IAM console, select 'Roles' and then 'Create new role'. We've called ours 'APIGatewayLambdaExecRole':

create role

Select the 'AWS Lambda' role:

lambda role

And then click 'Next step' to skip the 'Attach Policy' section:

skip attach policy

In the 'Review' section click the blue 'Create Role' button to finish:

review role

Click on the title of the role you just created then click the down arrow for 'Inline Policies'. Follow the link to create an inline policy:

inline policies

Click on the 'Custom Policy' radio button and then click 'Select':

custom policy

Give your custom policy a name (we've called ours 'LogAndDynamoDBAccess') and then enter the following in the 'Policy Document' section. Make sure your "Resource" at the bottom is set to the ARN of your table and the second "SID" is set to "_YourTableName_DynamoDBReadWrite". (the ARN can be found in your 'Table details' by going to your DynamoDB console and clicking on your table.):

{
  "Version": "2012-10-17",
  "Statement": [
      {
          "Sid": "AccessCloudwatchLogs",
          "Action": [
              "logs:*"
          ],
          "Effect": "Allow",
          "Resource": "arn:aws:logs:*:*:*"
      },
      {
          "Sid": "UsersDynamoDBReadWrite",
          "Effect": "Allow",
          "Action": [
              "dynamodb:DeleteItem",
              "dynamodb:GetItem",
              "dynamodb:PutItem",
              "dynamodb:UpdateItem"
          ],
          "Resource": [
              "arn:aws:dynamodb:eu-west-1:655240720487:table/Users"
          ]
      }
  ]
}
  4. Now we need to create the Lambda functions for adding and retrieving data to and from the table (we'll be creating our functions in a text editor, zipping them up and then uploading them to Lambda. Follow the instructions in the previous 'HELLO WORLD!' .zip example on how to do this):

Create a new .js file that will contain our first Lambda function. This function will GET information from the DynamoDB table. We've called the file getUserInfo.js. Here is the code:

var AWS = require('aws-sdk');
var DOC = require('dynamodb-doc');
var dynamo = new DOC.DynamoDB();

exports.handler = function(event, context) {
  var callback = function(err, data) {
    if (err) {
      console.log('error on getUserInfo: ', err);
      context.done('Unable to retrieve user information', null);
    } else {
      if(data.Item && data.Item.users) {
        context.done(null, data.Item.users);
      } else {
        context.done(null, {});
      }
    }
  };

  dynamo.getItem({TableName:"Users", Key:{username:"default"}}, callback);
};

Zip up the file and then upload it to Lambda:

zip -r getUserInfo.zip getUserInfo.js

getuserinfo

For the Role, select the one we created earlier. Then click 'Next' and then 'Create function':

role

Click 'Test' to test the function. The results should return an empty object {}.

Create a second .js file that will contain our second Lambda function. This function will UPDATE information in our DynamoDB table. We've called the file updateUserInfo.js. Here is the code:

var AWS = require('aws-sdk');
var DOC = require('dynamodb-doc');
var dynamo = new DOC.DynamoDB();

exports.handler = function(event, context) {
  var item = { username:"default",
               users: event.users || {}
          };

  var callback = function(err, data) {
    if (err) {
      console.log(err);
      context.fail('unable to update users at this time');
    } else {
      console.log(data);
      context.done(null, data);
    }
  };

  dynamo.putItem({TableName:"Users", Item:item}, callback);
};

Again zip up the file and then upload it to Lambda: zip -r updateUserInfo.zip updateUserInfo.js

Follow the same steps as the previous function to create the second one, giving it the same role. They should both now appear in your functions section:

functions

Test the function with a sample event relevant to your data. We created the following sample event:

{
  "users": [
            {
              "id": 1,
              "name": "John Smith",
              "location": "London"

            }
           ]
}

You should see an empty object just like the first function {}. Go back to the GetUserInfo function and then click 'Test' again. You should now see a returned result with the object in your sample event like this:

[
  {
    "id": 1,
    "location": "London",
    "name": "John Smith"
  }
]
  5. We're going to have to create one more Lambda function. It essentially does nothing, but it is required by the OPTIONS method for CORS (Cross-Origin Resource Sharing, a mechanism that allows restricted resources on a web page to be requested from another domain). The function is as follows:
  exports.handler = function(event, context) {
    context.succeed('');
  }

Upload it just like the previous Lambda functions:

noop

  6. Next go to the Amazon API Gateway console and create a new API by clicking 'Create API'. Give it a name; we've called our API 'SecureUsers':

api gateway

The Callback Parameter

It used to be the case that in order to terminate a Lambda function you had to use context.succeed, context.fail or context.done. Now that AWS Lambda supports Node v4.3, we are able to make use of the callback parameter, which allows us to explicitly return information back to the caller.

The callback takes two parameters and has the following form: callback(Error error, Object result). Let's walk through a quick example of how to implement the callback.

Let's write a simple Lambda function that returns some text after it's been invoked through an SNS topic:

exports.handler = function (event, context, callback) {
  const message = JSON.parse(event.Records[0].Sns.Message);
  const text = message.content.text;
  // checking that the text exists
  if (text && text.length) {
    return callback(null, `Here is some text: ${text}`);
  } else {
    // if no text was found return this message
    return callback(null, 'No text was found');
  }
}
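
A sample SNS-style event for testing this handler might look like this (note the Message is itself a JSON string, matching the JSON.parse above; the text value is illustrative):

{
  "Records": [
    {
      "Sns": {
        "Message": "{\"content\":{\"text\":\"Hello!\"}}"
      }
    }
  ]
}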

Triggering a Lambda function using an event from DynamoDB

Lambda functions can be set up to be triggered by events from other AWS services, like DynamoDB tables. This can be used to build applications that react to data modifications.

Create a DynamoDB table with a stream enabled

  1. In your AWS Console click on the DynamoDB tab. Then click the blue 'Create Table' button.

Create table

  2. Set the 'Table Name' field to be 'LambdaTest' and in the 'Primary Key' field, set 'Partition Key' to be 'Id' of type 'String'. Then click the blue 'Create' button. You will then be directed to the DynamoDB dashboard.

  3. Click the 'Manage Stream' button and in the pop up window, select the 'New and Old images' option.

Manage Streams

Manage Streams Options

You now have a DynamoDB table with streams enabled.

Create a Lambda function that will be triggered by changes to the DynamoDB table.

  1. Select the 'Tables' option in the navigation pane and select the table you just created.

select table

  2. Click 'Create New Trigger' > 'New Function'. This opens the Lambda Console.

Triggers

  3. The 'Event Source Type' and 'DynamoDB table' fields should already have been filled in. Click 'Next'.

New trigger name

  4. In the 'Configure Function' section, give your Lambda function a name in the 'Name' field, e.g. 'DynamoDBLambda'. The 'Runtime' should be set to 'Node.js' and the 'Description' should already have been filled in.

There will already be some default code in the 'Lambda Function Code' section. You can leave the code as it is. It is just logging the data from each data row in the event from DynamoDB along with the action e.g. 'INSERT', 'MODIFY'. We will see the output of these logs in a later step.

console.log('Loading function');

exports.handler = function(event, context) {
    //console.log('Received event:', JSON.stringify(event, null, 2));
    event.Records.forEach(function(record) {
        console.log(record.eventID);
        console.log(record.eventName);
        console.log('DynamoDB Record: %j', record.dynamodb);
    });
    context.succeed("Successfully processed " + event.Records.length + " records.");
};
  5. In the 'Lambda function handler and role' section, select the 'DynamoDB event stream role' option. This will open a new window to create an Identity and Access Management (IAM) role. Click the blue 'Allow' button to enable the creation of the role. This is necessary to give DynamoDB permission to invoke your Lambda function.

role

role name

  6. Then click 'Next'

  7. On the final Review page, in the 'Event Sources' section choose the 'Enable now' option. Then click 'Create Function'

enable now option

Create Data in the DynamoDB table.

  1. Go back to the DynamoDB dashboard and select the 'Tables' tab > 'LambdaTest'. Click 'Create Item'. This will open a pop up window. Enter an 'Id' for your data point. This can be any string you want. Then click 'Save'.

create data button

create data save

  2. Add in some more items and perform some actions to edit/delete the entries in the table, e.g. add attributes, delete items. This can be done by selecting the entry and then clicking the 'Actions' dropdown menu. Each of these actions will be logged by our Lambda function and will be visible in the CloudWatch logs.

edit data

edit data 2

View the output of the Lambda function in response to changes to the DynamoDB table

  1. Back in the AWS dashboard open the Lambda console and select the function that you just created.

  2. Select the 'Monitoring' tab and then the 'View Logs in CloudWatch' option. Select one of the log streams. You should see the console.log output from the lambda function capturing the create, edit and delete operations you performed on data entries in the DynamoDB table.

view output

view output 2

You can now modify the lambda function to perform different operations with the event data from DynamoDB.

Trigger a Lambda function using the Simple Notification Service (SNS)

Amazon SNS is a publish/subscribe system. You can create, subscribe and publish to 'Topics', which are the AWS term for a messaging channel. Lambda functions can be subscribed to topics, so when a message is published to that topic the Lambda function will be invoked with the payload of the published message as an input parameter. The Lambda function can then do any number of things with the information in the message, including publishing further messages to the same topic or other topics.

Create a topic

  1. In the AWS SNS console click the 'Create Topic' button.

create topic

  2. In the pop up that opens, add the name of your topic, e.g. 'LambdaTest', and then click the blue 'Create Topic' button. You should see a message that says 'Successfully created topic'.

create topic pop up

Create a Lambda Function and Subscribe to a topic

  1. Follow the instructions in this previous section to create a Lambda function. In Step 3 of the process select the 'sns-message' blueprint. This function will simply log the message pushed to the SNS topic.

  2. Under 'Configure Event Sources' you can select the SNS topic the function should subscribe to. Select the one we just created: 'LambdaTest'.

configure sources

  3. Give your function a name, e.g. 'LambdaSNSTest'. There will already be default code in the Lambda Function Code section to console.log the message:
console.log('Loading function');

exports.handler = function(event, context) {
    //console.log('Received event:', JSON.stringify(event, null, 2));
    var message = event.Records[0].Sns.Message;
    console.log('From SNS:', message);
    context.succeed(message);
};
  4. In the Execution Role section select 'basic execution role'. In the pop up window, enable the creation of a lambda_basic_execution role and click 'Allow'.

  5. On the final Review page, in the 'Event Sources' section choose the 'Enable now' option. Then click 'Create Function'. You should be redirected back to the Lambda Console with a confirmation message: 'Congratulations! Your Lambda function "LambdaSNSTest" has been successfully created and configured with SNS: LambdaTest as an event source.'

lambda function created

Publish a message to a topic

  1. Open the SNS console and select the 'Topics' tab in the left hand menu. Select the 'LambdaTest' topic created in an earlier step. Then click the blue 'Publish to Topic' button.

Publish to topic

  2. This opens the message editor. The topic ARN is the 'Amazon Resource Name' for the topic. ARNs are used to specify a resource unambiguously across all of AWS. We don't need to worry about them for this example! Give your message a subject and add some text to the message body. Leave the 'Time to Live' field blank and click 'Publish Message' in the bottom right hand corner of the screen. You should be redirected back to the SNS console.

Publish message

NB: Using the JSON Message Generator option it is possible to format messages differently for different viewing platforms. Find out more in the AWS SNS docs.

Viewing the output of the Lambda function

  1. Open up the Cloudwatch logs. Select the 'Logs' tab in the left hand menu.

Logs tab

  2. Click on the LambdaSNSTest option and click on the first Log Stream. It will take you to the log output from the SNS message that we published!

Log stream

Log stream output

Testing Lambda Functions

Unit Testing

  1. Using Lambda to test Lambda!

This method uses Lambda itself as the test platform. This involves creating a "unit" test that calls the Lambda function being tested and then either summarizes whether it succeeded or failed and/or records its output in DynamoDB. AWS Lambda has a 'unit and load test harness' blueprint that you can use to test another Lambda function when it is live on AWS. The harness has two modes, 'Unit' and 'Load', so simple scale testing can also be performed.

More information and an example can be found here

  2. Generating mock events and testing locally using a Node.js assertion library

The event and context objects can be mocked so that the Lambda function can be tested locally before deployment. Using the 'Test' function in the AWS Lambda console it is possible to view the format of different event objects, e.g. DynamoDB events or SNS notifications.

Have a look at mock-events.js to see some examples. These can be used to create helper functions to generate mock events.
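
For example, a small (hypothetical) helper for SNS-style events, based on the event shape used in the SNS example earlier in this guide:

// hypothetical helper: builds a minimal mock event of the shape
// the sns-message handler reads (event.Records[0].Sns.Message)
function mockSnsEvent (message) {
  return {
    Records: [
      { Sns: { Message: message } }
    ]
  };
}

// usage: invoke a handler locally (a mocked context is still required)
var mockEvent = mockSnsEvent('hello from a test');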

The context object has the following form:

{
  //methods
  succeed,
  done,
  fail,
  getRemainingTimeInMillis,

  //properties
  functionName,
  functionVersion,
  invokedFunctionArn,
  memoryLimitInMB,
  awsRequestId,
  logGroupName,
  logStreamName,
  identity: {
    cognito_identity_id,
    cognito_identity_pool_id
  },
  clientContext: {
    client: {
      installation_id,
      app_title,
      app_version_name,
      app_version_code,
      app_package_name,
      Custom,
    },
    env: {
      platform_version,
      platform,
      make,
      model,
      locale,
    }
  }
}

It is slightly harder to mock because the methods (succeed, done, fail) are asynchronous and also have to be mocked, but this has been done in an npm module using promises.

It doesn't yet account for different invocation types, i.e. Event or Request/Response. From the AWS docs about the context.succeed function:

If the Lambda function is invoked using the Event invocation type (asynchronous invocation), the method will return "HTTP status 202, request accepted" response. If the Lambda function is invoked using the RequestResponse invocation type (synchronous invocation), the method will return HTTP status 200 (OK) and set the response body to the string representation of the result.

The following is an example lambda function and associated test using the 'mock-context-object' module and the 'tape' assertion library.

// very simple lambda function
exports.handler = function(event, context) {
    context.succeed(event.key1);  // SUCCESS with message
};
// test set up and simple test
var context = require('aws-lambda-mock-context');
var test = require('tape');

var lambdaToTest = require('../functions/lambdaTest.js');

// creating context object
var ctx = context();
// test event object
var testEvent = {
  key1: 'name'
}

var response = null;
var error = null;

test("Capture response", t => {
  lambdaToTest.handler(testEvent, ctx);
  //capture the response or errors
  ctx.Promise
    .then(resp => {
      response = resp;
      t.end();
    })
    .catch(err => {
      error = err;
      t.end();
    })
})

test("Check response", t => {
  t.equals(response, 'name');
  t.end();
})

More info on testing lambda functions locally can be found here and an example of testing by mocking the context object can be found here.

  3. Using the grunt-aws-lambda plugin

This plugin for Grunt has helpers for running Lambda functions locally as well as for packaging and deployment of Lambda functions.

More info and an example can be found here
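
As a rough sketch (task names as documented by grunt-aws-lambda; the ARN is a placeholder to replace with your own), a Gruntfile using the plugin might look like this:

module.exports = function (grunt) {
  grunt.initConfig({
    lambda_invoke: {
      default: {} // run the function locally against a test event
    },
    lambda_package: {
      default: {} // zip the function with its production dependencies
    },
    lambda_deploy: {
      default: {
        arn: 'YOUR_LAMBDA_FUNCTION_ARN_HERE' // placeholder
      }
    }
  });

  grunt.loadNpmTasks('grunt-aws-lambda');

  // package then deploy in one step
  grunt.registerTask('deploy', ['lambda_package', 'lambda_deploy']);
};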

Continuous Integration using Codeship

After writing your tests, the next step is to set up Continuous Integration (CI) for your Lambda Functions so every time you push up your code to GitHub, the tests are run and the code is deployed to AWS if the tests pass. This example goes through how to set up CI using Codeship.

Some initial set up of your project repo is required. This involves having a lambda function file in the correct format (with an exports.handler function), and a data.json file with a test event. The flow will be as follows:

  • Push code to GitHub
  • This triggers Codeship
  • Codeship runs the tests
  • If tests pass, Codeship deploys Lambda function to AWS, else build fails
  • Codeship invokes Lambda function on AWS with the test event to check live version is working as expected
  • If successful, Codeship reports build succeeded!

Follow along with this simple example to try out setting up the process yourself.

  1. Create a FREE account on Codeship and connect to your GitHub account

  2. Fork this repo on Github!

  3. Create a project in Codeship connecting to your forked repo.

If you have any problems with Steps 1 or 3, follow the instructions in the Codeship documentation.

  4. Create a hello-world Lambda function on AWS following the steps in this earlier section. In the 'Configuration' tab make sure that the name of the handler is changed from 'index.handler' to 'LambdaTest.handler'. 'LambdaTest' will be the name of the zip file that we upload to AWS through Codeship.

Also make a note of the ARN for the lambda function - it can be found in the top right hand corner of the page. It should have the form: arn:aws:lambda:YOUR_AWS_REGION:YOUR_AWS_ACCOUNT_ID:function:YOUR_FUNCTION_NAME. You'll need it when setting up the Deployment Script on Codeship.

Lambda arn

  5. Create a User for Codeship in AWS and get an AWS 'access key' and 'access secret key'.

We need to give Codeship access to the lambda function on AWS so it can update and invoke the function. AWS IAM best practices suggest creating a Group with an access policy to which Users can be added.

Navigate to the 'Identity and Access Management' section of the AWS console.

IAM dashboard

Select the 'Users' tab from the left hand menu. Click the blue 'Create New Users' button. Give the first user the name 'Codeship' (We've already done this!). Make sure the 'Generate an access key for each user' checkbox is ticked. Then click 'Create'.

create user button

new user

On the next screen, click the 'Show User Security Credentials' arrow. It will show you an 'access key' and 'access secret key' for this user. Copy the keys and paste them somewhere safe. You won't be shown them again.

Next select the 'Groups' tab in the left hand pane. Click the blue 'Create Group' button.

Create Group button

Give the group the name 'CI' or any name of your choice. In the next page under 'Attach Policy', just click 'Next Step'. We will be adding our own custom policy.

Navigate back to the Groups tab and click on your newly created group. Select the 'Users' tab and click on the blue 'Add Users to this Group' button. You can then add the user we just created.

To give Codeship access to update, invoke and retrieve your AWS Lambda function, you need to add an access policy. Select the tab 'Permissions' and then click on 'Inline Policy' > 'Create new one'.

Add a policy

Select the 'Custom Policy' option and click 'Select'.

Create policy

In the 'Policy Name' field add 'Codeship Policy' and in the 'Policy Document' add in the following text:

{
  "Version": "2012-10-17",
  "Statement": [
      {
          "Effect": "Allow",
          "Action": [
              "lambda:UpdateFunctionCode",
              "lambda:UpdateFunctionConfiguration",
              "lambda:InvokeFunction",
              "lambda:GetFunction",
              "lambda:CreateFunction"
          ],
          "Resource": [
            "YOUR_LAMBDA_FUNCTION_ARN_HERE"
          ]
      }
  ]
}

Then click 'Validate Policy' and if the validation is successful, click 'Create Policy'.

  6. Add the AWS User Environment variables to your Codeship project. In the Environment tab in Codeship, add your 'AWS_ACCESS_KEY', 'AWS_SECRET_ACCESS_KEY' and 'AWS_DEFAULT_REGION' (usually 'us-east-1'). This is needed in order to authorise Codeship to execute commands via the AWS CLI.

Environment variables

  7. Set up Test and Deployment Scripts for your Codeship project

Click the Test tab in your project settings.

Test script

You should already see the following default code:

# By default we use the Node.js version set in your package.json or the latest
# version from the 0.10 release
#
# You can use nvm to install any Node.js (or io.js) version you require.
# nvm install 4.0
nvm install 0.10

We're using tape to run the tests, so it also needs to be installed globally on the virtual machine. Add these lines at the end:

npm install -g tape
npm install

AWS Lambda used to support only Node 0.10, so our tests (which are written in ES6) were piped through Babel so that they could be run without Node 4.0. However, this is no longer required.

Under 'Configure Test Pipelines', in the 'Test Commands' tab add npm test.
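
npm test runs the test script from the repo's package.json; a hypothetical entry, assuming tape tests live in lambda-testing/tests, might look like this:

{
  "scripts": {
    "test": "tape ./lambda-testing/tests/*.js"
  }
}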

In the Deployment Tab, under 'Configure Deployment Pipeline' select the name of the branch on GitHub that you want to test.

Deployment pipeline

Then choose the 'Custom Script' option.

Custom script

This next page looks like this. We will add our own script to the deployment commands.

deployment script

We're going to first zip up the lambda function, then use the AWS cli to update the version on AWS, and finally invoke it with a test event. Add in the following code to the deployment commands:

pip install awscli
zip -r LambdaTest.zip -j lambda-testing/functions/LambdaTest.js
aws lambda update-function-code --function-name LambdaTest --zip-file fileb://LambdaTest.zip
aws lambda get-function --function-name YOUR_LAMBDA_FUNCTION_ARN_HERE
aws lambda invoke --function-name YOUR_LAMBDA_FUNCTION_ARN_HERE --payload file://lambda-testing/tests/data.json --log-type Tail lambda_output.txt
cat lambda_output.txt
  8. Make a change and push up to GitHub! Try modifying the LambdaTest.js file and/or the data.json file, commit the change and push the code up to GitHub. This should trigger Codeship. View the build log to make sure the build is successful and the test passes.

Codeship build log

Also have a look at the monitoring tab in your Lambda function console. You should see a spike where the function was invoked by Codeship.

AWS monitoring log

For more information have a look at the Codeship documentation.

Upload Your Lambda Function to an S3 Bucket and Automatically Deploy it to Lambda (bash script example)

In this example we will build a script that will execute the necessary steps to upload a Lambda function to S3, where it can be stored, and then automatically deploy it to Lambda.

We will be writing our own bash script that will involve the use of some of the AWS CLI commands. Follow these instructions on how to get set up with the AWS CLI on your local machine:

  1. If you haven't already done so, set up an account with AWS here.

  2. You'll then need to get your 'access key ID' and 'secret access key' by doing the following:

    • Open the IAM console
    • In the navigation pane choose 'Users'
    • Click your IAM username
    • Click 'Security Credentials' and then 'Create Access Key'
    • To see your access key, choose Show User Security Credentials. Your credentials will look something like this: Access Key ID: AKIAIOSFODNN7EXAMPLE Secret Access Key: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
    • Click 'Download Credentials' and store them in a secure location
  3. Install the AWS CLI via a method of your choice here.

  4. Once it's installed you have to configure it. Type aws configure in the command line. You should see something like this:

$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: eu-west-1
Default output format [None]: ENTER

Enter your aws access key, secret access key and region then press enter on the last option.
You should now be good to go!

  5. Next write a Lambda function in your text editor if you haven't already. Check out our previous example up until step 4 (we'll be automating the zipping in this example).

  6. Once you've done this you'll want to create a new S3 bucket that will store all of your uploaded Lambda functions. Click on the S3 console on the AWS Management Console window:

s3 console

Click the 'Create Bucket' button. Give your S3 Bucket a name and select its region. We've called ours 'lambda-function-container':

s3 create bucket

  7. Next you'll want to write a bash script that will perform 3 commands. The first is to create your deployment package (a .ZIP file containing your Lambda function and its dependencies). The second will upload the deployment package to your newly created S3 bucket. The third will deploy your Lambda function from S3.

To do so, create a new file, call it whatever you want and save it as a .sh file. We've called ours 'lambda-upload-create.sh'. The 3 commands require variables as input, which is why we've included the echo & read bash commands to temporarily save these inputs:

echo and read
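
A sketch of this echo & read section, reconstructed from the prompts shown later in this example:

echo "Enter the name of the files you wish to zip (eg. lambdaFunction.js node_modules):"
read FilesToBeZipped
echo "Enter the name of the output zip file (eg. lambdaFunction):"
read ZipFileName
echo "Enter the name of the s3 bucket you wish to upload to:"
read BucketName
echo "Enter the name of your lambda function:"
read FunctionName
echo "Enter the ARN of the role you wish to implement:"
read Role
# $Description is also used by create-function below; it can be hard-coded
# or prompted for in the same way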

We tried to have as few variable inputs as possible so that it reduces the margin for error when typing it into the command line. These are followed by our zip and AWS CLI commands:

The first command (zip) takes two inputs: the name of the zip file you want to create and the names of the files you want to zip up (in our case it's going to be upload and upload.js, seeing as we have no dependencies).

zip -r "$ZipFileName.zip" $FilesToBeZipped

The upload command 'put-object' takes three inputs: the name of the bucket, the key (the file path of the zip file) and the body (the same as the key in this case).

aws s3api put-object --bucket $BucketName --key "./$ZipFileName.zip" --body "./$ZipFileName.zip"

The deployment command 'create-function' takes five inputs: the function name (which can be anything you like), the runtime (in our case nodejs), the role (the ARN for an IAM role you have used/created in the IAM console), the code (the bucket name that you're deploying from plus the key, i.e. the file path of the zip) and finally the description of your function, which is optional.

aws lambda create-function --function-name $FunctionName --runtime nodejs \
--role $Role --handler "$ZipFileName.handler" \
--code S3Bucket="$BucketName",S3Key="./$ZipFileName.zip" \
--description $Description
  8. Let's create the script that we'll run in our package.json that will trigger the .sh file we just created:

script link
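
Assuming the .sh file name we chose above, the package.json script might look something like this:

{
  "scripts": {
    "upload": "./lambda-upload-create.sh"
  }
}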

In order to be able to run our script we have to make it executable. Type this command into your terminal:

chmod +x (filenameOfScript.sh)

  9. One final step before we'll be able to run our script: go back to AWS and go to the IAM console, because you need to add some policies that enable you to perform certain methods like 'create-function' or 'put-object'.

Click on 'Groups' and then select 'Create Group'. We've made a 'Public' group; click on it once you've created it: create group

Click on the 'Attach Policy' button and then select 'IAMFullAccess' from the list: Attach policy

Click on the 'Create Group Policy' in the Inline Policies section: inline policy

Select the 'Custom Policy' and then press the 'Select' button: custom policy

Create your custom policy. We've included the necessary effects, actions and resources to have complete access. Then click 'Apply Policy': create custom policy

Once your group has been created you'll need to add a user to it. Any user who is added to that group will have the same permissions. If you haven't created a user you can do that here: create user

Go back to the group you just created and then click 'Add Users to Group' and then select a user to add. The user should be the one that has the access key id and secret access key assigned to it that you're using for the AWS CLI. add users

We should now be able to take our script for a spin!

  10. In the command line, run the script in your package.json. Ours is as follows: $ npm run upload

This should prompt the echo and read commands first:

Enter the name of the files you wish to zip (eg. lambdaFunction.js node_modules): upload.js
Enter the name of the output zip file (eg. lambdaFunction): upload
Enter the name of the s3 bucket you wish to upload to: lambda-function-container
Enter the name of your lambda function: Upload
Enter the ARN of the role you wish to implement: arn:aws:iam::655240711487:role/lambda_basic_execution

After you've hit enter it should return this:

adding: upload.js (deflated 17%)
{
    "ETag": "\"519e9cfc9a2ee33412ba813c82f33a56fa3\""
}
{
  "CodeSha256": "nbYYHfHKyYSlb09Dpw7vf7wB93F+9V8XEmaTBU=",
  "FunctionName": "Upload",
  "CodeSize": 249,
  "MemorySize": 128,
  "FunctionArn": "arn:aws:lambda:eu-west-1:655240711487:function:Upload",
  "Version": "$LATEST",
  "Role": "arn:aws:iam::655240711487:role/lambda_basic_execution",
  "Timeout": 3,
  "LastModified": "2016-01-28T13:31:28.627+0000",
  "Handler": "upload.handler",
  "Runtime": "nodejs",
  "Description": "Bash Script Tutorial"
}
  11. Go to S3 to check if the deployment package has been uploaded. You should see your .ZIP file:

s3 uploaded

  12. Go to Lambda to check if your Lambda function has been enabled:

lambda enabled

That's all! You should now be able to upload and deploy a Lambda function with a single bash script.

Deploying Lambda Functions using Gulp

Gulp can be used to automate the zipping, deployment and testing of Lambda functions on AWS. The Codeship deployment script can then be reduced to a single command: gulp deploy!

The syntax to create a new Gulp task is:

gulp.task('name of task', function() {
  return  //gulp functions to run
})

There are many plugins for performing actions like retrieving, moving and zipping files. These actions are also chainable.

We will go through a simple gulp script with tasks for each of the steps involved.

  1. Require in all the relevant modules and files. We'll be using the aws-sdk to deploy and invoke the lambda function. We also need to read in the package.json file in order to add the node modules to the zip file.
```js
var AWS         = require('aws-sdk');
var gulp        = require('gulp');
var zip         = require('gulp-zip');
var install     = require('gulp-install');
var runSequence = require('run-sequence');
var fs          = require('fs');

var packageJson = require('./package.json');
```
  2. Declare Constants.
```js
var region       = 'eu-west-1';  //AWS region
var functionName = 'LambdaTest';  
var outputName   = 'LambdaTest.zip'; //name to be given to output zip file

// the ARN of the execution role to be given to the lambda function - change this to a role from your account
var IAMRole = 'arn:aws:iam::685330956565:role/lambda_basic_execution';

// the paths of the files to be added to the zip folder
var filesToPack = ['./lambda-testing/functions/LambdaTest.js'];

```

**Make sure the IAM role is changed to the ARN of a role from your AWS account and the region is set to the AWS region you want to deploy the Lambda function to!**
  3. Create an archive folder and add the project files
```js
gulp.task('js', function () {
  return gulp.src(filesToPack, {base: './lambda-testing/functions'})
    .pipe(gulp.dest('dist/'));
});
```

`gulp.src` takes an array of file paths as the first argument and an options object as the second. If you specify a base file path in the options only the folders/files after the base are copied i.e. in this case, only the LambdaTest.js file is copied into the archive folder (`dist`).  
  4. Add the node modules to the archive folder
```js
gulp.task('node-modules', function () {
  return gulp.src('./package.json')
    .pipe(gulp.dest('dist/'))
    .pipe(install({production: true}));
});
```

In this task, the `package.json` file is copied to the archive folder and the 'gulp-install' module is used to do an `npm install --production` of all the listed dependencies.
  5. Zip up the archive folder and save it.
```js
gulp.task('zip', function () {
  return gulp.src(['dist/**', '!dist/package.json'])
    .pipe(zip(outputName))
    .pipe(gulp.dest('./'));
});
```

All the files in the dist folder apart from the `package.json` file are zipped up using the 'gulp-zip' module and saved in the root of the project folder.
  6. Upload the zip file to AWS. If the function already exists, update it, otherwise create a new function.
We can create an 'upload' task with gulp:

```js
gulp.task('upload', function() {})
```

Inside the function we first have to do a bit of set up:

```js
AWS.config.region = region; // this is set to eu-west-1 from the constants declared in step 1
var lambda = new AWS.Lambda();
var zipFile = './' + outputName; // the outputName has also been set in step 1
```

First we need to check if the function already exists on AWS before deciding whether to create a function or update a function.

```js
lambda.getFunction({ FunctionName: functionName }, function(err, data) {
  if (err) createFunction();
  else updateFunction();
});
```

We also need a function to retrieve the saved zip file in order to pass it in as a parameter in our create function command.

```js
function getZipFile (callback) {
  fs.readFile(zipFile, function (err, data) {
        if (err) console.log(err);
        else {
          callback(data);
        }
  });
}
```
The `getZipFile` function takes a callback which gets called with the file data if the file is read successfully.

Using the aws-sdk we can then define a function to create a new Lambda function from this zip file.

```js
function createFunction () {

  getZipFile(function (data) {
    var params = {
      Code: {
        ZipFile: data // buffer with the zip file data
      },
      FunctionName: functionName, // functionName was set in the constants in step 1
      Handler: 'LambdaTest.handler',  // need to set this as the name of our lambda function file is LambdaTest.js
      Role: IAMRole,  // IAMRole was set in the constants in step 1
      Runtime: 'nodejs'
    };

    lambda.createFunction (params, function (err, data) {
      if (err) console.error(err);
      else console.log('Function ' + functionName + ' has been created.');
    });
  });

}
```
Similarly we can also define `updateFunction`:

```js
function updateFunction () {

  getZipFile(function (data) {
    var params = {
      FunctionName: functionName,
      ZipFile: data
    };

    lambda.updateFunctionCode(params, function(err, data) {
      if (err) console.error(err);
      else console.log('Function ' + functionName + ' has been updated.');
    });
  });
}
```
  7. Invoke the function with a test event to check the live version is working as expected.
We first have to get the function to make sure it exists, and only invoke it if there isn't an error.

In the parameters for invoking the function, a JSON object can be specified as the 'Payload' and the 'InvocationType' can be specified as 'RequestResponse' if you want to get a response body.

```js
gulp.task('test-invoke', function() {
  var lambda = new AWS.Lambda();

  var params = {
    FunctionName: functionName,
    InvocationType: 'RequestResponse',
    LogType: 'Tail',
    Payload: '{ "key1" : "name" }'
  };

  lambda.getFunction({ FunctionName: functionName }, function(err, data) {
    if (err) console.log("Function " + functionName + " not found", err);
    else invokeFunction();
  });

  function invokeFunction() {
    lambda.invoke(params, function(err, data) {
      if (err) console.log(err, err.stack);
      else console.log(data);
    })
  }
})
```
  8. Create a deployment task that runs all the above tasks in series in the correct order.
The `runSequence` module takes a comma separated list of gulp task names or a list of arrays with gulp tasks, and ends with a callback. The tasks are run in the order they are specified. To run two tasks in parallel specify them in the same array.

```js
gulp.task('deploy', function (callback) {
  return runSequence(
    ['js', 'node-modules'],
    ['zip'],
    ['upload'],
    ['test-invoke'],
    callback
  );
});
```

**In the AWS console you can only view functions by region, so if you can't see the function after it has been created, check you're looking at the correct region (in the dropdown menu in the top right of the console)**

![AWSregion](https://cloud.githubusercontent.com