A short tutorial showing how to build a REST API using the serverless framework and AWS Lambda.
Serverless apps have been around for a while, but they became mainstream around 2014 when AWS introduced Lambda. Despite the name, serverless apps do run on servers; the servers are simply managed by a cloud provider such as AWS, which frees you to focus on your application logic. Another benefit is that serverless apps run in response to events, and you’re billed only when they run.
The Serverless framework is a CLI tool that allows you to build and deploy serverless apps in a structured way. The framework supports a variety of providers, including AWS Lambda, Google Cloud Functions and Microsoft Azure Functions. In this tutorial, we’ll use the Serverless framework to build an API powered by AWS Lambda.
We’ll build a REST API for managing products stored in a warehouse. We’ll make these four operations possible:
- Add a new product: a POST request to /products with the product information in the body. For each product, we’ll store the name, the quantity, a timestamp marking when it was added, and a unique ID.
- List all products: a GET request to /products.
- View a single product: a GET request to /products/{id}. {id} here is a placeholder for the product ID.
- Remove a product: a DELETE request to /products/{id}.

We’ll use AWS DynamoDB as our data store. Let’s go!
First up, we’ll install the serverless CLI:
```bash
npm install -g serverless
```
Next, we’ll create a new service using the AWS Node.js template. Create a folder to hold your service (I’m calling mine stockup) and run the following command in it:
```bash
serverless create --template aws-nodejs
```
This will populate the current directory with the starter files needed for the service. Your directory should have the following structure:
```
stockup
 |- .gitignore
 |- handler.js
 |- serverless.yml
```
A service provides functions. Functions are entry points into your app, each performing a specific piece of functionality. Remember the operations we listed above for our API? Each of those is going to be a function in our service.
Each function has events that trigger it, and a handler that responds to the event. An event can be a web request (visiting a URL or making an API call), an action from another service, or a custom action that happens in your app. A handler is the code that responds to the event. Each function may also make use of one or more resources. Resources are external services your functions make use of, such as a database, a cache, or external file storage.
The serverless.yml file serves as a manifest for our service. It contains the information the serverless CLI uses to configure and deploy our service. We’ll write ours, then examine the contents to get a deeper understanding. Replace the contents of your serverless.yml file with the following:
```yaml
service: stockup

provider:
  name: aws
  runtime: nodejs6.10
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:DescribeTable
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
        - dynamodb:DeleteItem
      Resource:
        Fn::Join:
          - ""
          - - "arn:aws:dynamodb:*:*:table/"
            - Ref: ProductsDynamoDbTable

functions:
  create:
    handler: handler.create
    events:
      - http:
          path: products
          method: post
          cors: true
  list:
    handler: handler.list
    events:
      - http:
          path: products
          method: get
          cors: true
  view:
    handler: handler.view
    events:
      - http:
          path: products/{id}
          method: get
          cors: true
  remove:
    handler: handler.remove
    events:
      - http:
          path: products/{id}
          method: delete
          cors: true

resources:
  Resources:
    ProductsDynamoDbTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: products
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
        KeySchema:
          - AttributeName: id
            KeyType: HASH
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
```
We’ve described our service using four top-level keys:
- service: the name of our service (“stockup”).
- provider: this is where we specify the name of the provider we’re using (AWS) and configuration specific to it. In our case, we’ve configured the runtime (Node.js) and the IAM (Identity and Access Management) role that our functions will run under. Our functions need to read from and write to our DynamoDB table, so we added the necessary permissions to the IAM role.
- functions: here we specify the functions provided by our service, the API calls that should trigger them, and their handlers (we’ll write the code for the handlers soon).
- resources: the resources key contains all the configuration needed for our resources. In our case, we’ve configured the DynamoDB resource by specifying the name of the table we’ll be interacting with (products). DynamoDB is schemaless but requires you to declare the primary key for each table, so we’ve defined this in our AttributeDefinitions and KeySchema. We’re using the id, a string, as our primary key.

Now let’s install our app’s dependencies. Remember that this is a Node.js app, so we can use npm to install dependencies as normal. Create a file called package.json in your project root with the following content:
```json
{
  "dependencies": {
    "aws-sdk": "^2.205.0",
    "uuid": "^3.2.1"
  }
}
```
We need the AWS SDK for interacting with DynamoDB and the uuid module to generate product IDs.
Now run npm install, and we’re ready to write our event handlers.
Let’s write the code that responds to events. Remember that we have to export our handlers from the file handler.js. There’s no rule, however, that says we have to put all the code for them in that one file. To keep our code clean, we’ll write each of our handlers in its own file, then export them all from handler.js. Let’s start with adding a product.
Create a sub-directory called handlers. All our handler files will go in this directory.
Create a file called create.js in the handlers directory with the following code:
```js
'use strict';

const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();
const uuid = require('uuid');

module.exports = (data) => {
  const params = {
    TableName: 'products',
    Item: {
      name: data.name,
      quantity: data.quantity,
      id: uuid.v1(),
      addedAt: Date.now(),
    }
  };
  return dynamoDb.put(params).promise()
    .then(result => params.Item);
};
```
In this file, we’re exporting a function that takes in the product data (sent by the user in the body of the request). Our function then inserts the product into the database, returning the result via a Promise.
Next, we’ll import this module and export the handler in our handler.js:
```js
'use strict';

const addProduct = require('./handlers/create');

const create = (event, context, callback) => {
  const data = JSON.parse(event.body);
  addProduct(data)
    .then(result => {
      const response = { body: JSON.stringify(result) };
      callback(null, response);
    })
    .catch(callback);
};

module.exports = {
  create,
};
```
How does this work? Let’s take a closer look.
The handler.js file must export an object with properties matching the handler names in the serverless.yml file. Each handler is a function that takes three parameters: the event that triggered it (for an HTTP event, this carries the request body, path parameters and other request details), a context object with information about the invocation, and a callback to call with either an error or the response.
In the code above, we’re importing our create module and passing the product data to it, then responding with an error or success to the user.
Now that we’re familiar with the design pattern, let’s write the rest of our handlers.
Our list function (handlers/list.js) is quite simple. We don’t need any parameters; we call the DynamoDB scan command to get all the products:
```js
'use strict';

const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports = () => dynamoDb.scan({ TableName: 'products' }).promise();
```
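One thing to note: scan resolves with an object that also carries Count and ScannedCount metadata, so the raw response body will include those fields too. If you’d prefer the list endpoint to return only the array of products, a small optional variation (not required for the rest of the tutorial) could look like this:

```js
'use strict';

const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();

// scan() resolves with { Items: [...], Count: n, ScannedCount: n, ... };
// unwrap Items so the API returns just the array of products.
module.exports = () =>
  dynamoDb.scan({ TableName: 'products' }).promise()
    .then(result => result.Items);
```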
Our view function (handlers/view.js) takes in the product ID and returns the corresponding product using dynamoDb.get:
```js
'use strict';

const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports = (id) => {
  const params = {
    TableName: 'products',
    Key: { id }
  };
  return dynamoDb.get(params).promise();
};
```
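Note that dynamoDb.get resolves with an object shaped like { Item: {...} }, and with an empty object when no product matches the given ID, so a missing product comes back to the client as {}. If you’d rather surface that case explicitly, a variation along these lines (the error message is just illustrative) would reject instead:

```js
'use strict';

const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();

// Variation on view.js: reject when the product doesn't exist so the
// handler's .catch() can turn it into an error response.
module.exports = (id) => {
  const params = {
    TableName: 'products',
    Key: { id }
  };
  return dynamoDb.get(params).promise()
    .then(result => {
      if (!result.Item) {
        throw new Error(`No product found with id ${id}`);
      }
      return result.Item;
    });
};
```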
And our remove function (handlers/remove.js) also takes a product ID, but uses the delete command to remove the corresponding product:
```js
'use strict';

const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports = (id) => {
  const params = {
    TableName: 'products',
    Key: { id }
  };
  return dynamoDb.delete(params).promise();
};
```
And now, putting everything together, our handler.js becomes:
```js
'use strict';

const addProduct = require('./handlers/create');
const viewProduct = require('./handlers/view');
const listProducts = require('./handlers/list');
const removeProduct = require('./handlers/remove');

const create = (event, context, callback) => {
  const data = JSON.parse(event.body);
  addProduct(data)
    .then(result => {
      const response = { body: JSON.stringify(result) };
      callback(null, response);
    })
    .catch(callback);
};

const list = (event, context, callback) => {
  listProducts()
    .then(result => {
      const response = { body: JSON.stringify(result) };
      callback(null, response);
    })
    .catch(callback);
};

const view = (event, context, callback) => {
  viewProduct(event.pathParameters.id)
    .then(result => {
      const response = { body: JSON.stringify(result) };
      callback(null, response);
    })
    .catch(callback);
};

const remove = (event, context, callback) => {
  removeProduct(event.pathParameters.id)
    .then(result => {
      const response = { body: JSON.stringify({ message: 'Product removed.' }) };
      callback(null, response);
    })
    .catch(callback);
};

module.exports = {
  create,
  view,
  remove,
  list
};
```
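One refinement worth considering, though it isn’t required for this tutorial: we declared cors: true on every endpoint, which takes care of the preflight OPTIONS requests, but a browser will also expect an Access-Control-Allow-Origin header on the actual responses, and it doesn’t hurt to set an explicit status code either. A small helper, sketched here for the create handler only, keeps that in one place:

```js
// Sketch of a shared response builder (a hypothetical refinement, not part of
// the handler.js above). The same pattern would apply to the other handlers.
const respond = (statusCode, body) => ({
  statusCode,
  headers: { 'Access-Control-Allow-Origin': '*' },
  body: JSON.stringify(body),
});

const create = (event, context, callback) => {
  const data = JSON.parse(event.body);
  addProduct(data)
    .then(result => callback(null, respond(200, result)))
    .catch(err => callback(null, respond(500, { error: err.message })));
};
```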
To deploy your service to AWS, you’ll need to first configure the serverless CLI with your AWS credentials. Serverless has published a guide on that (in video and text formats).
When you’ve done that, run this command:
```bash
serverless deploy
```
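The first deploy creates the whole stack, so it can take a couple of minutes. On later iterations, if you’ve only changed handler code, you can redeploy a single function instead, which is much faster:

```bash
# Redeploy just the create function (it must already exist from a full deploy)
serverless deploy function --function create
```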
And that’s it! Let’s confirm that the deploy was successful. Visit the Lambda Management Console, and you should see all your functions listed like this:
You can see the function names are prefixed with stockup-dev. “Stockup” here is the name of the service, while “dev” represents the stage. If you click on one of them, say the stockup-dev-create function, you should see a detail view like this:
The pane on the right contains two lists of cards. The cards on the left are the events that trigger our app. HTTP requests show up in this pane via AWS API Gateway. The cards shown on the right are the resources our app uses. You can see our DynamoDB resource listed; the CloudWatch resource is added by default by AWS and used for logs and monitoring your app.
Now we need to find the URL for accessing this function. Clicking on the “API Gateway” trigger opens a pane below, and if you expand the “Details” box, you’ll see an “Invoke URL” property:
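If you’d rather not dig through the console, the serverless CLI can also list the endpoints and function names for the deployed service (they’re printed at the end of the deploy output too):

```bash
serverless info
```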
Now we can test the API. Open up Postman or whatever API testing tool you use, and try making a POST request to the /products endpoint with some data to create a new product:
```json
{
  "name": "Vibranium shield",
  "quantity": 1
}
```
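If you prefer the command line, the equivalent request with curl looks like this (replace <your-invoke-url> with the Invoke URL from the previous step):

```bash
curl -X POST \
  -H 'Content-Type: application/json' \
  -d '{"name": "Vibranium shield", "quantity": 1}' \
  https://<your-invoke-url>/products
```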
You should get a response like this:
Try out the other APIs in a similar manner.
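For reference, the remaining operations map to requests like these (again, <your-invoke-url> and the product ID are placeholders):

```bash
# List all products
curl https://<your-invoke-url>/products

# View a single product
curl https://<your-invoke-url>/products/<product-id>

# Remove a product
curl -X DELETE https://<your-invoke-url>/products/<product-id>
```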
That was fun, right? In a very short time, we got a functioning API up and running without provisioning any servers. And we can go further, for instance by running an entire web or mobile app backend serverless. If you’re interested, you can read more about the Serverless framework in its official documentation, and check out the AWS Lambda Node.js docs too. You can also check out the full stockup source code on GitHub.