
You might already know how to create a table in Amazon DynamoDB, and how to insert, update, and query items. The usual chatter in the average day of a DBA.
What's "cloudy" about THAT?
I want to know: where's your sense of exploration, astronaut?
Enough of the pep banter.
C'mon, try some database streaming!
Segue: Today, it's all about you, DynamoDB.
DynamoDB Streams … as in streaming data?
It's like listening to events.
What are events? Any item-level change made to your DynamoDB table is an event.
That includes:
- Inserting records
- Updating records
- Deleting records
(Queries just read data, so they don't produce stream events.)
Setting up DynamoDB Streams lets you know when the items in your table change.
Why? (Sell me on that Shiny Use-Case.)
You are looking to buy an ergonomic chair to support your back during long hours of work. So you ask a friend for recommendations, and he sends you a link to the one he has.
You're ecstatic. But when you go to buy that chair, you see it's currently out of stock.
Hey, you would even wait a few weeks for it. But how would you know when it actually comes back in stock?
No way, unless you sit there refreshing the link every few hours.
The Way Out - Getting Notified on Item Back-in-stock
When the admin of the inventory management system receives new items, he updates the stock on the item listing.
Imagine that, but their item listing is a DynamoDB table.
In such a case, DynamoDB Streams can help. Here's a simple workflow:
- Item quantity (Stocks) gets updated, for example from 0 to 25 (twenty-five new chairs).
- DynamoDB Streams takes note of the action, whether it was an INSERT, MODIFY or REMOVE. In this case, it was MODIFY.
- If you set up a Lambda trigger connected to that DynamoDB stream, you can actually get a log of what changed. This can be based on four options:
  - KEYS_ONLY - Selecting this doesn't tell you the values of the stream records, just the keys of the item that changed (FYI: you can't update the keys; you have to delete the item and add it again).
  - OLD_IMAGE - This option gives you the DynamoDB stream records of an item as it was before it was modified/updated.
  - NEW_IMAGE - This option is better if you want to know the value of the DynamoDB stream records after the item was modified.
  - NEW_AND_OLD_IMAGES - Selecting this gives you stream records containing both the before and after snapshots of the item. This is the most suitable option if you want to keep track of both the before and after item values.
- You can log this event in your Lambda handler and get the logs sent straight to your CloudWatch log groups.
- You can choose to get notified by email (SES) on occurrence of this event.
Now if the item stock gets updated, you'll be the first to know! (And buy that cozy chair that you deserve.)
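To make the view types concrete, here is a sketch of a stream record as a Lambda handler would receive it. The chair item and its values below are hypothetical; the record shape follows the real DynamoDB Streams event format:

```javascript
// A hypothetical MODIFY stream record, shaped like what Lambda receives.
const record = {
  eventName: "MODIFY",
  dynamodb: {
    Keys: { PK: { S: "CHAIR" }, SK: { S: "ERGONOMIC#BLACK" } },
    OldImage: { Stocks: { N: "0" } },  // before the restock
    NewImage: { Stocks: { N: "25" } }, // after the restock
    StreamViewType: "NEW_AND_OLD_IMAGES",
  },
};

// What each view type would have populated on record.dynamodb:
//   KEYS_ONLY          -> Keys only
//   OLD_IMAGE          -> Keys + OldImage
//   NEW_IMAGE          -> Keys + NewImage
//   NEW_AND_OLD_IMAGES -> Keys + both images
const stockBefore = Number(record.dynamodb.OldImage.Stocks.N);
const stockAfter = Number(record.dynamodb.NewImage.Stocks.N);
console.log(`${record.eventName}: stock ${stockBefore} -> ${stockAfter}`);
```

With NEW_AND_OLD_IMAGES, both before and after values are available in the same record, which is exactly what we need to detect a restock.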
For DynamoDB Streams, You Gotta Have a Table
Or steal mine!
Start with creating a DynamoDB table.
In my case it's named Cloud9 after my favorite supermarket.
DynamoDB Insertions: A Simple Inventory
It has two item categories:
- Washington Apples
- Valencia Oranges
As you see in the screenshot below, the partition key (PK) is FRUIT for both, and the category is specified in the Sort Key (SK) in the format where it can be sorted easily (NAME#PLACE):
APPLE#WASHINGTON
ORANGE#VALENCIA
Since it's a supermarket inventory, we would like to keep track of how many of each fruit we have, hence the Stocks attribute.
Example: 39 apples and 42 oranges.
The attribute Price stores a DynamoDB JSON map of localized prices, i.e. INR and USD (at Cloud9, we love to be global!).
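For intuition, here is how such an item likely looks in DynamoDB's typed format ("DynamoDB JSON"), reconstructed from the attributes described above. Every value is wrapped in a type tag: S for string, N for number (sent as a string), M for map:

```javascript
// The APPLE#WASHINGTON item in DynamoDB's typed JSON format,
// reconstructed from the attributes described above.
const appleItem = {
  PK: { S: "FRUIT" },            // partition key (string)
  SK: { S: "APPLE#WASHINGTON" }, // sort key in NAME#PLACE format
  Stocks: { N: "39" },           // numbers travel as strings
  Price: {
    M: {                         // a map of localized prices
      USD: { N: "0.8" },
      INR: { N: "50" },
    },
  },
};
console.log(JSON.stringify(appleItem, null, 2));
```

This same typed shape is what you will see later inside the stream records.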

Starting … DynamoDB Streams!
If you go to the Tables tab on the left navigation pane, youâll see all your DynamoDB tables, including the one you just created.
Click on the table name highlighted in a blue link (here it's Cloud9).

You'll see an interface similar to this. Now we want to set up our DynamoDB stream.
Go to the Exports and Streams tab.
Youâll see options like the following:


Turning on DynamoDB Stream
Scroll down and hit the Turn On button in the DynamoDB stream details section.
You'll be directed to a page that gives you the option to select which item-level changes you would like to push to the DynamoDB stream.

View Type for Item Change - We talked about this!
For your first time, you'll want to see the entirety of the item changes, the before and the after.
Let's go ahead and select the last option, New and old images.
Finally hit the Turn on stream button to make your changes stick.

The stream is on!
Creating a trigger in AWS Lambda
In the last snapshot, you might have seen that Trigger section.
What is it, and why do we need to create one?
A DynamoDB stream only emits events on changes to the DynamoDB table records.
But that doesn't give us logs.
With an AWS Lambda trigger, we can set up a simple Lambda handler function in under ~4 lines of code which will send the changelog to Amazon CloudWatch.
So now, every time our table gets changed, we get a log in CloudWatch.
This Lambda guy never misses the ring (thanks, Lambda).
We click on the Create Trigger button under our DynamoDB stream details section.
You don't happen to have an unused Lambda function lying around, do you? Let's create a new one using the Create New button on the right.

Create a Lambda handler function from scratch
On this screen, click on the Author from Scratch button.
- Give your function a unique name. I named it DB-Stream-Handler. Awesome, right?
- Select Node.js 18.x as the Runtime. Python and Ruby are out of scope right now, but hit me up in the comments if you are going to try those.
- Architecture doesn't matter for this; you can select arm64 or x86_64.
- Now click on the Create Function button to create your Lambda function.

You'll see your Lambda function with a Successfully created alert.

Scroll down and go to the Code tab.
Open the index.mjs file already created for you.
Paste this code into the file:
export const handler = async (event) => {
  // Log the raw DynamoDB stream event; it lands in CloudWatch Logs.
  console.log(JSON.stringify(event));
};
Hit Deploy to save your changes.

Let's attach our Lambda trigger to the DynamoDB stream
After you create your Lambda function, switch back to the DynamoDB tab. You should be back on the Create a trigger page.
Now you can simply search for your function and select it from the drop down.
Keep the batch size at 1. It simply means our Lambda function will receive at most 1 record per invocation.
If you put 4, your Lambda can receive at most 4 records at a time, for example when performing BatchWriteItem operations (batch puts and deletes) on your DynamoDB table.
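Because a batch size above 1 can deliver several stream records in a single invocation, a handler should always loop over event.Records rather than assume one record. A minimal sketch, using a hypothetical two-record batch:

```javascript
// Hypothetical two-record batch, as a handler might receive it
// when the batch size is greater than 1.
const event = {
  Records: [
    { eventName: "MODIFY", dynamodb: { Keys: { SK: { S: "APPLE#WASHINGTON" } } } },
    { eventName: "REMOVE", dynamodb: { Keys: { SK: { S: "ORANGE#VALENCIA" } } } },
  ],
};

// Summarize every record in the batch, not just the first one.
const summaries = event.Records.map(
  (r) => `${r.eventName} ${r.dynamodb.Keys.SK.S}`
);
console.log(summaries);
```

The handler we paste into index.mjs logs the whole event, so it already copes with batches; this just makes the per-record loop explicit.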
Make sure to keep the box Turn on trigger checked.
Now click on Create Trigger.

You'll be greeted with a DynamoDB permissions error saying:
InvalidParameterValueException: Cannot access stream arn:aws:dynamodb:us-east-1:240XXXXX9274:table/Cloud9 /stream/2023-07-16T09:00:05.000.
Please ensure the role can perform the GetRecords, GetShardIterator, DescribeStream, and ListStreams Actions on your stream in IAM.

Don't fret. We'll fix it. It's simply a permissions error; our Lambda needs these permissions in order to receive the DynamoDB stream.
Add DynamoDB Stream Policy to AWS Lambda
Go to your Lambda function and then to Configuration tab.
Select Permissions from the left side navigator.
You'll see the Execution role section with a role attached to it. By default the role gets only the CloudWatch permissions.
Click on the Role name link in blue. In my case it's DB-Stream-Handler-role.

You'll be redirected to the IAM role page.

If you scroll down to Permissions policies in the Permissions tab, you'll see a policy already attached by the name AWSLambdaBasicExecutionRole.

To add a new policy, click on the Add permissions dropdown button and select Create inline policy.

You'll be redirected to the Specify Permissions page, also known as the Policy Editor.
Give the Lambda role permission to perform the following DynamoDB operations, which we saw in the error:
- GetRecords
- GetShardIterator
- DescribeStream
- ListStreams
If you don't want to manually search for and add all the permissions, skip to the next section.

First we select the service, in our case itâs DynamoDB. You can simply search for DynamoDB and click on it.

Now we select all the operations by searching their names in the search box. Example: Type in GetRecords, then select the checkbox to automatically add the permission.

Once you're done adding the 4 permissions, expand the Resources section and select Specific.
Click on Add Arn button.
This makes sure that our Lambda can't access any other DynamoDB tables or perform other actions (the principle of least privilege).

You'll see a popup like this:

Get the DynamoDB stream ARN
You can find the ARN in the error message itself.

In case you ran away from the error, here is the ideal way to get the DynamoDB stream ARN.
- Go to DynamoDB.
- Click on Tables in the left navigation pane.
- Select your DynamoDB table from the list.
- Go to the Exports and Streams tab.
- Scroll down to the DynamoDB stream details section.
- You'll see the stream's Amazon Resource Name (ARN) at the bottom; click the copy button.
Let's go back to Specify ARNs on the IAM Policy editor page.

Now that we have the ARN, simply paste it into the last input field. You'll see it magically fill in the other fields like region and table name.
Finally click on the Add ARNs button.
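That auto-fill isn't really magic: an ARN embeds the service, region, account and resource in a fixed colon-separated layout, so the console can just parse them out. A quick sketch (the account ID below is made up):

```javascript
// An ARN encodes service, region, account and resource,
// which is why the console can auto-fill those fields.
// The account ID here is a made-up placeholder.
const arn = "arn:aws:dynamodb:us-east-1:123456789012:table/Cloud9";

// Format: arn:partition:service:region:account-id:resource
const [, , service, region, accountId, resource] = arn.split(":");
const tableName = resource.split("/")[1];

console.log({ service, region, accountId, tableName });
```

A stream ARN simply extends the table resource with a /stream/<timestamp> suffix, which is the part that distinguishes the two.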

Save yourself the trouble, use JSON!
There are two tabs to the right of the Policy editor: Visual and JSON. Switch to the JSON tab.
You'll see something like this if you made the changes visually by following the previous steps.

If not, then simply paste the JSON below into your policy editor.
Make sure to replace the Resource value with your own stream ARN.
Don't confuse the DynamoDB table ARN with the DynamoDB stream ARN.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetShardIterator",
        "dynamodb:DescribeStream",
        "dynamodb:GetRecords"
      ],
      "Resource": "your-dynamodb-stream-arn"
    },
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": "dynamodb:ListStreams",
      "Resource": "*"
    }
  ]
}
Click the Next button.
You'll now be prompted to type in your Policy name. In the snapshot below, I have kept it dynamodb-stream-access.
Finally, click on Create Policy.

You'll now see your custom IAM policy too, in addition to the default AWSLambdaBasicExecutionRole policy.

Now that your Lambda function has all the permissions, head back to the Create a trigger page and hit the Create trigger button.

This time, you should get zero errors.

You'll see your new Lambda trigger created successfully.

Testing out our Lambda trigger
We discussed in the first few sections that we will get DynamoDB stream records in our CloudWatch logs when we insert, modify or remove the DynamoDB table records.
We can modify the attributes of our table item and see if it works.

Example: You can check the box for a particular item in the table, like APPLE, and update its stock from 39 to 33.
Hit Save.

Now go to CloudWatch and click on Log groups from the left navigation pane.
If you have multiple log groups, type in your Lambda function name (Example: DB-Stream-Handler) and click on the blue link for your log group when you see it.

You'll see that your log group is further divided into Log streams.
Click on the most recent one by sorting in descending order on the Last event time column.

You'll see the individual Log events for your DynamoDB stream here.
You'll see a lot of expand arrow buttons.
Expand the one below START RequestId. This log message contains the INFO entry with the records that were changed.

Here's a complete breakdown of the JSON changelog which we got from the DynamoDB stream event:
{
  "Records": [
    {
      "eventID": "5055524747c831370d982c2d2b59b780",
      "eventName": "MODIFY",
      "eventVersion": "1.1",
      "eventSource": "aws:dynamodb",
      "awsRegion": "us-east-1",
      "dynamodb": {
        "ApproximateCreationDateTime": 1689508624,
        "Keys": {
          "SK": { "S": "APPLE#WASHINGTON" },
          "PK": { "S": "FRUIT" }
        },
        "NewImage": {
          "Price": {
            "M": {
              "USD": { "N": "0.8" },
              "INR": { "N": "50" }
            }
          },
          "Stocks": { "N": "32" },
          "SK": { "S": "APPLE#WASHINGTON" },
          "PK": { "S": "FRUIT" }
        },
        "OldImage": {
          "Price": {
            "M": {
              "USD": { "N": "0.8" },
              "INR": { "N": "50" }
            }
          },
          "Stocks": { "N": "33" },
          "SK": { "S": "APPLE#WASHINGTON" },
          "PK": { "S": "FRUIT" }
        },
        "SequenceNumber": "686242300000000012474772171",
        "SizeBytes": 131,
        "StreamViewType": "NEW_AND_OLD_IMAGES"
      },
      "eventSourceARN": "arn:aws:dynamodb:us-east-1:240425629274:table/Cloud9/stream/2023-07-16T09:00:05.000"
    }
  ]
}
Notice the "StreamViewType": "NEW_AND_OLD_IMAGES" which we selected in an earlier step.
On further inspection, you will find the key-value pairs for NewImage and OldImage.
Comparing them tells you that the Stocks value for APPLE#WASHINGTON went down from 33 to 32, while the other attribute, Price, remained unchanged.
A perfect way to know when your favorite chair is back in stock, from 0 to any better number!
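To close the loop on the chair use case, here is a minimal sketch (not from the article) of how a handler could compare OldImage and NewImage to detect a back-in-stock transition before, say, sending an email:

```javascript
// Sketch: detect a back-in-stock transition (stock going from 0 to > 0)
// by comparing the old and new images of a NEW_AND_OLD_IMAGES record.
function isBackInStock(record) {
  const oldStock = Number(record.dynamodb.OldImage?.Stocks?.N ?? 0);
  const newStock = Number(record.dynamodb.NewImage?.Stocks?.N ?? 0);
  return oldStock === 0 && newStock > 0;
}

// Hypothetical MODIFY record: the chair goes from 0 to 25 in stock.
const record = {
  eventName: "MODIFY",
  dynamodb: {
    OldImage: { Stocks: { N: "0" } },
    NewImage: { Stocks: { N: "25" } },
  },
};
console.log(isBackInStock(record)); // true
```

In a real handler you would call a check like this for each entry in event.Records and only notify when it returns true.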
Decode DynamoDB JSON using unmarshall
To spice things up, let's convert the DynamoDB JSON into plain JSON that can be read by us ;)
For that conversion, we use the unmarshall function provided by the AWS SDK (@aws-sdk/util-dynamodb).
Replace the code in your Lambda handler function (index.mjs) with the revised version below, which will just show us the pretty NewImage (the value after modifying the DynamoDB table item):
import { unmarshall } from "@aws-sdk/util-dynamodb";

export const handler = async (event) => {
  event.Records.forEach((record) => {
    // NewImage is the item's state after the change, in DynamoDB JSON.
    const item = record.dynamodb.NewImage;
    console.log(unmarshall(item));
  });
};
After unmarshall, you'll get this little JSON instead of DynamoDB JSON:
{
  Price: { USD: 0.8, INR: 50 },
  Stocks: 69,
  SK: 'APPLE#WASHINGTON',
  PK: 'FRUIT'
}
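If you're curious what unmarshall is doing under the hood, here is a deliberately simplified re-implementation for intuition only. The real @aws-sdk/util-dynamodb version handles many more types (L, BOOL, NULL, SS, NS, B, ...); this sketch covers just S, N and M:

```javascript
// Simplified sketch of unmarshall: strip the type tags from a
// DynamoDB JSON image. Handles only S (string), N (number) and M (map).
function miniUnmarshall(image) {
  const out = {};
  for (const [name, typed] of Object.entries(image)) {
    const [type, value] = Object.entries(typed)[0];
    if (type === "S") out[name] = value;                     // string as-is
    else if (type === "N") out[name] = Number(value);        // numeric string -> number
    else if (type === "M") out[name] = miniUnmarshall(value); // recurse into maps
  }
  return out;
}

const plain = miniUnmarshall({
  PK: { S: "FRUIT" },
  Stocks: { N: "33" },
  Price: { M: { USD: { N: "0.8" }, INR: { N: "50" } } },
});
console.log(plain); // { PK: 'FRUIT', Stocks: 33, Price: { USD: 0.8, INR: 50 } }
```

In your Lambda, stick with the SDK's unmarshall; this is just to demystify the conversion.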
Pat yourself on the back if you could do this!
AWS is fascinating, sometimes hard, but always rewarding.
DynamoDB streams are an awesome way to skip manually diffing your data and do it the prettier way.
Just turn it on → create a Lambda trigger → give it the DynamoDB permissions → take action.
You can even use it with a DynamoDB TTL for subscription-based models (send emails when the payment is due).
Now, whatâs stopping you from tracking down your favorite things in life?
You can set up an SNS topic to get notified about that Amazon Prime series you left unfinished when the show says it is Leaving Soon.
With DynamoDB streams, you can break free from the chains and live your life without worrying about missing a thing.
The possibilities are endless.
Finish your coffee, get up from that chair you've been sitting in for so long, and automate the things you don't want to keep refreshing a page for.
Stream.