Amazon Kinesis Data Streams Hands-On

Demo

In this demo, we will:

  1. Create an Amazon Kinesis Data Stream.
  2. Set up a Lambda function to generate sample data.
  3. Create a DynamoDB table for storing processed data.
  4. Create a Lambda function to process Kinesis data and store in DynamoDB.
  5. Run the 1st Lambda function to generate sample data, which is sent to the Kinesis Data Stream.
  6. The Kinesis Data Stream triggers the 2nd Lambda function, which stores the data in DynamoDB.
  7. Clean up resources.

Agenda

Demo Overview

Create data stream

demo-stream

Data stream capacity

Data stream settings

Create data stream
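
The demo creates the stream in the console, but if you prefer doing it from code, the following is a minimal boto3 sketch (assuming on-demand capacity mode):

import boto3

kinesis = boto3.client('kinesis')

# Create the stream in on-demand capacity mode (no shard planning needed).
kinesis.create_stream(
    StreamName='demo-stream',
    StreamModeDetails={'StreamMode': 'ON_DEMAND'}
)

# Wait until the stream is ACTIVE before writing to it.
kinesis.get_waiter('stream_exists').wait(StreamName='demo-stream')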

Create 1st Lambda function

KinesisDataGenerator

Advanced settings

import json
import boto3
import random
import time

def lambda_handler(event, context):
    kinesis = boto3.client('kinesis')
    
    for _ in range(100):  # Generate 100 records
        data = {
            'sensor_id': random.randint(1, 10),
            'temperature': round(random.uniform(20, 30), 2),
            'humidity': round(random.uniform(30, 70), 2),
            'timestamp': int(time.time())
        }
        
        response = kinesis.put_record(
            StreamName='demo-stream',
            Data=json.dumps(data),
            PartitionKey=str(data['sensor_id'])
        )
        
        print(f"Put record in stream: {data}")
        time.sleep(0.1)  # Wait for 0.1 second before sending next record

    return {
        'statusCode': 200,
        'body': json.dumps('Data generation complete')
    }

Code source

Edit Timeout (the generator sleeps 0.1 seconds per record, so it runs for 10+ seconds; raise the timeout above the 3-second default, e.g. to 1 minute)

Assign Necessary Permissions to Lambda Role 

AmazonKinesisFullAccess
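
If you want to attach the policy from code rather than the IAM console, a sketch like the following works (the role name is a placeholder; use the execution role created for the KinesisDataGenerator function):

import boto3

iam = boto3.client('iam')

# Attach the AWS managed Kinesis policy to the function's execution role.
# 'KinesisDataGenerator-role' is a placeholder name.
iam.attach_role_policy(
    RoleName='KinesisDataGenerator-role',
    PolicyArn='arn:aws:iam::aws:policy/AmazonKinesisFullAccess'
)

For a production setup, a least-privilege inline policy scoped to kinesis:PutRecord on demo-stream would be preferable; the demo uses the managed policy for simplicity.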

Create DynamoDB table

KinesisProcessedData
Partition key: sensor_id (Number)
Sort key: timestamp (Number)

Table settings

Create table
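
Console table creation is what the demo shows; for reference, an equivalent boto3 sketch (assuming on-demand billing) looks like this:

import boto3

dynamodb = boto3.client('dynamodb')

# sensor_id (partition key) and timestamp (sort key) match the attributes
# written by the processing Lambda.
dynamodb.create_table(
    TableName='KinesisProcessedData',
    AttributeDefinitions=[
        {'AttributeName': 'sensor_id', 'AttributeType': 'N'},
        {'AttributeName': 'timestamp', 'AttributeType': 'N'},
    ],
    KeySchema=[
        {'AttributeName': 'sensor_id', 'KeyType': 'HASH'},
        {'AttributeName': 'timestamp', 'KeyType': 'RANGE'},
    ],
    BillingMode='PAY_PER_REQUEST',
)

# Block until the table is ready.
dynamodb.get_waiter('table_exists').wait(TableName='KinesisProcessedData')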

Create 2nd Lambda function

ProcessKinesisData

Advanced settings

import json
import boto3
from decimal import Decimal
import base64

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('KinesisProcessedData')

def lambda_handler(event, context):
    for record in event['Records']:
        # Decode and load the Kinesis data
        payload = json.loads(base64.b64decode(record['kinesis']['data']).decode('utf-8'))
        
        # Convert float to Decimal for DynamoDB
        payload['temperature'] = Decimal(str(payload['temperature']))
        payload['humidity'] = Decimal(str(payload['humidity']))
        
        table.put_item(Item=payload)
        
        print(f"Processed and stored record: {payload}")

    return {
        'statusCode': 200,
        'body': json.dumps('Processing complete')
    }

Code source
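
To sanity-check the handler logic, you can hand-build an event with the same shape Kinesis delivers: each record carries a base64-encoded data blob. This is a minimal sketch (a real Kinesis record also includes fields such as partitionKey and sequenceNumber):

import base64
import json

sample = {'sensor_id': 1, 'temperature': 23.5, 'humidity': 55.0, 'timestamp': 1700000000}

# Mimic how Kinesis wraps the payload before handing it to Lambda.
test_event = {
    'Records': [
        {'kinesis': {'data': base64.b64encode(json.dumps(sample).encode('utf-8')).decode('utf-8')}}
    ]
}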

Edit the Lambda Role to attach the necessary permission policies

AmazonKinesisFullAccess
AmazonDynamoDBFullAccess

The Final Role Should Look Like the One Below

kinesis/demo-stream

Add trigger to Lambda Function
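
The console trigger wizard does this for you; under the hood it creates an event source mapping, roughly equivalent to this boto3 sketch (the StartingPosition and BatchSize values are assumptions, pick what suits the demo):

import boto3

kinesis = boto3.client('kinesis')
lambda_client = boto3.client('lambda')

# Look up the stream ARN, then map the stream to the processing function.
stream_arn = kinesis.describe_stream_summary(
    StreamName='demo-stream'
)['StreamDescriptionSummary']['StreamARN']

lambda_client.create_event_source_mapping(
    EventSourceArn=stream_arn,
    FunctionName='ProcessKinesisData',
    StartingPosition='LATEST',
    BatchSize=100,
)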

Test

Configure test event

Test-Event
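
The test event's content does not matter here, because the generator function ignores its input. If you prefer to trigger it from code instead of the console Test button, a minimal sketch:

import boto3

lambda_client = boto3.client('lambda')

# Synchronously invoke the generator; an empty JSON object is a sufficient payload.
response = lambda_client.invoke(
    FunctionName='KinesisDataGenerator',
    InvocationType='RequestResponse',
    Payload=b'{}',
)
print(response['Payload'].read().decode('utf-8'))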

Check the DynamoDB Table

Check an Individual Item 
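
Besides the console, you can verify the results with a quick boto3 check; the key values passed to get_item are examples, substitute a sensor_id/timestamp pair you see in the table:

import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('KinesisProcessedData')

# A scan is fine for a small demo table (avoid it on large tables).
for item in table.scan(Limit=10)['Items']:
    print(item)

# Fetch one specific item by its full primary key (example values).
print(table.get_item(Key={'sensor_id': 3, 'timestamp': 1700000000}).get('Item'))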

Clean Up

Delete Data Stream 

Delete Lambda Functions 

Delete DynamoDB Table

Delete log group(s)

Delete the Roles 
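
The same cleanup can be scripted with boto3 if you prefer; this is a minimal sketch using the resource names from this demo (it assumes nothing else depends on them):

import boto3

# Delete the demo resources created above.
boto3.client('kinesis').delete_stream(StreamName='demo-stream', EnforceConsumerDeletion=True)

lambda_client = boto3.client('lambda')
for fn in ('KinesisDataGenerator', 'ProcessKinesisData'):
    lambda_client.delete_function(FunctionName=fn)

boto3.client('dynamodb').delete_table(TableName='KinesisProcessedData')

# Lambda log groups follow the /aws/lambda/<function-name> naming convention.
logs = boto3.client('logs')
for fn in ('KinesisDataGenerator', 'ProcessKinesisData'):
    logs.delete_log_group(logGroupName=f'/aws/lambda/{fn}')

# Roles must have their policies detached before iam.delete_role succeeds,
# so deleting them in the IAM console is simpler for this demo.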

🙏 Thanks for Watching

Amazon Kinesis Data Streams - Hands-On Demo

By Deepak Dubey
