Parsing & Loading Data from S3 to DynamoDB with a Lambda Function
Many scenarios require you to work with data formatted as JSON: you extract and process the data, then save it into a table for future use.
In this article, we will discuss loading JSON-formatted data from an S3 bucket into a DynamoDB table using a Lambda function.
The architecture diagram below shows the three AWS services we are using.
As a quick refresher, a brief description of each service:
Amazon S3: a scalable object storage service, used here to receive the uploaded JSON files.
AWS Lambda: a serverless compute service that runs our parsing code in response to the S3 upload event.
Amazon DynamoDB: a managed NoSQL key-value database where the parsed records are stored.
We will walk through the steps and configuration needed to deploy the above diagram.
1- Create a Lambda function with the below configuration:
Author from Scratch
Function Name: ParserDemo
Runtime: Python 3.1x (e.g., Python 3.12)
Leave the rest as default
After the Lambda function is created, you will need to modify the timeout configuration and execution role as below:
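Both changes can be made in the console, but as a sketch, here is how they could be scripted with boto3. The role name, policy name, and ARNs below are placeholder assumptions; substitute your own values:

import json
import boto3

lambda_client = boto3.client('lambda')
iam = boto3.client('iam')

# Raise the timeout so larger files can be processed (the default is 3 seconds)
lambda_client.update_function_configuration(
    FunctionName='ParserDemo',
    Timeout=30,
)

# Minimal permissions for the execution role: read the uploaded object from S3
# and write items into the DynamoDB table
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "s3:GetObject",
         "Resource": "arn:aws:s3:::your-unique-bucket-name/*"},  # placeholder bucket
        {"Effect": "Allow", "Action": "dynamodb:PutItem",
         "Resource": "arn:aws:dynamodb:*:*:table/DemoTable"},
    ],
}
iam.put_role_policy(
    RoleName='ParserDemo-role',  # placeholder: the role attached to the function
    PolicyName='ParserDemoS3DynamoAccess',
    PolicyDocument=json.dumps(policy),
)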
I wrote this Python code to perform the logic:
import json
import boto3
from decimal import Decimal

s3_client = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')

def lambda_handler(event, context):
    # Get the bucket name and object key from the event triggered by S3
    bucket_name = event['Records'][0]['s3']['bucket']['name']
    object_key = event['Records'][0]['s3']['object']['key']
    print(f"Bucket: {bucket_name}, Key: {object_key}")

    response = s3_client.get_object(Bucket=bucket_name, Key=object_key)

    json_data = response['Body'].read()           # Read the streamed body as bytes
    string_formatted = json_data.decode('UTF-8')  # Convert the bytes into a string
    # Convert the string into a Python object; parse floats as Decimal,
    # since the DynamoDB resource does not accept Python floats
    dict_format_data = json.loads(string_formatted, parse_float=Decimal)

    # Insert the data into DynamoDB
    table = dynamodb.Table('DemoTable')
    if isinstance(dict_format_data, list):        # The file contains multiple records
        for record in dict_format_data:
            table.put_item(Item=record)
    elif isinstance(dict_format_data, dict):      # The file contains a single record
        table.put_item(Item=dict_format_data)
    else:
        raise ValueError("Unsupported format")    # Raise an error if nothing matched
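For illustration, a hypothetical users.json file this handler can ingest; note that every record must include the table's UserId partition key:

[
  {"UserId": "u001", "Name": "Alice", "Score": 91.5},
  {"UserId": "u002", "Name": "Bob", "Score": 78.0}
]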
2- Create an S3 bucket
Bucket Name: use a globally unique name
Leave the rest of the configuration as default
Add the created S3 bucket as a trigger to the Lambda function, as below:
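When a file is uploaded, S3 invokes the function with an event like the one below, trimmed to the fields the handler actually reads (the bucket name and key are example values):

{
  "Records": [
    {
      "s3": {
        "bucket": { "name": "your-unique-bucket-name" },
        "object": { "key": "users.json" }
      }
    }
  ]
}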
3- Create a table in DynamoDB with the below configuration:
Table Name: DemoTable
Partition Key: UserId
Table Settings: Customized
Capacity Mode: Provisioned
To save costs, configure the provisioned read/write capacity units with a low value (1 or 2 units).
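If you prefer to script this step, a minimal boto3 sketch (assuming UserId is a string attribute):

import boto3

dynamodb = boto3.client('dynamodb')

# Create the table with the UserId partition key and low provisioned capacity
dynamodb.create_table(
    TableName='DemoTable',
    AttributeDefinitions=[{'AttributeName': 'UserId', 'AttributeType': 'S'}],
    KeySchema=[{'AttributeName': 'UserId', 'KeyType': 'HASH'}],
    ProvisionedThroughput={'ReadCapacityUnits': 1, 'WriteCapacityUnits': 1},
)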
Now the setup is ready. You can test it by uploading a file to the S3 bucket; you will then find the records from the file created as items in the DynamoDB table.
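As a quick end-to-end check from your machine (reusing the hypothetical users.json and placeholder bucket name from above):

import boto3

# Uploading the file fires the S3 trigger, which invokes the Lambda function
s3 = boto3.client('s3')
s3.upload_file('users.json', 'your-unique-bucket-name', 'users.json')

# After a few seconds, verify that the records landed in the table
table = boto3.resource('dynamodb').Table('DemoTable')
print(table.scan()['Items'])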
CloudWatch Logs for Lambda Function
DynamoDB Items
I hope you found this interesting; please let me know if you have any comments.