Basic lambda_handler programming

Contents:
- Basic lambda_handler programming
- Basic lambda_handler programming used in AWS
- Checklist of common DynamoDB query operations
Basic lambda_handler programming
Hello World:
def lambda_handler(event, context):
    return "Hello, World!"
Output:
"Hello, World!"
Sum of Two Numbers:
def lambda_handler(event, context):
    num1 = event['num1']
    num2 = event['num2']
    result = num1 + num2
    return result
Input:
{
"num1": 5,
"num2": 10
}
Output:
15
Test the Integration:
Now, you can test the integration by sending HTTP requests to the API Gateway endpoint.
For example, using cURL:
curl -X POST -H "Content-Type: application/json" -d '{"num1": 5, "num2": 10}' https://your-api-gateway-url/your-resource
Replace https://your-api-gateway-url/your-resource with the actual URL of your API Gateway resource.
This HTTP request will trigger the Lambda function through the API Gateway, and the Lambda function will perform the addition of 'num1' and 'num2'.
Remember to configure the API Gateway and Lambda function permissions appropriately to ensure they can communicate. Also, adapt the steps based on the specifics of your use case and the event source you want to integrate with.
List Manipulation:
def lambda_handler(event, context):
    my_list = event['my_list']
    return len(my_list)
Input:
{
"my_list": [1, 2, 3, 4, 5]
}
Output:
5
String Concatenation:
def lambda_handler(event, context):
    str1 = event['str1']
    str2 = event['str2']
    return str1 + str2
Input:
{
"str1": "Hello",
"str2": "Lambda"
}
Output:
"HelloLambda"
Factorial Calculation:
def lambda_handler(event, context):
    num = event['num']
    result = 1
    for i in range(1, num + 1):
        result *= i
    return result
Input:
{
"num": 5
}
Output:
120
Check Even or Odd:
def lambda_handler(event, context):
    num = event['num']
    return "Even" if num % 2 == 0 else "Odd"
Input:
{
"num": 7
}
Output:
"Odd"
Simple Calculator:
def lambda_handler(event, context):
    num1 = event['num1']
    num2 = event['num2']
    operator = event['operator']
    if operator == 'add':
        result = num1 + num2
    elif operator == 'subtract':
        result = num1 - num2
    elif operator == 'multiply':
        result = num1 * num2
    elif operator == 'divide':
        result = num1 / num2
    else:
        result = "Invalid operator"
    return result
Input:
{
"num1": 10,
"num2": 5,
"operator": "multiply"
}
Output:
50
List Filtering:
def lambda_handler(event, context):
    my_list = event['my_list']
    filtered_list = [x for x in my_list if x % 2 == 0]
    return filtered_list
Input:
{
"my_list": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
}
Output:
[2, 4, 6, 8, 10]
File Read and Return:
import boto3

def lambda_handler(event, context):
    file_key = event['file_key']
    # Read the file from S3 (bucket name hard-coded for simplicity)
    s3 = boto3.client('s3')
    obj = s3.get_object(Bucket='your-s3-bucket', Key=file_key)
    return obj['Body'].read().decode('utf-8')
Input:
{
"file_key": "example.txt"
}
Output:
"Contents of the example.txt file"
Random Number Generator:
import random

def lambda_handler(event, context):
    return random.randint(1, 100)
Output (Random):
42
Date and Time:
from datetime import datetime

def lambda_handler(event, context):
    current_time = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    return current_time
Output:
"2023-12-03 15:30:00"
Basic lambda_handler programming used in AWS
Here's a checklist of 30 basic AWS Lambda function examples commonly used in AWS, along with example code and expected output. These examples cover a range of AWS services and use cases:
1-5: S3 Bucket Operations
List Objects in an S3 Bucket:
import boto3

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    bucket_name = event['bucket_name']
    response = s3.list_objects_v2(Bucket=bucket_name)
    # Return only JSON-serializable fields (the raw response contains datetimes)
    contents = [{'Key': obj['Key'], 'Size': obj['Size']}
                for obj in response.get('Contents', [])]
    return {'Contents': contents}
Input:
{
"bucket_name": "your-s3-bucket"
}
Output:
{
"Contents": [
{"Key": "file1.txt", "Size": 1024},
{"Key": "file2.jpg", "Size": 2048}
]
}
Upload File to S3 Bucket:
import boto3

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    bucket_name = event['bucket_name']
    file_key = event['file_key']
    file_content = event['file_content']
    s3.put_object(Bucket=bucket_name, Key=file_key, Body=file_content)
    return f"File {file_key} uploaded successfully."
Input:
{
"bucket_name": "your-s3-bucket",
"file_key": "example.txt",
"file_content": "Hello, S3!"
}
Output:
"File example.txt uploaded successfully."
S3 Object Removal:
import boto3

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    bucket_name = event['bucket_name']
    file_key = event['file_key']
    s3.delete_object(Bucket=bucket_name, Key=file_key)
    return f"File {file_key} deleted successfully."
Input:
{
"bucket_name": "your-s3-bucket",
"file_key": "example.txt"
}
Output:
"File example.txt deleted successfully."
S3 Event Trigger:
Set up an S3 event trigger in the Lambda console.
Lambda function automatically gets triggered when a new file is uploaded to the specified S3 bucket.
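When the trigger fires, the handler receives a standard S3 notification event. A minimal sketch of reading the bucket and object key out of that event (the event shape is the documented S3 notification format; note that object keys arrive URL-encoded):

```python
import urllib.parse

def lambda_handler(event, context):
    # One S3 notification can carry multiple records
    results = []
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        # Keys are URL-encoded (e.g. spaces arrive as '+')
        key = urllib.parse.unquote_plus(record['s3']['object']['key'])
        results.append(f"New object s3://{bucket}/{key}")
    return results
```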
S3 Image Processing (using Pillow):
from PIL import Image
import boto3
import os

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    bucket_name = event['bucket_name']
    file_key = event['file_key']
    # Download image from S3
    local_path = '/tmp/' + os.path.basename(file_key)
    s3.download_file(bucket_name, file_key, local_path)
    # Process image (resize)
    image = Image.open(local_path)
    resized_image = image.resize((100, 100))
    # Save locally (flat /tmp path; '/tmp/processed/' does not exist),
    # then upload under a 'processed/' prefix
    processed_key = 'processed/' + file_key
    local_path_processed = '/tmp/processed_' + os.path.basename(file_key)
    resized_image.save(local_path_processed)
    s3.upload_file(local_path_processed, bucket_name, processed_key)
    return f"Image {file_key} processed and saved as {processed_key}."
Input:
{
"bucket_name": "your-s3-bucket",
"file_key": "example.jpg"
}
Output:
"Image example.jpg processed and saved as processed/example.jpg."
6-10: DynamoDB Operations
Put Item into DynamoDB:
import boto3
def lambda_handler(event, context):
dynamodb = boto3.resource('dynamodb')
table_name = event['table_name']
item = event['item']
table = dynamodb.Table(table_name)
table.put_item(Item=item)
return f"Item {item} added to DynamoDB table {table_name}."
Input:
{
"table_name": "your-dynamodb-table",
"item": {"id": 1, "name": "John Doe", "age": 30}
}
Output:
"Item {'id': 1, 'name': 'John Doe', 'age': 30} added to DynamoDB table your-dynamodb-table."
Get Item from DynamoDB:
import boto3
def lambda_handler(event, context):
dynamodb = boto3.resource('dynamodb')
table_name = event['table_name']
key = event['key']
table = dynamodb.Table(table_name)
response = table.get_item(Key=key)
return response.get('Item', {})
Input:
{
"table_name": "your-dynamodb-table",
"key": {"id": 1}
}
Output:
{"id": 1, "name": "John Doe", "age": 30}
Update Item in DynamoDB:
import boto3
def lambda_handler(event, context):
dynamodb = boto3.resource('dynamodb')
table_name = event['table_name']
key = event['key']
update_expression = event['update_expression']
table = dynamodb.Table(table_name)
table.update_item(
Key=key,
UpdateExpression=update_expression,
ReturnValues="UPDATED_NEW"
)
return f"Item {key} updated in DynamoDB table {table_name}."
Input:
{
"table_name": "your-dynamodb-table",
"key": {"id": 1},
"update_expression": "set age = :new_age",
"expression_attribute_values": {":new_age": 31}
}
Output:
"Item {'id': 1} updated in DynamoDB table your-dynamodb-table."
Delete Item from DynamoDB:
import boto3
def lambda_handler(event, context):
dynamodb = boto3.resource('dynamodb')
table_name = event['table_name']
key = event['key']
table = dynamodb.Table(table_name)
table.delete_item(Key=key)
return f"Item {key} deleted from DynamoDB table {table_name}."
Input:
{
"table_name": "your-dynamodb-table",
"key": {"id": 1}
}
Output:
"Item {'id': 1} deleted from DynamoDB table your-dynamodb-table."
Query DynamoDB Table:
import boto3

def lambda_handler(event, context):
    # Extract parameters from the event
    table_name = event.get('table_name')
    key_condition_expression = event.get('key_condition_expression')
    if not table_name or not key_condition_expression:
        return {
            'statusCode': 400,
            'body': 'Missing required parameters: table_name and key_condition_expression'
        }
    try:
        # Create DynamoDB resource and get a reference to the table
        dynamodb = boto3.resource('dynamodb')
        table = dynamodb.Table(table_name)
        # Build the query parameters; a string KeyConditionExpression
        # (e.g. "id = :id") needs matching ExpressionAttributeValues
        query_params = {
            'KeyConditionExpression': key_condition_expression,
        }
        if 'expression_attribute_values' in event:
            query_params['ExpressionAttributeValues'] = event['expression_attribute_values']
        # Query the DynamoDB table
        response = table.query(**query_params)
        # Extract and return the items from the response
        items = response.get('Items', [])
        return {
            'statusCode': 200,
            'body': {
                'items': items,
                'count': len(items)
            }
        }
    except Exception as e:
        return {
            'statusCode': 500,
            'body': f'Error: {str(e)}'
        }
11-15: API Gateway and Lambda Integration
Simple API Gateway Integration:
Create an API Gateway and integrate it with a Lambda function.
Define a resource and method (e.g., GET).
Test the API endpoint.
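With Lambda proxy integration, API Gateway expects the handler to return a dict with `statusCode`, `headers`, and a string `body`. A minimal sketch of a handler for the GET method above (the message content is illustrative):

```python
import json

def lambda_handler(event, context):
    # API Gateway proxy integration requires this response shape
    return {
        'statusCode': 200,
        'headers': {'Content-Type': 'application/json'},
        'body': json.dumps({'message': 'Hello from Lambda!'})
    }
```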
API Gateway with Request Parameters:
Create an API Gateway with a resource that accepts query parameters.
Access parameters in the Lambda function.
def lambda_handler(event, context):
    name = event['queryStringParameters']['name']
    return f"Hello, {name}!"
Input:
GET /greet?name=John
Output:
"Hello, John!"
API Gateway POST Request Handling:
Create an API Gateway with a resource that accepts POST requests.
Access the request body in the Lambda function.
import json

def lambda_handler(event, context):
    data = json.loads(event['body'])
    return f"Received data: {data}"
Input:
POST /submit
Body: {"key": "value"}
Output:
"Received data: {'key': 'value'}"
Secure API Gateway with API Key:
Configure API Gateway with an API key.
Test that the API key is required for access.
API Gateway Authorization:
Implement custom authorization in your Lambda function.
Authenticate users based on headers or tokens.
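A token-based Lambda authorizer receives the header value in `event['authorizationToken']` and must return an IAM policy document. A minimal sketch, where the `'allow-me'` token check is a placeholder for real validation (JWT verification, database lookup, etc.):

```python
def generate_policy(principal_id, effect, resource):
    # Minimal IAM policy document in the shape API Gateway expects
    return {
        'principalId': principal_id,
        'policyDocument': {
            'Version': '2012-10-17',
            'Statement': [{
                'Action': 'execute-api:Invoke',
                'Effect': effect,
                'Resource': resource
            }]
        }
    }

def lambda_handler(event, context):
    # 'allow-me' is a placeholder check; replace with real token validation
    token = event.get('authorizationToken', '')
    effect = 'Allow' if token == 'allow-me' else 'Deny'
    return generate_policy('user', effect, event['methodArn'])
```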
16-20: Other AWS Service Integrations
Send SNS Notification:
import boto3

def lambda_handler(event, context):
    sns = boto3.client('sns')
    topic_arn = 'your-sns-topic-arn'
    message = 'This is a notification from Lambda!'
    sns.publish(TopicArn=topic_arn, Message=message)
    return "Notification sent successfully."
Output:
"Notification sent successfully."
Invoke Another Lambda Function:
import boto3
import json

def lambda_handler(event, context):
    lambda_client = boto3.client('lambda')
    response = lambda_client.invoke(
        FunctionName='another-lambda-function',
        InvocationType='RequestResponse',
        Payload=json.dumps(event)
    )
    return json.loads(response['Payload'].read())
Output:
Result from another Lambda function.
Read from SQS Queue:
import boto3

def lambda_handler(event, context):
    sqs = boto3.client('sqs')
    queue_url = 'your-sqs-queue-url'
    response = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1)
    # The 'Messages' key is absent when the queue is empty
    messages = response.get('Messages', [])
    if not messages:
        return "No messages available."
    message_body = messages[0]['Body']
    return f"Received message: {message_body}"
Output:
"Received message: Your SQS message."
Write to Kinesis Stream:
import boto3
import json

def lambda_handler(event, context):
    kinesis = boto3.client('kinesis')
    stream_name = 'your-kinesis-stream'
    data = json.dumps({"key": "value"})
    kinesis.put_record(StreamName=stream_name, Data=data, PartitionKey='key')
    return "Record added to Kinesis stream."
Output:
"Record added to Kinesis stream."
Invoke Step Functions:
import boto3
import json

def lambda_handler(event, context):
    stepfunctions = boto3.client('stepfunctions')
    state_machine_arn = 'your-step-functions-arn'
    response = stepfunctions.start_execution(
        stateMachineArn=state_machine_arn,
        name='execution-name',
        input=json.dumps(event)
    )
    return f"Step Functions execution started with ID: {response['executionArn']}"
Output:
"Step Functions execution started with ID: your-execution-id"
21-25: CloudWatch Events and Logs
Scheduled CloudWatch Event:
Set up a CloudWatch Events rule to trigger the Lambda function on a schedule.
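The scheduled event delivered to the function carries an ISO-8601 `time` field identifying when the rule fired. A minimal handler sketch (the return value is illustrative; real handlers would do the scheduled work here):

```python
def lambda_handler(event, context):
    # Scheduled events include a 'time' field and a 'detail-type'
    # of 'Scheduled Event'; other fields identify the rule that fired
    fired_at = event.get('time', 'unknown')
    return f"Scheduled run at {fired_at}"
```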
CloudWatch Logs Processing:
Configure a CloudWatch Logs group to trigger the Lambda function when new logs are added.
Process and analyze log data.
import base64
import gzip
import json

def lambda_handler(event, context):
    # CloudWatch Logs subscriptions deliver a base64-encoded,
    # gzip-compressed payload under event['awslogs']['data']
    payload = gzip.decompress(base64.b64decode(event['awslogs']['data']))
    log_data = json.loads(payload)
    for log_event in log_data['logEvents']:
        # Process each log event
        pass
    return "Log processing complete."
CloudWatch Metric Aggregation:
Aggregate CloudWatch metrics using the Lambda function.
Calculate metrics and store results.
import boto3

def lambda_handler(event, context):
    cloudwatch = boto3.client('cloudwatch')
    metric_name = 'YourMetric'
    namespace = 'YourNamespace'
    period = 300  # 5 minutes
    response = cloudwatch.get_metric_statistics(
        Namespace=namespace,
        MetricName=metric_name,
        StartTime='2023-01-01T00:00:00Z',
        EndTime='2023-01-02T00:00:00Z',
        Period=period,
        Statistics=['Sum']
    )
    return response['Datapoints']
CloudWatch Logs to S3 Archiving:
Create a Lambda function to archive CloudWatch Logs to an S3 bucket.
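One way to sketch this: decode the subscription payload (same base64 + gzip format as above) and write the log events to S3. The bucket name and key scheme are assumptions for illustration:

```python
import base64
import gzip
import json

def decode_log_payload(data):
    # Subscription payloads arrive base64-encoded and gzip-compressed
    return json.loads(gzip.decompress(base64.b64decode(data)))

def lambda_handler(event, context):
    import boto3  # imported here so the decode helper stays dependency-free
    log_data = decode_log_payload(event['awslogs']['data'])
    key = f"archive/{log_data['logGroup']}/{log_data['logStream']}.json"
    s3 = boto3.client('s3')
    # 'your-archive-bucket' is a placeholder bucket name
    s3.put_object(Bucket='your-archive-bucket', Key=key,
                  Body=json.dumps(log_data['logEvents']))
    return f"Archived {len(log_data['logEvents'])} log events to {key}"
```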
26-30: AWS SDK and CLI Integrations
AWS CLI Command Execution:
Execute AWS CLI commands from within a Lambda function.
import subprocess

def lambda_handler(event, context):
    # Note: the AWS CLI is not included in the standard Lambda runtimes;
    # it must be packaged with the deployment artifact or a Lambda layer.
    result = subprocess.run(['aws', 's3', 'ls'], capture_output=True, text=True)
    return result.stdout
Boto3 SDK Usage:
Use the Boto3 SDK to interact with various AWS services.
import boto3

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    response = s3.list_buckets()
    return response['Buckets']
AWS SDK Error Handling:
Implement error handling for AWS SDK calls within the Lambda function.
import boto3
from botocore.exceptions import ClientError

def lambda_handler(event, context):
    try:
        # Example AWS SDK operation
        s3 = boto3.client('s3')
        return s3.list_buckets()['Buckets']
    except ClientError as e:
        return f"Error: {e}"
Invoke Lambda Function from Another AWS Account:
Allow cross-account Lambda function invocation.
Lambda Function as AWS CLI Command Alias:
Create an AWS CLI command alias that maps to a Lambda function.
alias mylambda='aws lambda invoke --function-name MyLambdaFunction --log-type Tail /tmp/out.json'
Checklist of common DynamoDB query operations
Here's a concise checklist of common DynamoDB operations using the AWS SDK for Python (Boto3), each with a one-line example:
Create Table:
dynamodb.create_table(...)
Put Item (Insert):
table.put_item(Item={'ID': 123, 'Name': 'John Doe'})
Get Item (Retrieve):
response = table.get_item(Key={'ID': 123})
Update Item:
table.update_item(Key={'ID': 123}, UpdateExpression='SET #attrName = :attrValue', ExpressionAttributeNames={'#attrName': 'Name'}, ExpressionAttributeValues={':attrValue': 'Jane Doe'})
Delete Item:
table.delete_item(Key={'ID': 123})
Query (with Index):
response = table.query(IndexName='MyIndex', KeyConditionExpression='AttributeName = :value', ExpressionAttributeValues={':value': 'SomeValue'})
Scan (Full Table Scan):
response = table.scan()
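A single scan call returns at most 1 MB of data, so full-table scans must follow `LastEvaluatedKey` until it disappears. A sketch of that loop, written against a scan callable (pass `table.scan` in practice) so it stays testable without a live table:

```python
def scan_all(scan_fn):
    # Follows LastEvaluatedKey until the scan is exhausted;
    # scan_fn is table.scan or any callable with the same contract
    items, kwargs = [], {}
    while True:
        page = scan_fn(**kwargs)
        items.extend(page.get('Items', []))
        last_key = page.get('LastEvaluatedKey')
        if not last_key:
            return items
        kwargs['ExclusiveStartKey'] = last_key
```

Usage: `all_items = scan_all(table.scan)`.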
Batch Write:
with table.batch_writer() as batch: batch.put_item(Item={'ID': 123, 'Name': 'John Doe'})
Batch Get:
response = dynamodb.batch_get_item(RequestItems={'MyTable': {'Keys': [{'ID': 123}, {'ID': 456}]}})
Atomic Counters:
table.update_item(Key={'ID': 123}, UpdateExpression='SET #attrName = #attrName + :attrValue', ExpressionAttributeNames={'#attrName': 'Counter'}, ExpressionAttributeValues={':attrValue': 1})
Conditional Writes:
table.put_item(Item={'ID': 123, 'Name': 'John Doe'}, ConditionExpression='attribute_not_exists(ID)')
Transactions:
dynamodb.meta.client.transact_write_items(TransactItems=[{'Put': {'TableName': 'MyTable', 'Item': {'ID': {'N': '123'}, 'Name': {'S': 'John Doe'}}}}])
Time to Live (TTL):
table.update_time_to_live(TimeToLiveSpecification={'Enabled': True, 'AttributeName': 'ExpireTime'})
Global Secondary Index (GSI):
table.update(AttributeDefinitions=[{'AttributeName': 'Name', 'AttributeType': 'S'}], GlobalSecondaryIndexUpdates=[{'Create': {'IndexName': 'MyGSI', 'KeySchema': [{'AttributeName': 'Name', 'KeyType': 'HASH'}], 'Projection': {'ProjectionType': 'ALL'}, 'ProvisionedThroughput': {'ReadCapacityUnits': 5, 'WriteCapacityUnits': 5}}}])
Local Secondary Index (LSI):
LSIs cannot be added to an existing table; define them at creation time via dynamodb.create_table(..., LocalSecondaryIndexes=[...]).
DAX (DynamoDB Accelerator):
Use DAX for caching to accelerate read operations (configuration is not done through Boto3).
PartiQL:
dynamodb.execute_statement(Statement="SELECT * FROM MyTable WHERE ID = 123")
Streams and Triggers:
Set up DynamoDB Streams and triggers for real-time event processing (configuration is not done through Boto3).
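While the stream itself is configured outside Boto3, the consuming Lambda receives records in the DynamoDB Streams format, where each record's `eventName` is INSERT, MODIFY, or REMOVE. A small sketch that tallies event types:

```python
from collections import Counter

def lambda_handler(event, context):
    # Each stream record's eventName is INSERT, MODIFY, or REMOVE
    counts = Counter(record['eventName'] for record in event['Records'])
    return dict(counts)
```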
Backup and Restore:
dynamodb.create_backup(TableName='MyTable', BackupName='MyBackup')
dynamodb.restore_table_from_backup(TargetTableName='MyRestoredTable', BackupArn='arn:aws:dynamodb:region:account-id:table/MyTable/backup/backup-id')
Cross-Region Replication:
Enable global tables for cross-region replication (configuration is not done through Boto3).
Security:
Implement fine-grained access control using IAM roles and policies (configuration is not done through Boto3).
Metrics and Monitoring:
Monitor DynamoDB metrics using CloudWatch (configuration is not done through Boto3).
Encryption:
Enable server-side encryption for data at rest (configuration is not done through Boto3).
This checklist provides a quick reference for various DynamoDB operations using the AWS SDK for Python (Boto3). Adjust examples based on your specific requirements and configurations. Always refer to the Boto3 documentation for the most up-to-date and detailed information.