S3 (Simple Storage Service) Events
Amazon S3 can trigger Lambda functions in response to object-level operations such as uploads, deletions, tagging changes, and restores.
Event Structure
```python
class S3Event:
    records: List[S3Record]  # List of S3 event records

class S3Record:
    event_name: str       # Type of event (e.g., "ObjectCreated:Put")
    event_time: str       # Event timestamp
    bucket: S3Bucket      # Bucket information
    s3_object: S3Object   # Object information

class S3Bucket:
    name: str  # Bucket name
    arn: str   # Bucket ARN

class S3Object:
    key: str   # Object key (path)
    size: int  # Object size in bytes
    etag: str  # Object ETag
```
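For orientation, the mapping from a raw S3 notification payload onto these classes can be sketched with dataclasses. The library performs this deserialization itself; the `parse_record` helper and dataclass definitions below are illustrative stand-ins, not part of the API.

```python
from dataclasses import dataclass

@dataclass
class S3Bucket:
    name: str
    arn: str

@dataclass
class S3Object:
    key: str
    size: int
    etag: str

@dataclass
class S3Record:
    event_name: str
    event_time: str
    bucket: S3Bucket
    s3_object: S3Object

def parse_record(raw: dict) -> S3Record:
    # Hypothetical helper: maps the raw notification fields (eventName,
    # s3.bucket, s3.object) onto the typed record shown above.
    s3 = raw["s3"]
    return S3Record(
        event_name=raw["eventName"],
        event_time=raw.get("eventTime", ""),
        bucket=S3Bucket(
            name=s3["bucket"]["name"],
            arn=s3["bucket"].get("arn", ""),
        ),
        s3_object=S3Object(
            key=s3["object"]["key"],
            size=s3["object"].get("size", 0),
            etag=s3["object"].get("eTag", ""),
        ),
    )
```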
Usage Examples
Basic Object Processing
```python
from lambda_universal_router import Router
from lambda_universal_router.events import S3Event

router = Router()

@router.s3()
def process_uploads(event: S3Event, context):
    for record in event.records:
        print(f"Event: {record.event_name}")
        print(f"Bucket: {record.bucket.name}")
        print(f"Key: {record.s3_object.key}")
        print(f"Size: {record.s3_object.size} bytes")
```
File Type Processing
```python
@router.s3()
def process_by_type(event: S3Event, context):
    for record in event.records:
        if record.event_name.startswith('ObjectCreated:'):
            key = record.s3_object.key.lower()
            if key.endswith('.jpg') or key.endswith('.png'):
                process_image(record)
            elif key.endswith('.pdf'):
                process_document(record)
            elif key.endswith('.csv'):
                process_data(record)
```
Event-Specific Handling
```python
@router.s3()
def handle_events(event: S3Event, context):
    for record in event.records:
        if record.event_name.startswith('ObjectCreated:'):
            handle_creation(record)
        elif record.event_name.startswith('ObjectRemoved:'):
            handle_deletion(record)
        elif record.event_name.startswith('ObjectTagging:'):
            handle_tagging(record)
```
Multi-Bucket Processing
```python
@router.s3()
def process_multi_bucket(event: S3Event, context):
    for record in event.records:
        bucket_arn = record.bucket.arn
        if bucket_arn.startswith('arn:aws:s3:::prod-'):
            process_prod_object(record)
        else:
            process_dev_object(record)
```
Event Examples
- Object Created Event
```json
{
  "Records": [
    {
      "eventVersion": "2.1",
      "eventSource": "aws:s3",
      "awsRegion": "us-east-1",
      "eventTime": "2024-03-17T12:00:00.000Z",
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": {
          "name": "my-bucket",
          "arn": "arn:aws:s3:::my-bucket"
        },
        "object": {
          "key": "uploads/image.jpg",
          "size": 1024,
          "eTag": "d41d8cd98f00b204e9800998ecf8427e"
        }
      }
    }
  ]
}
```
- Object Deleted Event
```json
{
  "Records": [
    {
      "eventName": "ObjectRemoved:Delete",
      "s3": {
        "bucket": {
          "name": "my-bucket"
        },
        "object": {
          "key": "path/to/file.txt"
        }
      }
    }
  ]
}
```
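One caveat with these payloads: S3 URL-encodes object keys in notification events (spaces become `+`, special characters become percent escapes). If the router passes keys through undecoded, which is worth verifying for your version, decode them with the standard library before using them as paths:

```python
from urllib.parse import unquote_plus

# Example raw key as it may arrive in an event payload
raw_key = "uploads/my+photo%281%29.jpg"

# Decode "+" to a space and %XX escapes to their characters
key = unquote_plus(raw_key)
print(key)  # uploads/my photo(1).jpg
```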
Common Event Types
- Object Created Events
  - ObjectCreated:Put
  - ObjectCreated:Post
  - ObjectCreated:Copy
  - ObjectCreated:CompleteMultipartUpload
- Object Removed Events
  - ObjectRemoved:Delete
  - ObjectRemoved:DeleteMarkerCreated
- Object Tagging Events
  - ObjectTagging:Put
  - ObjectTagging:Delete
- Object Restore Events
  - ObjectRestore:Post
  - ObjectRestore:Completed
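These event families can be routed with a simple prefix table rather than a chain of `if`/`elif` checks. A minimal sketch, where the handler functions are illustrative placeholders rather than part of the library:

```python
# Placeholder handlers for each event family
def handle_creation(record): return "created"
def handle_removal(record): return "removed"
def handle_tagging(record): return "tagged"
def handle_restore(record): return "restored"

# Map event-name prefixes to their handlers
HANDLERS = {
    "ObjectCreated:": handle_creation,
    "ObjectRemoved:": handle_removal,
    "ObjectTagging:": handle_tagging,
    "ObjectRestore:": handle_restore,
}

def dispatch(event_name, record):
    for prefix, handler in HANDLERS.items():
        if event_name.startswith(prefix):
            return handler(record)
    return None  # unrecognized event type
```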
Best Practices
- Error Handling
```python
@router.s3()
def safe_process(event: S3Event, context):
    for record in event.records:
        try:
            process_object(record)
        except ObjectNotFoundError:
            # Object was deleted before processing
            log_missing_object(record)
        except Exception as e:
            # Unexpected error
            log_error(record, e)
            raise
```
- Large Object Handling
```python
@router.s3()
def handle_large_files(event: S3Event, context):
    for record in event.records:
        if record.s3_object.size > 100_000_000:  # 100 MB
            # Queue large file for async processing
            queue_large_file(record)
        else:
            process_small_file(record)
```
- Monitoring and Logging
```python
@router.s3()
def monitored_process(event: S3Event, context):
    metrics = {"processed": 0, "total_size": 0, "errors": 0}
    for record in event.records:
        try:
            process_object(record)
            metrics["processed"] += 1
            metrics["total_size"] += record.s3_object.size
        except Exception:
            metrics["errors"] += 1
    log_metrics(metrics)
```
Configuration Tips
- S3 Bucket Settings
  - Configure appropriate event types
  - Set up bucket notifications
  - Consider versioning for critical data
- Lambda Settings
  - Set appropriate memory and timeout
  - Configure concurrent executions
  - Enable X-Ray tracing if needed
- Security
  - Use least privilege permissions
  - Encrypt sensitive data
  - Validate file types and sizes
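As a sketch of the bucket-notification setup above, this is the shape of the configuration document that boto3's `put_bucket_notification_configuration` expects. The bucket name, function ARN, and prefix filter are placeholder values, not values from this library:

```python
# Notification configuration: invoke a Lambda function for all
# object-creation events under the "uploads/" prefix.
notification_config = {
    "LambdaFunctionConfigurations": [
        {
            "LambdaFunctionArn": (
                "arn:aws:lambda:us-east-1:123456789012:"
                "function:process_uploads"  # placeholder ARN
            ),
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {
                "Key": {
                    "FilterRules": [
                        {"Name": "prefix", "Value": "uploads/"}
                    ]
                }
            },
        }
    ]
}

# Applied with (requires AWS credentials and an existing bucket):
# import boto3
# boto3.client("s3").put_bucket_notification_configuration(
#     Bucket="my-bucket",
#     NotificationConfiguration=notification_config,
# )
```

Note that S3 must also be granted permission to invoke the function (`lambda add-permission` with principal `s3.amazonaws.com`), or the notification delivery will fail.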