
AWS Lambda Guide: Serverless Functions, API Gateway, DynamoDB, Step Functions & Performance Tuning

20 min read · by DevToolBox Team

AWS Lambda lets you run code without provisioning or managing servers. You pay only for the compute time you consume, and Lambda scales automatically from a few requests per day to thousands per second. This guide covers Lambda fundamentals, handler patterns in Node.js and Python, API Gateway integration, event sources, layers, DynamoDB operations, Step Functions, performance tuning, testing, and monitoring.

TL;DR: AWS Lambda is a serverless compute service that runs your code in response to events. It supports Node.js, Python, Java, Go, .NET, and custom runtimes. Key concepts: handler functions process events, cold starts occur on new containers, API Gateway provides HTTP endpoints, and event sources like S3, DynamoDB Streams, and SQS trigger functions automatically. Use provisioned concurrency for latency-sensitive workloads, Lambda Layers for shared dependencies, and SAM for infrastructure as code.

Key Takeaways

  • Lambda charges per request and per GB-second of compute. The free tier includes 1M requests and 400,000 GB-seconds per month.
  • Cold starts range from 100ms (Node.js/Python) to 1-2s (Java/.NET). Use provisioned concurrency to eliminate them for critical paths.
  • Initialize SDK clients and database connections outside the handler to reuse them across warm invocations.
  • API Gateway + Lambda is the most common pattern for building serverless REST APIs.
  • Lambda Layers let you share code and dependencies across multiple functions without bundling them in each deployment.
  • Use AWS SAM or CDK for infrastructure as code. SAM extends CloudFormation with serverless-specific shortcuts.

Lambda Fundamentals

Lambda runs your code in short-lived containers managed by AWS. When a request arrives, Lambda either reuses a warm container or creates a new one (cold start). The execution environment persists between invocations, so variables declared outside the handler are reused.

Execution Model

Each Lambda function has an init phase (cold start) and an invoke phase. During init, Lambda downloads your code, starts the runtime, and runs initialization code. The invoke phase executes your handler. Cold starts add 100ms to 2s of latency depending on runtime and package size.

// Lambda execution lifecycle
// INIT PHASE (cold start only)
const { DynamoDBClient } = require("@aws-sdk/client-dynamodb");
const client = new DynamoDBClient({}); // reused across invocations

// INVOKE PHASE (every request)
exports.handler = async (event, context) => {
  console.log("Request ID:", context.awsRequestId);
  console.log("Time left:", context.getRemainingTimeInMillis(), "ms");
  return { statusCode: 200, body: "OK" };
};

Cold Start Factors

Factor        | Impact                                            | Mitigation
Runtime       | Node.js/Python: ~100-200ms, Java: ~1-2s           | Choose lightweight runtimes
Package size  | Larger packages increase download time            | Tree-shake, use layers
VPC           | VPC-attached adds 1-5s (improved with Hyperplane) | Avoid VPC unless needed
Memory        | More memory = more CPU = faster init              | Allocate 512MB+ for faster starts

Lambda with Node.js

Node.js is the most popular Lambda runtime. The handler receives an event object and a context object. Return a value or throw an error.

// index.mjs (ES module handler)
export const handler = async (event, context) => {
  const { httpMethod, path, body, queryStringParameters } = event;

  try {
    const data = JSON.parse(body || "{}");
    return {
      statusCode: 200,
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ message: "Success", input: data }),
    };
  } catch (err) {
    return { statusCode: 500, body: JSON.stringify({ error: err.message }) };
  }
};

Lambda with Python

Python is the second most popular Lambda runtime. The handler pattern is similar to Node.js but uses Python conventions.

# lambda_function.py
import json
import boto3

# Init outside handler (reused on warm starts)
s3 = boto3.client("s3")

def handler(event, context):
    # event.get("body") can be None, so fall back to "{}" explicitly
    body = json.loads(event.get("body") or "{}")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": "Hello", "input": body})
    }

API Gateway Integration

API Gateway provides HTTP endpoints that trigger Lambda functions. It handles routing, request validation, authorization, throttling, and CORS. The most common setup is a REST API with Lambda proxy integration.

REST API Configuration

With Lambda proxy integration, API Gateway passes the entire HTTP request to Lambda and expects a specific response format.

// API Gateway Lambda Proxy - event structure
// event.httpMethod    -> "GET", "POST", etc.
// event.path          -> "/users/123"
// event.headers       -> { "Content-Type": "..." }
// event.queryStringParameters -> { "page": "1" }
// event.body          -> "{...}" (string)

exports.handler = async (event) => {
  if (event.httpMethod === "GET" && event.path === "/users") {
    return {
      statusCode: 200,
      headers: {
        "Content-Type": "application/json",
        "Access-Control-Allow-Origin": "*"
      },
      body: JSON.stringify([{ id: 1, name: "Alice" }])
    };
  }
  return { statusCode: 404, body: "Not Found" };
};

Event Sources

Lambda can be triggered by many AWS services. Each event source provides a different event structure.

S3 Events

Trigger Lambda when objects are created, modified, or deleted in an S3 bucket.

// S3 event handler
exports.handler = async (event) => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    // S3 event keys are URL-encoded, with spaces encoded as "+"
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));
    console.log("New object:", bucket, key);
    // Process the uploaded file
  }
};

DynamoDB Streams

Process changes to DynamoDB tables in near real-time. Each stream record can contain the old and/or new item images, depending on the table's stream view type.

// DynamoDB Streams handler
exports.handler = async (event) => {
  for (const record of event.Records) {
    const { eventName, dynamodb } = record;
    if (eventName === "INSERT") {
      console.log("New item:", JSON.stringify(dynamodb.NewImage));
    } else if (eventName === "MODIFY") {
      console.log("Updated:", JSON.stringify(dynamodb.NewImage));
    }
  }
};
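
Note that `NewImage` arrives in DynamoDB JSON (e.g. `{ name: { S: "Alice" } }`), not plain objects. The SDK ships `unmarshall()` in `@aws-sdk/util-dynamodb` for this; a simplified sketch covering only string, number, and boolean attributes looks like:

```javascript
// Convert DynamoDB JSON (S/N/BOOL attributes only) to a plain object.
// For production code, prefer unmarshall() from @aws-sdk/util-dynamodb,
// which handles all attribute types including lists, maps, and sets.
function unmarshallSimple(image) {
  const out = {};
  for (const [attr, typed] of Object.entries(image)) {
    if ("S" in typed) out[attr] = typed.S;
    else if ("N" in typed) out[attr] = Number(typed.N); // numbers arrive as strings
    else if ("BOOL" in typed) out[attr] = typed.BOOL;
  }
  return out;
}
```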

SQS

Process messages from an SQS queue. Lambda polls the queue and invokes your function with batches of messages.

// SQS handler with batch processing
exports.handler = async (event) => {
  const failures = [];
  for (const record of event.Records) {
    try {
      const body = JSON.parse(record.body);
      await processMessage(body);
    } catch (err) {
      failures.push({ itemIdentifier: record.messageId });
    }
  }
  return { batchItemFailures: failures };
};
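
Returning `batchItemFailures` only works when the event source mapping opts in to partial batch responses; otherwise Lambda ignores the response and deletes every message in the batch, including the failed ones. The mapping UUID below is a placeholder:

```shell
# Enable partial batch responses on the SQS event source mapping
aws lambda update-event-source-mapping \
  --uuid <mapping-uuid> \
  --function-response-types ReportBatchItemFailures
```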

Lambda Layers and Dependencies

Layers let you package libraries, custom runtimes, and configuration files separately from your function code. A function can use up to 5 layers. Layers are extracted to /opt in the execution environment.

# Create a Lambda Layer
mkdir -p layer/nodejs && cd layer/nodejs
npm init -y
npm install sharp axios
cd ..
zip -r my-layer.zip nodejs/

# Publish the layer
aws lambda publish-layer-version \
  --layer-name my-deps \
  --zip-file fileb://my-layer.zip \
  --compatible-runtimes nodejs20.x

# Attach layer to function
aws lambda update-function-configuration \
  --function-name my-func \
  --layers arn:aws:lambda:us-east-1:123:layer:my-deps:1

Environment Variables and Secrets

Use environment variables for configuration and AWS Systems Manager Parameter Store or Secrets Manager for sensitive values. Environment variables are encrypted at rest and can be encrypted in transit with KMS.

// Access env vars and secrets
const { SSMClient, GetParameterCommand } = require("@aws-sdk/client-ssm");
const ssm = new SSMClient({});

// Simple env var (set in Lambda config)
const TABLE = process.env.TABLE_NAME;

// Cached secret from Parameter Store
let dbPassword;
async function getSecret() {
  if (!dbPassword) {
    const resp = await ssm.send(new GetParameterCommand({
      Name: "/myapp/db-password",
      WithDecryption: true
    }));
    dbPassword = resp.Parameter.Value;
  }
  return dbPassword;
}
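
One caveat with the cache above: it lives for the lifetime of the container, so a rotated secret would not be picked up until a cold start. A sketch adding a time-based expiry; the 5-minute TTL and the `fetchSecret` indirection are our own choices, not SDK features:

```javascript
const TTL_MS = 5 * 60 * 1000; // refetch at most every 5 minutes

let cached = null;
let fetchedAt = 0;

// fetchSecret would be the SSM/Secrets Manager call; `now` is injectable
// so the expiry logic can be exercised without waiting.
async function getSecretWithTtl(fetchSecret, now = Date.now()) {
  if (!cached || now - fetchedAt > TTL_MS) {
    cached = await fetchSecret();
    fetchedAt = now;
  }
  return cached;
}
```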

Lambda with DynamoDB

DynamoDB is the most common database for Lambda because it scales seamlessly and has single-digit millisecond latency. Use the AWS SDK v3 for smaller bundle size.

const { DynamoDBClient } = require("@aws-sdk/client-dynamodb");
const { DynamoDBDocumentClient, PutCommand, GetCommand,
  QueryCommand, DeleteCommand } = require("@aws-sdk/lib-dynamodb");

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));
const TABLE = process.env.TABLE_NAME;

exports.handler = async (event) => {
  // CREATE
  await ddb.send(new PutCommand({
    TableName: TABLE,
    Item: { pk: "user#1", sk: "profile", name: "Alice" }
  }));
  // READ
  const { Item } = await ddb.send(new GetCommand({
    TableName: TABLE, Key: { pk: "user#1", sk: "profile" }
  }));
  // QUERY
  const { Items } = await ddb.send(new QueryCommand({
    TableName: TABLE,
    KeyConditionExpression: "pk = :pk",
    ExpressionAttributeValues: { ":pk": "user#1" }
  }));
  return { statusCode: 200, body: JSON.stringify(Items) };
};
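
The example above omits updates. `UpdateCommand` (also exported from `@aws-sdk/lib-dynamodb`) takes an `UpdateExpression`; for the common "set these attributes" case, a small input builder might look like this (`buildUpdateInput` is our own convenience helper, not part of the SDK):

```javascript
// Build an UpdateCommand input that SETs each attribute in `attrs`.
// Expression attribute names avoid clashes with DynamoDB reserved words.
function buildUpdateInput(tableName, key, attrs) {
  const names = {};
  const values = {};
  const sets = [];
  for (const [i, [attr, value]] of Object.entries(attrs).entries()) {
    names[`#a${i}`] = attr;
    values[`:v${i}`] = value;
    sets.push(`#a${i} = :v${i}`);
  }
  return {
    TableName: tableName,
    Key: key,
    UpdateExpression: `SET ${sets.join(", ")}`,
    ExpressionAttributeNames: names,
    ExpressionAttributeValues: values,
  };
}
```

Usage would be `await ddb.send(new UpdateCommand(buildUpdateInput(TABLE, { pk: "user#1", sk: "profile" }, { name: "Bob" })))`.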

Step Functions

Step Functions orchestrate multiple Lambda functions into workflows. They handle retries, error handling, parallel execution, and state management. Define workflows using Amazon States Language (ASL).

{
  "Comment": "Order processing workflow",
  "StartAt": "ValidateOrder",
  "States": {
    "ValidateOrder": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-east-1:123:function:validate",
      "Next": "ProcessPayment",
      "Catch": [{
        "ErrorEquals": ["ValidationError"],
        "Next": "OrderFailed"
      }]
    },
    "ProcessPayment": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-east-1:123:function:payment",
      "Next": "SendConfirmation"
    },
    "SendConfirmation": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-east-1:123:function:notify",
      "End": true
    },
    "OrderFailed": { "Type": "Fail", "Error": "OrderFailed" }
  }
}
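
Once the definition is saved to a file (the name, role ARN, and path below are placeholders), the state machine can be created and started from the CLI:

```shell
# Create the state machine from the ASL definition
aws stepfunctions create-state-machine \
  --name order-processing \
  --definition file://workflow.asl.json \
  --role-arn arn:aws:iam::123:role/step-functions-role

# Kick off an execution with an input payload
aws stepfunctions start-execution \
  --state-machine-arn arn:aws:states:us-east-1:123:stateMachine:order-processing \
  --input '{"orderId": "42"}'
```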

Performance Optimization

Lambda performance depends on memory allocation, package size, and initialization code. Memory scales linearly with CPU power.

Provisioned Concurrency

Pre-warms execution environments to eliminate cold starts. Use it for latency-sensitive APIs. Costs apply even when idle.

# Enable provisioned concurrency
aws lambda put-provisioned-concurrency-config \
  --function-name my-api \
  --qualifier prod \
  --provisioned-concurrent-executions 10

# Auto-scale provisioned concurrency
# Use Application Auto Scaling to adjust
# based on utilization (target: 70%)

Memory Tuning

Lambda allocates CPU proportional to memory. At 1769 MB you get 1 full vCPU. More memory often reduces duration and total cost.

Memory    | vCPU | Best For
128 MB    | 0.07 | Simple routing/proxy
512 MB    | 0.29 | API + DB queries
1769 MB   | 1.0  | Data processing
3008 MB   | 1.7  | Image/video processing
10240 MB  | 6.0  | ML inference
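
Because price scales with memory but duration typically drops as CPU increases, doubling memory can be cost-neutral or even cheaper. A back-of-the-envelope check using the $0.0000166667 per GB-second rate; the durations here are illustrative, not benchmarks:

```javascript
const PRICE_PER_GB_SECOND = 0.0000166667;

// Cost of one invocation given memory (MB) and billed duration (ms).
function invocationCost(memoryMb, durationMs) {
  const gbSeconds = (memoryMb / 1024) * (durationMs / 1000);
  return gbSeconds * PRICE_PER_GB_SECOND;
}

// Hypothetical workload where doubling memory halves the duration:
const slow = invocationCost(512, 800);  // 512 MB for 800 ms
const fast = invocationCost(1024, 400); // 1024 MB for 400 ms
// Same GB-seconds, so same cost — at half the latency.
```

Tools like AWS Lambda Power Tuning automate this measurement against your real workload.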

Testing Lambda Functions

Use SAM CLI for local testing and the AWS SDK for integration tests. SAM CLI emulates the Lambda runtime locally.

# Install SAM CLI
brew install aws-sam-cli

# Invoke locally with test event
sam local invoke MyFunction -e events/api.json

# Start local API Gateway
sam local start-api
# => http://127.0.0.1:3000/users

# Generate sample events
sam local generate-event s3 put \
  --bucket my-bucket --key test.txt > events/s3.json

# Run unit tests (Jest example)
# test/handler.test.js
# const { handler } = require("../index");
# test("returns 200", async () => {
#   const event = { httpMethod: "GET", path: "/users" };
#   const result = await handler(event);
#   expect(result.statusCode).toBe(200);
# });

Infrastructure as Code with SAM

AWS SAM (Serverless Application Model) extends CloudFormation with serverless-specific resource types. A SAM template defines functions, APIs, event sources, and permissions in a single YAML file.

# template.yaml
AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31

Globals:
  Function:
    Timeout: 30
    Runtime: nodejs20.x
    MemorySize: 512

Resources:
  ApiFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: src/index.handler
      Events:
        GetUsers:
          Type: Api
          Properties:
            Path: /users
            Method: get
      Environment:
        Variables:
          TABLE_NAME: !Ref UsersTable
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref UsersTable

  UsersTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: users
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: pk
          AttributeType: S
        - AttributeName: sk
          AttributeType: S
      KeySchema:
        - AttributeName: pk
          KeyType: HASH
        - AttributeName: sk
          KeyType: RANGE

Monitoring and Debugging

Lambda integrates with CloudWatch for logs and metrics, and X-Ray for distributed tracing. Every invocation automatically logs to CloudWatch.

CloudWatch Metrics

Lambda emits metrics for invocations, duration, errors, throttles, and concurrent executions. Set alarms on error rate and duration.

// Structured logging for CloudWatch
exports.handler = async (event, context) => {
  console.log(JSON.stringify({
    level: "INFO",
    requestId: context.awsRequestId,
    message: "Processing request",
    path: event.path,
    method: event.httpMethod
  }));

  // CloudWatch Embedded Metric Format
  console.log(JSON.stringify({
    _aws: { Timestamp: Date.now(),
      CloudWatchMetrics: [{
        Namespace: "MyApp",
        Dimensions: [["Function"]],
        Metrics: [{ Name: "ProcessingTime", Unit: "Milliseconds" }]
      }]
    },
    Function: context.functionName,
    ProcessingTime: 42
  }));
};

X-Ray Tracing

X-Ray traces requests across Lambda, API Gateway, DynamoDB, and other AWS services. Enable active tracing in the function configuration.

# Enable X-Ray tracing via CLI
aws lambda update-function-configuration \
  --function-name my-func \
  --tracing-config Mode=Active

# SAM template
# Properties:
#   Tracing: Active

# In code: wrap AWS SDK calls for tracing
# const AWSXRay = require("aws-xray-sdk-core");
# const AWS = AWSXRay.captureAWS(require("aws-sdk"));

Lambda Limits and Quotas

Understanding Lambda limits helps you design functions that stay within service boundaries. These are the most commonly encountered limits.

Resource                  | Default Limit                  | Adjustable
Concurrent executions     | 1,000 per region               | Yes (up to tens of thousands)
Timeout                   | 15 minutes (900 seconds)       | No (hard limit)
Memory                    | 128 MB to 10,240 MB            | No
Deployment package (zip)  | 50 MB direct, 250 MB unzipped  | No (use container images for larger)
Ephemeral storage (/tmp)  | 512 MB to 10,240 MB            | Configurable
Environment variable size | 4 KB total                     | No
Layers per function       | 5                              | No

Error Handling Patterns

Proper error handling is critical for Lambda reliability. The approach differs between synchronous invocations (API Gateway) and asynchronous invocations (S3, SNS, EventBridge). For sync, return structured error responses. For async, configure dead-letter queues and Lambda destinations.

// Robust error handling pattern
class AppError extends Error {
  constructor(message, statusCode, errorCode) {
    super(message);
    this.statusCode = statusCode;
    this.errorCode = errorCode;
  }
}

exports.handler = async (event, context) => {
  try {
    const result = await processRequest(event);
    return {
      statusCode: 200,
      body: JSON.stringify(result)
    };
  } catch (err) {
    console.error(JSON.stringify({
      level: "ERROR",
      requestId: context.awsRequestId,
      error: err.message,
      code: err.errorCode || "INTERNAL_ERROR"
    }));
    const statusCode = err.statusCode || 500;
    return {
      statusCode,
      body: JSON.stringify({
        error: err.errorCode || "INTERNAL_ERROR",
        message: statusCode < 500 ? err.message : "Internal error"
      })
    };
  }
};

Deployment Best Practices

Use aliases and versions for safe deployments. Lambda versions are immutable snapshots of your function code and configuration. Aliases are pointers to versions that enable traffic shifting and rollback.

  • Use $LATEST for development, numbered versions for staging/production
  • Create aliases (dev, staging, prod) pointing to specific versions
  • Use weighted aliases for canary deployments (90% v1, 10% v2)
  • Enable CodeDeploy hooks for pre/post traffic shift validation
  • Pin API Gateway stages to Lambda aliases, not $LATEST
  • Automate deployments with SAM pipelines or GitHub Actions

# Publish a new version
aws lambda publish-version --function-name my-func

# Create/update alias pointing to version 5
aws lambda create-alias \
  --function-name my-func \
  --name prod --function-version 5

# Canary deployment: shift 10% traffic to v6
aws lambda update-alias \
  --function-name my-func \
  --name prod --function-version 6 \
  --routing-config 'AdditionalVersionWeights={"5"=0.9}'
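
The weight syntax is easy to misread: `AdditionalVersionWeights` assigns traffic to the extra version, and the alias's primary version receives the remainder. A quick sanity check in plain arithmetic (no AWS calls; `trafficSplit` is our own helper):

```javascript
// Effective traffic split for an alias pointing at `primary` with
// AdditionalVersionWeights like { "5": 0.9 }.
function trafficSplit(primary, additionalWeights) {
  const extra = Object.values(additionalWeights).reduce((a, b) => a + b, 0);
  return { [primary]: 1 - extra, ...additionalWeights };
}

// Alias -> v6 with {"5": 0.9}: v6 receives 10%, v5 keeps 90%.
const split = trafficSplit("6", { "5": 0.9 });
```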

Frequently Asked Questions

What is the maximum execution time for Lambda?

Lambda functions can run for up to 15 minutes (900 seconds). For longer workloads, use Step Functions to chain multiple Lambda invocations, or consider ECS/Fargate for long-running tasks.

How much does AWS Lambda cost?

Lambda pricing has two components: $0.20 per 1M requests and $0.0000166667 per GB-second. The free tier includes 1M requests and 400,000 GB-seconds per month, which is enough for many small applications.

How do I reduce cold start times?

Use lightweight runtimes (Node.js or Python), minimize package size with tree-shaking, keep functions out of VPCs unless necessary, allocate more memory for faster init, and use provisioned concurrency for critical paths.

Can Lambda connect to a relational database?

Yes, but use RDS Proxy to manage connection pooling. Without it, Lambda can exhaust database connections during traffic spikes because each concurrent execution opens its own connection.

What is the maximum deployment package size?

The deployment package limit is 50 MB (zipped) for direct upload or 250 MB (unzipped) including layers. For larger packages, use container images up to 10 GB.

How do Lambda Layers work?

Layers are ZIP archives containing libraries or dependencies. They are extracted to /opt in the execution environment. A function can reference up to 5 layers. Layers are versioned and can be shared across functions and accounts.

What is the difference between synchronous and asynchronous invocation?

Synchronous invocation waits for the function to complete and returns the result (API Gateway, SDK invoke). Asynchronous invocation queues the event and returns immediately (S3, SNS, EventBridge). Lambda handles retries for async failures.

How do I handle errors in Lambda?

For synchronous invocations, return appropriate HTTP status codes. For async invocations, configure a dead-letter queue (SQS or SNS) or a Lambda destination for failed events. Use structured logging with request IDs for debugging.
