Getting started testing AWS with MiniStack in under 10 minutes
The problem

Every developer who writes AWS code hits this wall:

- "I don't want to create resources in a real AWS account just to test"
- "My CI pipeline needs AWS but I don't want to pay for it"
- "I accidentally left a DynamoDB table running and got billed"

You don't need an AWS account to test your AWS code. We just published an AWS Testing 101 guide — a step-by-step tutorial that takes you from zero to testing S3, DynamoDB, SQS, and Lambda on your laptop. No credentials, no cloud costs, no cleanup. The complete tutorial, with more examples and explanations, is at 👉 ministack.org/getting-started.html. MiniStack is an open-source, MIT-licensed alternative to LocalStack, free forever: github.com/Nahuel990/ministack

The setup

The answer is a local AWS emulator. Run it on your machine, point your SDK at localhost:4566, and your code thinks it's talking to AWS. One command, 35+ AWS services running locally, zero cost:
$ docker run -p 4566:4566 nahuelnucera/ministack
What the guide covers

1. S3 — Create buckets and upload files

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:4566",
    aws_access_key_id="test",
    aws_secret_access_key="test",
    region_name="us-east-1",
)
s3.create_bucket(Bucket="my-bucket")
s3.put_object(Bucket="my-bucket", Key="hello.txt", Body=b"Hello, local AWS!")
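A natural next step the snippet stops short of is reading the object back and listing the bucket, with standard boto3 calls (`get_object`, `list_objects_v2`). The helper names here are mine, not the guide's; `s3` is the client created above.

```python
# Read an uploaded object back; get_object returns a streaming body,
# and .read() yields the raw bytes.
def read_object(s3, bucket, key):
    return s3.get_object(Bucket=bucket, Key=key)["Body"].read()

# List the keys in a bucket. Note that list_objects_v2 omits the
# "Contents" field entirely when the bucket is empty.
def list_keys(s3, bucket):
    resp = s3.list_objects_v2(Bucket=bucket)
    return [obj["Key"] for obj in resp.get("Contents", [])]
```

After the upload above, `read_object(s3, "my-bucket", "hello.txt")` should return `b"Hello, local AWS!"`.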
2. DynamoDB — Create tables and query data

ddb = boto3.client(
    "dynamodb",
    endpoint_url="http://localhost:4566",
    aws_access_key_id="test",
    aws_secret_access_key="test",
    region_name="us-east-1",
)
ddb.create_table(
    TableName="users",
    KeySchema=[{"AttributeName": "userId", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "userId", "AttributeType": "S"}],
    BillingMode="PAY_PER_REQUEST",
)
ddb.put_item(
    TableName="users",
    Item={
        "userId": {"S": "user-001"},
        "name": {"S": "Alice"},
    },
)
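The section title promises querying, so here is a read path to go with the write: `get_item` plus a small helper that flattens DynamoDB's typed attribute maps (`{"S": "Alice"}` becomes `"Alice"`). `flatten` and `get_user` are my names, not the guide's.

```python
# DynamoDB's low-level API wraps every attribute in a type tag, so
# flatten unwraps {"S": ...}/{"N": ...} maps into plain values.
def flatten(item):
    return {k: next(iter(v.values())) for k, v in item.items()}

# Fetch one user by key; returns {} when the item does not exist.
def get_user(ddb, user_id, table="users"):
    resp = ddb.get_item(TableName=table, Key={"userId": {"S": user_id}})
    return flatten(resp.get("Item", {}))
```

With the `put_item` above applied, `get_user(ddb, "user-001")` should come back as `{"userId": "user-001", "name": "Alice"}`.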
3. SQS — Send and receive messages

sqs = boto3.client(
    "sqs",
    endpoint_url="http://localhost:4566",
    aws_access_key_id="test",
    aws_secret_access_key="test",
    region_name="us-east-1",
)
queue = sqs.create_queue(QueueName="my-queue")
sqs.send_message(QueueUrl=queue["QueueUrl"], MessageBody="Hello from SQS!")
msgs = sqs.receive_message(QueueUrl=queue["QueueUrl"])
print(msgs["Messages"][0]["Body"])  # Hello from SQS!
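One step the snippet skips: receiving a message does not remove it from the queue. It is only hidden, and after the visibility timeout it reappears. Delete each message once it has been handled; `process_and_delete` is my naming for this pattern, using standard `receive_message`/`delete_message` calls.

```python
# Receive, handle, then explicitly delete each message so it is not
# redelivered after the visibility timeout expires.
def process_and_delete(sqs, queue_url, handle):
    resp = sqs.receive_message(QueueUrl=queue_url)
    for msg in resp.get("Messages", []):
        handle(msg["Body"])
        sqs.delete_message(
            QueueUrl=queue_url,
            ReceiptHandle=msg["ReceiptHandle"],
        )
```

Note that deletion uses the per-receive `ReceiptHandle`, not the message ID.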
4. Lambda — Deploy and invoke functions

import io
import zipfile

# Package a function
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr(
        "index.py",
        'def handler(event, ctx): return {"message": "Hello from Lambda!"}\n',
    )

lam = boto3.client(
    "lambda",
    endpoint_url="http://localhost:4566",
    aws_access_key_id="test",
    aws_secret_access_key="test",
    region_name="us-east-1",
)
lam.create_function(
    FunctionName="my-function",
    Runtime="python3.12",
    Handler="index.handler",
    Role="arn:aws:iam::000000000000:role/fake-role",
    Code={"ZipFile": buf.getvalue()},
)
resp = lam.invoke(FunctionName="my-function")
print(resp["Payload"].read())  # {"message": "Hello from Lambda!"}
Why this matters

- No AWS account needed — test on day one
- No cost — run thousands of operations for free
- No cleanup — stop the container, everything's gone
- CI/CD ready — same Docker image in GitHub Actions, GitLab CI, Jenkins
- Terraform compatible — point your provider at localhost:4566
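For the CI/CD bullet, the same image can run as a service container. This is a hypothetical GitHub Actions job (the job name, steps, and test command are illustrative, not from the guide), assuming the public image tag used in the setup step:

```yaml
# Sketch: run MiniStack alongside the job so tests reach localhost:4566.
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      ministack:
        image: nahuelnucera/ministack
        ports:
          - 4566:4566
    steps:
      - uses: actions/checkout@v4
      - run: pip install boto3 pytest
      - run: pytest
```

GitLab CI and Jenkins have equivalent sidecar-container mechanisms; only the syntax differs.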