LocalStack archived its GitHub repo on March 23, 2026. The read-only notice went up quietly. No banner. No grace period announcement. Just a new requirement: LOCALSTACK_AUTH_TOKEN in your environment — or your CI pipeline fails.
If you're on a commercial team using the Hobby tier, you're also hit with a non-commercial-use restriction. Paid plans start at $39/month.
Update (March 24, 2026): LocalStack has since posted a blog announcement (blog.localstack.cloud) with staged concessions: unlimited CI, and a free Hobby tier for non-commercial use. Auth token is still required, and Hobby tier needs account creation. The core complaint — 'your CI breaks if you don't have a vendor account' — hasn't changed. The alternatives below don't require either.
I'm not going to relitigate whether this was the right call. It's their product. But I've been burned by this category of problem before: tool solves a real pain point with a good free tier, community adopts it into their CI configs and READMEs and onboarding docs, adoption metrics eventually justify monetization pressure, free tier gets changed, and your pipeline is broken at 3 AM before a release.
The failure mode isn't LocalStack specifically. It's vendor-gated infrastructure in your test suite. When the vendor changes something — pricing, auth, terms — your code is correct, your tests pass locally if you have the token, and your CI is broken for a reason that has nothing to do with the code.
Here's what I've migrated to, with working code you can drop in today.
Option 1: moto — Python-native, no Docker, no auth
If you were using LocalStack for pure unit tests against S3, SQS, DynamoDB, or Lambda, moto covers most of it.
moto is MIT-licensed, pure Python, no external service, no auth token, no account required. It intercepts boto3 calls in-process and tears down after each test.
```shell
pip install 'moto[s3,dynamodb,sqs,awslambda]'
```

(Note the quotes — unquoted brackets are globbed by some shells, and moto's Lambda extra is named `awslambda`, not `lambda`.)
Basic usage:
```python
import boto3
from moto import mock_aws


@mock_aws
def test_upload_and_retrieve():
    s3 = boto3.client("s3", region_name="us-east-1")
    s3.create_bucket(Bucket="test-bucket")
    s3.put_object(Bucket="test-bucket", Key="test.txt", Body=b"hello")

    response = s3.get_object(Bucket="test-bucket", Key="test.txt")
    assert response["Body"].read() == b"hello"
```
No auth token. No service running in the background. No Docker dependency. The decorator handles setup and teardown.
moto covers 200+ AWS services. For the common test suite needs — S3, DynamoDB, SQS, SNS, Lambda, IAM — it's a complete drop-in.
What it doesn't cover: complex cross-service workflows that depend on real network behavior, anything that required LocalStack Pro's multi-region state persistence, or services outside its 200+ API coverage. For the vast majority of Python unit tests, those edge cases don't apply.
The complete pytest conftest.py drop-in
If your project had a conftest.py using LocalStack fixtures, here's a complete replacement using moto. Copy this in, run your tests, they should pass.
```python
# conftest.py — Drop-in AWS mocking for pytest
# Replaces LocalStack for unit/integration tests
# Covers: S3, DynamoDB, SQS

import boto3
import pytest
from moto import mock_aws


@pytest.fixture(scope="function")
def aws_credentials(monkeypatch):
    """Fake AWS credentials so moto never touches a real account."""
    monkeypatch.setenv("AWS_ACCESS_KEY_ID", "testing")
    monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "testing")
    monkeypatch.setenv("AWS_SECURITY_TOKEN", "testing")
    monkeypatch.setenv("AWS_SESSION_TOKEN", "testing")
    monkeypatch.setenv("AWS_DEFAULT_REGION", "us-east-1")


@pytest.fixture(scope="function")
def aws_mock(aws_credentials):
    """Keep moto active for the duration of each test.

    Don't apply @mock_aws as a decorator on a fixture function: the
    mock would be torn down when the fixture returns, before the test
    body runs. The context-manager form stays active across the yield.
    """
    with mock_aws():
        yield


@pytest.fixture(scope="function")
def s3_client(aws_mock):
    """S3 client with mocked AWS backend."""
    return boto3.client("s3", region_name="us-east-1")


@pytest.fixture(scope="function")
def dynamodb_client(aws_mock):
    """DynamoDB client with mocked AWS backend."""
    return boto3.client("dynamodb", region_name="us-east-1")


@pytest.fixture(scope="function")
def sqs_client(aws_mock):
    """SQS client with mocked AWS backend."""
    return boto3.client("sqs", region_name="us-east-1")


@pytest.fixture(scope="function")
def test_s3_bucket(s3_client):
    """Pre-created S3 bucket for tests."""
    s3_client.create_bucket(Bucket="test-bucket")
    return "test-bucket"


@pytest.fixture(scope="function")
def test_queue(sqs_client):
    """Pre-created SQS queue for tests."""
    response = sqs_client.create_queue(QueueName="test-queue")
    return response["QueueUrl"]
```
This covers the 80% case: S3 + DynamoDB + SQS together. Add Lambda and SNS fixtures following the same pattern if needed.
Free download: I've packaged this with the extended version (SNS, Lambda, Secrets Manager fixtures included) as a free script. Download the full conftest.py here → (Kit landing page — coming soon)
Option 2: Floci — if you need a running container
Python un