How I Set Up Integration Tests for a Node.js + PostgreSQL App (with Zero Flakiness)


What We're Actually Testing

Project Structure

Step 1 — Install Dependencies

Step 2 — The App We're Testing

Step 3 — The Integration Test Setup

Step 4 — Writing the Integration Tests

Step 5 — Jest Config

Step 6 — Running the Tests

Step 7 — CI/CD with GitHub Actions

The Three Rules That Eliminated My Flakiness

Taking It Further: Automated Mock Generation with Keploy

Summary

I spent three weeks being haunted by a test suite that passed locally and failed in CI. Not sometimes — randomly. A different test each time. No stack trace that made sense. Pure chaos.

After way too much coffee and one very long Saturday, I figured out the root cause: my integration tests were sharing database state, spinning up connections that weren't being closed, and relying on mock data that didn't reflect how PostgreSQL actually behaves.

This is the guide I wish I had back then. By the end, you'll have a Node.js + PostgreSQL integration test setup that is isolated, fast, deterministic, and doesn't randomly implode in your CI pipeline. Let's build it from scratch.

What We're Actually Testing

Before we write a single line of code, let's be clear about what integration testing means in this context. Unit tests check a function in isolation — you mock the database, mock the HTTP client, mock everything. Integration tests check that your code works with real dependencies. That means a real PostgreSQL instance, real queries, real connection pooling behavior.

The problem most people run into: they treat integration tests like unit tests. They share a single DB connection across test files. They don't clean up between tests. They hardcode ports. Then they wonder why the tests are flaky.

Here's the stack we'll use: Express for the API, the pg connection pool for PostgreSQL, Jest as the test runner, Testcontainers, and Supertest for HTTP assertions.

Why Testcontainers? Because it spins up a real, isolated PostgreSQL instance inside Docker for each test suite, then tears it down when done. No shared state. No "but it works on my machine." Every test run starts clean.

The only prerequisite: Docker must be running on your machine and in CI.

Step 2 — The App We're Testing

Keep it simple. A users API with two endpoints — create a user and fetch all users.

src/db.js is a small connection pool factory (full listing below). Notice the closePool() function. It is not optional: if you don't close the pool at the end of your tests, Jest hangs forever, because open DB connections keep the Node process alive.
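The environment-variable fallback logic in getPool() is worth seeing on its own. Below it is factored into a pure helper so the behavior can be checked without a database; buildPoolConfig is a name invented here for illustration, not part of the app:

```javascript
// Sketch: the config-building logic from getPool(), extracted into a pure
// function. buildPoolConfig is a hypothetical helper, not part of the app.
function buildPoolConfig(env) {
  return {
    host: env.DB_HOST || 'localhost',
    port: parseInt(env.DB_PORT || '5432', 10), // explicit radix
    database: env.DB_NAME || 'myapp',
    user: env.DB_USER || 'postgres',
    password: env.DB_PASSWORD || 'postgres',
    max: 10,                  // cap concurrent connections
    idleTimeoutMillis: 30000, // recycle idle clients after 30s
  };
}

// With no overrides, everything falls back to local-dev defaults:
console.log(buildPoolConfig({}).host); // 'localhost'
// Overrides (like the ones Testcontainers will set later) win when present:
console.log(buildPoolConfig({ DB_PORT: '49153' }).port); // 49153
```

Because the helper is pure, the fallback behavior is trivially testable; the Testcontainers setup later overrides these same environment variables to point the app at the test container.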
src/routes/users.js holds the user routes, and src/app.js exports the Express app so Supertest can drive it without starting a server (both listings below).

Step 3 — The Integration Test Setup

This is the most important file. tests/integration/setup.js handles spinning up PostgreSQL in Docker, running your schema migrations, setting environment variables so the app connects to the test DB, and tearing everything down after.

Why TRUNCATE ... RESTART IDENTITY CASCADE instead of DELETE FROM? TRUNCATE is much faster than DELETE on large datasets and resets the auto-increment sequence, so your id values are predictable (1, 2, 3...) across tests. CASCADE handles foreign key relationships automatically.

Step 4 — Writing the Integration Tests

The tests live in tests/integration/users.test.js. Notice the 409 test — this is exactly the kind of thing a unit test with mocks cannot catch reliably. The unique constraint lives in PostgreSQL. You either test against a real database or you're guessing.

Step 5 — Jest Config

The maxWorkers: 1 setting is important. Each test file spins up its own Docker container, which is already isolated. Running files in parallel can exhaust Docker resources and cause unpredictable failures — exactly the kind of flakiness we're trying to eliminate.

Step 6 — Running the Tests

Add scripts to package.json, then run npm run test:integration. The first run pulls the postgres:15-alpine image from Docker Hub and takes 30–60 seconds. Every run after that uses the cached image and starts in about 3–5 seconds.

Step 7 — CI/CD with GitHub Actions

The setup above works locally. Making it work in GitHub Actions needs no extra configuration, since Testcontainers handles Docker automatically. The workflow lives in .github/workflows/integration-tests.yml. That's it: GitHub Actions runners have Docker installed by default, and Testcontainers detects it automatically.

The Three Rules That Eliminated My Flakiness

Looking back, every flaky test I ever had came from violating one of these:

Rule 1 — Never share state between tests. Use beforeEach with TRUNCATE to reset. A test that passes because the previous test seeded data is a test that will randomly fail when you reorder files.

Rule 2 — Always close your connections. Call closePool() in afterAll.
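One subtlety of teardown worth illustrating: if closing the pool throws, a naive sequential teardown leaks the Docker container. A try/finally guarantees the container stops either way. This is a sketch with stubbed dependencies; the names mirror setup.js, but nothing here touches pg or Docker:

```javascript
// Sketch: teardown that always stops the container, even if closing the
// pool throws. The stand-ins below are plain objects, not real pg/Testcontainers.
async function teardown(pool, container) {
  try {
    if (pool) await pool.end();            // close DB connections first...
  } finally {
    if (container) await container.stop(); // ...but stop Docker no matter what
  }
}

// Stubbed dependencies record the call order:
const calls = [];
const fakePool = { end: async () => calls.push('pool.end') };
const fakeContainer = { stop: async () => calls.push('container.stop') };

teardown(fakePool, fakeContainer).then(() => console.log(calls.join(' -> ')));
// pool.end -> container.stop
```

The same shape works inside a Jest afterAll hook: whatever happens to the pool, the container never outlives the test run.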
Open connections = hanging Jest process = CI timeout = false failure.

Rule 3 — Test against real PostgreSQL, not in-memory fakes. SQLite and in-memory databases behave differently from PostgreSQL. Unique constraints, specific data types, RETURNING clauses, transaction isolation — these all behave subtly differently. Mock them and you're testing a fiction.

Taking It Further: Automated Mock Generation with Keploy

The setup above is solid for testing your own API endpoints. But real applications have external dependencies — third-party APIs, payment services, email providers. You can't spin those up in Docker. The traditional answer is to write mocks manually. The problem: manual mocks drift from reality. The service changes its response format, your mock doesn't, your tests keep passing, and production breaks.

This is where Keploy takes a different approach. Instead of writing mocks by hand, Keploy records real API traffic during development or staging runs, then replays those recorded interactions as deterministic stubs during testing. Your mocks are always based on real data, not on what you thought the API would return.

For a Node.js + PostgreSQL app like the one we built here, Keploy captures the actual DB queries and external calls during a real run, then replays them in CI without needing a live database or live external services at all. It's the closest thing to testing against production without actually hitting production.

If you want the full picture of what integration testing is, the different types (top-down, bottom-up, sandwich), and how it fits into a CI/CD pipeline, I'd recommend reading Keploy's comprehensive integration testing guide — it covers the theory behind everything we implemented here.

Here's what we built: a real Express + PostgreSQL app; integration tests backed by Testcontainers, with a real Docker-based Postgres per suite; proper setup and teardown with beforeAll, afterAll, and beforeEach; fast state resets with TRUNCATE ... RESTART IDENTITY; connection pool cleanup that prevents hanging Jest processes; and a GitHub Actions config that works without any extra setup.

The result: a test suite that behaves identically on your laptop, your colleague's laptop, and your CI server. No more random failures. No more "works on my machine."
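The record-and-replay idea is simple enough to sketch in a few lines of plain JavaScript. This is purely illustrative, a toy version of the concept; it is not Keploy's API:

```javascript
// Sketch of record/replay mocking: in record mode, call the real dependency
// and save its response; in replay mode, return the saved response without
// touching the dependency at all. Illustrative only — not Keploy's API.
function makeRecorder(realFn) {
  const tape = new Map(); // recorded interactions, keyed by input
  return {
    async record(input) {
      const output = await realFn(input);
      tape.set(JSON.stringify(input), output);
      return output;
    },
    async replay(input) {
      const key = JSON.stringify(input);
      if (!tape.has(key)) throw new Error(`no recording for ${key}`);
      return tape.get(key); // deterministic: same input, same output
    },
  };
}

// A stand-in for an external API call:
const fetchUser = async (id) => ({ id, name: 'Alice' });
const recorder = makeRecorder(fetchUser);

recorder
  .record(42)                      // development/staging: hits the real thing
  .then(() => recorder.replay(42)) // CI: served from the tape
  .then((user) => console.log(user.name)); // Alice
```

The payoff is the same as with the real tool, at toy scale: the stub can never disagree with a response the real dependency actually produced.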
If you have questions or a different approach that's worked well for you, drop it in the comments — always curious to hear how other teams handle this.


Project structure:

```text
my-app/
├── src/
│   ├── app.js            # Express app
│   ├── db.js             # DB connection pool
│   └── routes/
│       └── users.js      # User routes
├── tests/
│   └── integration/
│       ├── setup.js      # Test DB setup/teardown
│       └── users.test.js
├── package.json
└── jest.config.js
```

Install dependencies:

```shell
npm install express pg
npm install --save-dev jest supertest testcontainers @testcontainers/postgresql
```

src/db.js — connection pool factory:

```javascript
const { Pool } = require('pg');

let pool;

function getPool() {
  if (!pool) {
    pool = new Pool({
      host: process.env.DB_HOST || 'localhost',
      port: parseInt(process.env.DB_PORT || '5432', 10),
      database: process.env.DB_NAME || 'myapp',
      user: process.env.DB_USER || 'postgres',
      password: process.env.DB_PASSWORD || 'postgres',
      max: 10,
      idleTimeoutMillis: 30000,
    });
  }
  return pool;
}

async function closePool() {
  if (pool) {
    await pool.end();
    pool = null;
  }
}

module.exports = { getPool, closePool };
```

src/routes/users.js — user routes:

```javascript
const express = require('express');
const { getPool } = require('../db');

const router = express.Router();

// GET /users — fetch all users
router.get('/', async (req, res) => {
  try {
    const pool = getPool();
    const result = await pool.query(
      'SELECT id, name, email, created_at FROM users ORDER BY created_at DESC'
    );
    res.json(result.rows);
  } catch (err) {
    console.error('Error fetching users:', err.message);
    res.status(500).json({ error: 'Internal server error' });
  }
});

// POST /users — create a user
router.post('/', async (req, res) => {
  const { name, email } = req.body;

  if (!name || !email) {
    return res.status(400).json({ error: 'name and email are required' });
  }

  // Basic email format check
  const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
  if (!emailRegex.test(email)) {
    return res.status(400).json({ error: 'Invalid email format' });
  }

  try {
    const pool = getPool();
    const result = await pool.query(
      'INSERT INTO users (name, email) VALUES ($1, $2) RETURNING id, name, email, created_at',
      [name, email]
    );
    res.status(201).json(result.rows[0]);
  } catch (err) {
    // PostgreSQL unique constraint violation
    if (err.code === '23505') {
      return res.status(409).json({ error: 'Email already exists' });
    }
    console.error('Error creating user:', err.message);
    res.status(500).json({ error: 'Internal server error' });
  }
});

module.exports = router;
```

src/app.js — Express app, exported so Supertest can use it without starting a server:

```javascript
const express = require('express');
const usersRouter = require('./routes/users');

const app = express();
app.use(express.json());
app.use('/users', usersRouter);

module.exports = app;
```

tests/integration/setup.js:

```javascript
const { PostgreSqlContainer } = require('@testcontainers/postgresql');
const { Pool } = require('pg');

let container;
let pool;

async function setupTestDatabase() {
  // Spin up a real PostgreSQL instance in Docker.
  // Each test suite gets its own isolated database.
  container = await new PostgreSqlContainer('postgres:15-alpine')
    .withDatabase('testdb')
    .withUsername('testuser')
    .withPassword('testpass')
    .start();

  // Point the app to this container
  process.env.DB_HOST = container.getHost();
  process.env.DB_PORT = String(container.getMappedPort(5432));
  process.env.DB_NAME = container.getDatabase();
  process.env.DB_USER = container.getUsername();
  process.env.DB_PASSWORD = container.getPassword();

  // Create a pool directly to run migrations
  pool = new Pool({
    host: container.getHost(),
    port: container.getMappedPort(5432),
    database: container.getDatabase(),
    user: container.getUsername(),
    password: container.getPassword(),
  });

  // Run schema — in production you'd use a migration tool like Flyway or node-pg-migrate
  await pool.query(`
    CREATE TABLE IF NOT EXISTS users (
      id SERIAL PRIMARY KEY,
      name VARCHAR(255) NOT NULL,
      email VARCHAR(255) NOT NULL UNIQUE,
      created_at TIMESTAMP DEFAULT NOW()
    )
  `);

  return pool;
}

async function teardownTestDatabase() {
  if (pool) await pool.end();
  if (container) await container.stop();
}

// Wipe all rows between tests — faster than dropping/recreating tables
async function clearDatabase() {
  await pool.query('TRUNCATE TABLE users RESTART IDENTITY CASCADE');
}

module.exports = { setupTestDatabase, teardownTestDatabase, clearDatabase };
```

tests/integration/users.test.js (email addresses are placeholders; the scrape obfuscated the originals):

```javascript
const request = require('supertest');
const app = require('../../src/app');
const { closePool } = require('../../src/db');
const {
  setupTestDatabase,
  teardownTestDatabase,
  clearDatabase,
} = require('./setup');

describe('Users API — Integration Tests', () => {
  // Runs once before all tests in this file
  beforeAll(async () => {
    await setupTestDatabase();
  }, 60000); // 60s timeout — Docker pull can take a moment on first run

  // Runs once after all tests complete
  afterAll(async () => {
    await closePool();            // Close the app's connection pool
    await teardownTestDatabase(); // Stop the Docker container
  });

  // Runs before each individual test — wipes DB state
  beforeEach(async () => {
    await clearDatabase();
  });

  describe('GET /users', () => {
    it('returns an empty array when no users exist', async () => {
      const res = await request(app).get('/users');
      expect(res.status).toBe(200);
      expect(res.body).toEqual([]);
    });

    it('returns all users ordered by created_at descending', async () => {
      // Seed two users through the API
      await request(app)
        .post('/users')
        .send({ name: 'Alice', email: 'alice@example.com' });
      await request(app)
        .post('/users')
        .send({ name: 'Bob', email: 'bob@example.com' });

      const res = await request(app).get('/users');
      expect(res.status).toBe(200);
      expect(res.body).toHaveLength(2);
      // Bob was created last, should appear first
      expect(res.body[0].name).toBe('Bob');
      expect(res.body[1].name).toBe('Alice');
    });
  });

  describe('POST /users', () => {
    it('creates a user and returns 201 with the created record', async () => {
      const res = await request(app)
        .post('/users')
        .send({ name: 'Charlie', email: 'charlie@example.com' });
      expect(res.status).toBe(201);
      expect(res.body).toMatchObject({
        id: expect.any(Number),
        name: 'Charlie',
        email: 'charlie@example.com',
        created_at: expect.any(String),
      });
    });

    it('returns 400 when name is missing', async () => {
      const res = await request(app)
        .post('/users')
        .send({ email: 'frank@example.com' });
      expect(res.status).toBe(400);
      expect(res.body.error).toMatch(/name/i);
    });

    it('returns 400 when email format is invalid', async () => {
      const res = await request(app)
        .post('/users')
        .send({ name: 'Dave', email: 'not-an-email' });
      expect(res.status).toBe(400);
      expect(res.body.error).toMatch(/email/i);
    });

    it('returns 409 when email already exists — tests real DB unique constraint', async () => {
      // First insert succeeds
      await request(app)
        .post('/users')
        .send({ name: 'Eve', email: 'eve@example.com' });
      // Second insert with same email hits the PostgreSQL unique constraint
      const res = await request(app)
        .post('/users')
        .send({ name: 'Eve Again', email: 'eve@example.com' });
      expect(res.status).toBe(409);
      expect(res.body.error).toMatch(/already exists/i);
    });
  });
});
```

jest.config.js:

```javascript
module.exports = {
  testEnvironment: 'node',
  testMatch: ['**/tests/integration/**/*.test.js'],
  testTimeout: 60000, // Docker container startup
  maxWorkers: 1,      // Run test files sequentially — prevents port conflicts
};
```

package.json scripts:

```json
{
  "scripts": {
    "test:integration": "jest --config jest.config.js",
    "test:integration:watch": "jest --config jest.config.js --watch"
  }
}
```

Run the suite:

```shell
npm run test:integration
```

```text
PASS tests/integration/users.test.js
  Users API — Integration Tests
    GET /users
      ✓ returns an empty array when no users exist (48ms)
      ✓ returns all users ordered by created_at descending (61ms)
    POST /users
      ✓ creates a user and returns 201 with the created record (42ms)
      ✓ returns 400 when name is missing (12ms)
      ✓ returns 400 when email format is invalid (11ms)
      ✓ returns 409 when email already exists (39ms)

Test Suites: 1 passed, 1 total
Tests:       6 passed, 6 total
Time:        8.3s
```

.github/workflows/integration-tests.yml:

```yaml
name: Integration Tests

on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main]

jobs:
  integration-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Run integration tests
        run: npm run test:integration
        # No need to manually start Postgres — Testcontainers handles it
```

The stack:

- Node.js (Express API)
- PostgreSQL (via pg pool)
- Jest (test runner)
- Testcontainers (spins up a real Postgres Docker container per test suite)
- Supertest (HTTP assertions)

What we built:

- A real Express + PostgreSQL app
- Integration tests using Testcontainers (real Docker-based Postgres per suite)
- Proper setup/teardown with beforeAll, afterAll, beforeEach
- Fast state reset with TRUNCATE ... RESTART IDENTITY
- Proper connection pool cleanup to prevent hanging Jest processes
- A GitHub Actions CI config that works without any extra setup