Creating a database connection is expensive: a TCP handshake, authentication, and protocol negotiation on every request. Connection pooling reuses connections instead, dramatically improving performance.
## The Problem

Without pooling, every request pays the full connection cost:

```
Request 1 → Create connection → Query → Close connection
Request 2 → Create connection → Query → Close connection
Request 3 → Create connection → Query → Close connection
```

Each connection: ~50-100ms of overhead.

With pooling, connections are reused:

```
Request 1 → Get connection from pool → Query → Return to pool
Request 2 → Get connection from pool → Query → Return to pool
Request 3 → Get connection from pool → Query → Return to pool
```

Each checkout: ~1ms of overhead.
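The arithmetic behind those numbers can be sketched with a toy cost model (the overhead and query-time figures below are illustrative assumptions, not benchmarks):

```typescript
// Toy cost model for N sequential requests.
// Overhead figures are assumptions for illustration, not measurements.
const CONNECT_OVERHEAD_MS = 75; // fresh connection: midpoint of ~50-100ms
const POOL_CHECKOUT_MS = 1;     // checking out an already-open connection
const QUERY_MS = 5;             // assumed query time, identical in both cases

function totalLatency(requests: number, overheadPerRequestMs: number): number {
  return requests * (overheadPerRequestMs + QUERY_MS);
}

console.log(`Unpooled: ${totalLatency(100, CONNECT_OVERHEAD_MS)}ms`); // 8000ms
console.log(`Pooled:   ${totalLatency(100, POOL_CHECKOUT_MS)}ms`);    // 600ms
```

Even with these rough numbers, the per-request connection cost dominates everything else, which is why pooling pays off so quickly.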
## Node.js with PostgreSQL

```typescript
import { Pool } from 'pg';

const pool = new Pool({
  host: process.env.DB_HOST,
  port: parseInt(process.env.DB_PORT || '5432', 10),
  database: process.env.DB_NAME,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,

  // Pool configuration
  min: 2,                        // Minimum connections kept open
  max: 20,                       // Maximum connections
  idleTimeoutMillis: 30000,      // Close idle connections after 30s
  connectionTimeoutMillis: 2000, // Fail if a connection can't be acquired in 2s
});

// Using the pool: check out a client, query, and always release it
async function getUser(id: string) {
  const client = await pool.connect();
  try {
    const result = await client.query('SELECT * FROM users WHERE id = $1', [id]);
    return result.rows[0];
  } finally {
    client.release(); // Return to pool - CRITICAL!
  }
}

// Simpler approach for single queries: pool.query() checks out
// and releases a client automatically
async function getUserSimple(id: string) {
  const result = await pool.query('SELECT * FROM users WHERE id = $1', [id]);
  return result.rows[0];
}
```

## Pool Size Calculation
Formula: `connections = (core_count * 2) + effective_spindle_count`
For SSDs (effectively no spindles): `connections ≈ core_count * 2`
Example:
- 4 CPU cores
- SSD storage
- connections = 4 * 2 = 8
But also consider:
- Number of application instances
- Database server limits
- Query patterns
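One way to account for multiple application instances is to divide the database's connection budget between them. A sketch (the `max_connections` value and headroom figure are assumptions; `perInstancePoolSize` is a hypothetical helper, not part of any library):

```typescript
// Split the database's max_connections budget across app instances,
// keeping headroom for migrations, cron jobs, and admin sessions.
// Names and defaults here are illustrative assumptions.
function perInstancePoolSize(
  dbMaxConnections: number,
  appInstances: number,
  headroom: number = 5
): number {
  return Math.max(1, Math.floor((dbMaxConnections - headroom) / appInstances));
}

// PostgreSQL's default max_connections is 100:
console.log(perInstancePoolSize(100, 4)); // 23 connections per instance
```

The key constraint: the sum of all instances' `max` settings must stay below the server's `max_connections`, or checkouts will start failing under load.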
```typescript
import os from 'os';
import { Pool } from 'pg';

// Dynamic pool sizing based on available CPU cores
const cpuCount = os.cpus().length;
const poolSize = Math.min(cpuCount * 2, 20); // Cap at 20

const pool = new Pool({
  max: poolSize,
  // ...other config
});
```

## Connection Lifecycle
```typescript
const pool = new Pool({
  max: 10,
  idleTimeoutMillis: 30000,
  connectionTimeoutMillis: 2000,
});

// Event handlers for monitoring
pool.on('connect', (client) => {
  console.log('New connection established');
});

pool.on('acquire', (client) => {
  console.log('Connection acquired from pool');
});

pool.on('release', (err, client) => {
  console.log('Connection returned to pool');
});

pool.on('remove', (client) => {
  console.log('Connection removed from pool');
});

pool.on('error', (err, client) => {
  console.error('Pool error:', err.message);
});
```

## Transaction Handling
```typescript
import { Pool, PoolClient } from 'pg';

// Proper transaction with a pooled connection: BEGIN and COMMIT must
// run on the same client, so check one out explicitly
async function transferFunds(fromId: string, toId: string, amount: number) {
  const client = await pool.connect();

  try {
    await client.query('BEGIN');

    await client.query(
      'UPDATE accounts SET balance = balance - $1 WHERE id = $2',
      [amount, fromId]
    );

    await client.query(
      'UPDATE accounts SET balance = balance + $1 WHERE id = $2',
      [amount, toId]
    );

    await client.query('COMMIT');
  } catch (error) {
    await client.query('ROLLBACK');
    throw error;
  } finally {
    client.release(); // Always release!
  }
}

// Reusable helper for transactions
async function withTransaction<T>(
  callback: (client: PoolClient) => Promise<T>
): Promise<T> {
  const client = await pool.connect();

  try {
    await client.query('BEGIN');
    const result = await callback(client);
    await client.query('COMMIT');
    return result;
  } catch (error) {
    await client.query('ROLLBACK');
    throw error;
  } finally {
    client.release();
  }
}

// Usage
const result = await withTransaction(async (client) => {
  await client.query('UPDATE accounts SET balance = balance - $1 WHERE id = $2', [100, 'acc1']);
  await client.query('UPDATE accounts SET balance = balance + $1 WHERE id = $2', [100, 'acc2']);
  return { success: true };
});
```

## Connection Pool Monitoring
```typescript
// Pool statistics exposed by pg
function getPoolStats() {
  return {
    totalCount: pool.totalCount,     // Total connections (idle + checked out)
    idleCount: pool.idleCount,       // Available connections
    waitingCount: pool.waitingCount, // Requests waiting for a connection
  };
}

// Expose a metrics endpoint (assuming an Express app)
app.get('/metrics', (req, res) => {
  const stats = getPoolStats();
  res.json({
    db_pool_total: stats.totalCount,
    db_pool_idle: stats.idleCount,
    db_pool_waiting: stats.waitingCount,
    db_pool_used: stats.totalCount - stats.idleCount,
  });
});

// Alert on pool exhaustion
setInterval(() => {
  const stats = getPoolStats();
  if (stats.waitingCount > 0) {
    console.warn(`Database pool exhausted: ${stats.waitingCount} requests waiting`);
  }
  if (stats.idleCount === 0 && stats.totalCount === pool.options.max) {
    console.warn('Database pool at maximum capacity');
  }
}, 5000);
```

## External Pool (PgBouncer)
```ini
# pgbouncer.ini
[databases]
mydb = host=localhost port=5432 dbname=mydb

[pgbouncer]
listen_port = 6432
listen_addr = *
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt

# Pool settings
pool_mode = transaction
max_client_conn = 1000
default_pool_size = 20
min_pool_size = 5
reserve_pool_size = 5
```

Pool modes:
- `session`: one server connection per client session (like no pooling)
- `transaction`: one server connection per transaction (recommended)
- `statement`: one server connection per statement (limited use; multi-statement transactions are disallowed)
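On the application side, little changes: point the pool at PgBouncer's port instead of Postgres directly. A sketch assuming the host and port from the config above; note the caveat that in transaction mode, session state does not survive across transactions:

```typescript
import { Pool } from 'pg';

// Connect through PgBouncer (listen_port = 6432) rather than
// directly to Postgres on 5432. Values assumed from the config above.
const pool = new Pool({
  host: 'localhost',
  port: 6432,  // PgBouncer, not 5432
  database: 'mydb',
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  max: 20,     // the app-side pool still applies, layered on PgBouncer's
});

// Caveat for transaction mode: each transaction may land on a
// different server connection, so session-level state (named prepared
// statements, SET commands, advisory locks) cannot be relied on
// between transactions.
```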
## Common Pitfalls

```typescript
// ❌ Forgetting to release the connection
async function badExample() {
  const client = await pool.connect();
  const result = await client.query('SELECT * FROM users');
  return result.rows; // Connection leaked!
}

// ✅ Always release in finally
async function goodExample() {
  const client = await pool.connect();
  try {
    const result = await client.query('SELECT * FROM users');
    return result.rows;
  } finally {
    client.release();
  }
}

// ❌ Holding a connection across slow, non-database work
async function holdingTooLong() {
  const client = await pool.connect();
  try {
    await client.query('SELECT * FROM users');
    await someSlowExternalAPI(); // Connection held during the API call!
    await client.query('UPDATE users SET ...');
  } finally {
    client.release();
  }
}

// ✅ Use pool.query() so no connection is held between operations
async function releaseBetween() {
  const users = await pool.query('SELECT * FROM users');
  await someSlowExternalAPI(); // No connection held
  await pool.query('UPDATE users SET ...');
}
```

## Graceful Shutdown
```typescript
process.on('SIGTERM', async () => {
  console.log('Shutting down...');

  // Stop accepting new requests
  server.close();

  // Wait for checked-out clients to be released, then close all connections
  await pool.end();

  console.log('Pool closed');
  process.exit(0);
});
```

## Conclusion
Connection pooling is essential for database performance at scale. Configure pool sizes based on your workload, monitor for exhaustion, and always release connections properly.
For high-traffic applications, consider an external pooler like PgBouncer between your app and database.