File Upload · Storage · S3 · Security

File Upload Handling Best Practices

Handle file uploads securely and efficiently. From validation to storage to processing and streaming.

Bootspring Team
Engineering
October 28, 2023
6 min read

File uploads introduce security risks and performance challenges. Here's how to handle them properly—from validation to storage to serving files.

Basic Upload with Multer#

```typescript
import multer from 'multer';
import path from 'path';
import crypto from 'crypto';

// Configure storage
const storage = multer.diskStorage({
  destination: (req, file, cb) => {
    cb(null, 'uploads/temp');
  },
  filename: (req, file, cb) => {
    const uniqueName = crypto.randomBytes(16).toString('hex');
    const ext = path.extname(file.originalname);
    cb(null, `${uniqueName}${ext}`);
  },
});

// File filter
const fileFilter = (req: Request, file: Express.Multer.File, cb: multer.FileFilterCallback) => {
  const allowedMimes = ['image/jpeg', 'image/png', 'image/webp', 'application/pdf'];

  if (allowedMimes.includes(file.mimetype)) {
    cb(null, true);
  } else {
    cb(new Error('Invalid file type'));
  }
};

const upload = multer({
  storage,
  fileFilter,
  limits: {
    fileSize: 10 * 1024 * 1024, // 10MB
    files: 5,
  },
});

// Single file upload
app.post('/upload', upload.single('file'), async (req, res) => {
  if (!req.file) {
    return res.status(400).json({ error: 'No file provided' });
  }

  const { filename, mimetype, size } = req.file;

  // Process and move to permanent storage
  const url = await processAndStore(req.file);

  res.json({ url, filename, mimetype, size });
});

// Multiple files
app.post('/upload-multiple', upload.array('files', 5), async (req, res) => {
  const files = req.files as Express.Multer.File[];
  const results = await Promise.all(files.map(processAndStore));
  res.json({ files: results });
});
```
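One gap worth closing: when Multer rejects a file (size limit, file count, or the filter's error), it passes the error to Express's error pipeline, so without an error-handling middleware clients receive a generic 500. A minimal sketch of such a handler — the `uploadErrorResponse` helper is hypothetical, but the `LIMIT_*` codes are the ones Multer sets on its `MulterError` instances:

```typescript
// Map an upload error to an HTTP status and client-safe message.
function uploadErrorResponse(err: { code?: string; message: string }): { status: number; error: string } {
  if (err.code === 'LIMIT_FILE_SIZE') return { status: 413, error: 'File exceeds the 10MB limit' };
  if (err.code === 'LIMIT_FILE_COUNT') return { status: 413, error: 'Too many files' };
  return { status: 400, error: err.message };
}

// Express error-handling middleware (the four-parameter signature is what
// marks it as an error handler):
// app.use((err, req, res, next) => {
//   const { status, error } = uploadErrorResponse(err);
//   res.status(status).json({ error });
// });
```

Register it after the upload routes so both the size-limit and file-filter errors surface with meaningful status codes.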

Direct Upload to S3#

```typescript
import { S3Client, PutObjectCommand, GetObjectCommand, HeadObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

const s3 = new S3Client({
  region: process.env.AWS_REGION,
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
  },
});

// Generate presigned URL for direct upload
app.post('/upload/presigned', authenticate, async (req, res) => {
  const { filename, contentType } = req.body;

  // Validate
  if (!isAllowedContentType(contentType)) {
    return res.status(400).json({ error: 'Invalid content type' });
  }

  const key = `uploads/${req.user.id}/${Date.now()}-${sanitizeFilename(filename)}`;

  const command = new PutObjectCommand({
    Bucket: process.env.S3_BUCKET,
    Key: key,
    ContentType: contentType,
    Metadata: {
      'user-id': req.user.id,
      'original-name': filename,
    },
  });

  const uploadUrl = await getSignedUrl(s3, command, { expiresIn: 300 });

  res.json({
    uploadUrl,
    key,
    expiresIn: 300,
  });
});

// Confirm upload
app.post('/upload/confirm', authenticate, async (req, res) => {
  const { key } = req.body;

  // Verify file exists and belongs to user.
  // HeadObject returns metadata without downloading the body.
  try {
    const command = new HeadObjectCommand({
      Bucket: process.env.S3_BUCKET,
      Key: key,
    });

    const response = await s3.send(command);

    if (response.Metadata?.['user-id'] !== req.user.id) {
      return res.status(403).json({ error: 'Unauthorized' });
    }

    // Save to database
    const file = await prisma.file.create({
      data: {
        key,
        bucket: process.env.S3_BUCKET,
        contentType: response.ContentType,
        size: response.ContentLength,
        userId: req.user.id,
      },
    });

    res.json({ file });
  } catch (error) {
    res.status(404).json({ error: 'File not found' });
  }
});
```
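For completeness, the browser side of this flow might look like the sketch below. The endpoint paths match the handlers above; the injected `fetchFn` parameter is an assumption made so the flow can be exercised without a real network (in a browser you would pass the global `fetch`):

```typescript
type FetchLike = (
  url: string,
  init?: { method?: string; headers?: Record<string, string>; body?: string | Blob }
) => Promise<{ ok: boolean; json(): Promise<any> }>;

// 1. Ask the server for a presigned URL,
// 2. PUT the file bytes directly to S3,
// 3. confirm with the server so the file is recorded.
async function uploadViaPresignedUrl(
  file: { name: string; type: string; body: Blob | string },
  fetchFn: FetchLike
): Promise<string> {
  const presignRes = await fetchFn('/upload/presigned', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ filename: file.name, contentType: file.type }),
  });
  const { uploadUrl, key } = await presignRes.json();

  const putRes = await fetchFn(uploadUrl, {
    method: 'PUT',
    headers: { 'Content-Type': file.type },
    body: file.body,
  });
  if (!putRes.ok) throw new Error('Direct upload failed');

  await fetchFn('/upload/confirm', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ key }),
  });
  return key;
}
```

The upload never touches your server; only the small presign and confirm requests do.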

Security Validation#

```typescript
import fileType from 'file-type';
import { createHash } from 'crypto';
import { createReadStream, promises as fs } from 'fs';

interface ValidationResult {
  valid: boolean;
  errors: string[];
  fileType?: { ext: string; mime: string };
  size?: number;
  hash?: string;
}

async function validateFile(filePath: string, expectedMime: string): Promise<ValidationResult> {
  const errors: string[] = [];

  // Check actual file type (not just extension)
  const type = await fileType.fromFile(filePath);

  if (!type) {
    errors.push('Could not determine file type');
    return { valid: false, errors };
  }

  if (type.mime !== expectedMime) {
    errors.push(`File type mismatch: expected ${expectedMime}, got ${type.mime}`);
  }

  // Check for malicious content in images
  if (type.mime.startsWith('image/')) {
    const isSafe = await scanImageForMalware(filePath);
    if (!isSafe) {
      errors.push('File failed security scan');
    }
  }

  // Check file size
  const stats = await fs.stat(filePath);
  if (stats.size > MAX_FILE_SIZE) {
    errors.push(`File too large: ${stats.size} bytes`);
  }

  return {
    valid: errors.length === 0,
    errors,
    fileType: type,
    size: stats.size,
    hash: await hashFile(filePath),
  };
}

async function hashFile(filePath: string): Promise<string> {
  return new Promise((resolve, reject) => {
    const hash = createHash('sha256');
    const stream = createReadStream(filePath);

    stream.on('data', (data) => hash.update(data));
    stream.on('end', () => resolve(hash.digest('hex')));
    stream.on('error', reject);
  });
}

// Sanitize filename
function sanitizeFilename(filename: string): string {
  return filename
    .replace(/[^a-zA-Z0-9._-]/g, '_')
    .replace(/_{2,}/g, '_')
    .substring(0, 255);
}
```
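To make concrete what `file-type` is doing under the hood, here is a toy sniffer for the formats the upload filter allows. The helper name `sniffMime` and the minimal signature set are illustrative; the real library checks far more formats and edge cases, so prefer it in production:

```typescript
// Detect a MIME type from leading "magic bytes" rather than trusting the extension.
function sniffMime(bytes: Uint8Array): string | null {
  const startsWith = (sig: number[]) =>
    bytes.length >= sig.length && sig.every((b, i) => bytes[i] === b);

  if (startsWith([0x89, 0x50, 0x4e, 0x47])) return 'image/png';       // \x89PNG
  if (startsWith([0xff, 0xd8, 0xff])) return 'image/jpeg';            // JPEG SOI marker
  if (startsWith([0x25, 0x50, 0x44, 0x46])) return 'application/pdf'; // %PDF
  // WebP: "RIFF" container with "WEBP" at offset 8
  if (
    startsWith([0x52, 0x49, 0x46, 0x46]) && bytes.length >= 12 &&
    bytes[8] === 0x57 && bytes[9] === 0x45 && bytes[10] === 0x42 && bytes[11] === 0x50
  ) {
    return 'image/webp';
  }
  return null;
}
```

A renamed `malware.exe` uploaded as `photo.png` fails this check immediately, which is exactly why extension-based validation is not enough.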

Image Processing#

```typescript
import sharp from 'sharp';
import path from 'path';

async function processImage(
  inputPath: string,
  outputDir: string
): Promise<ProcessedImage> {
  const metadata = await sharp(inputPath).metadata();

  const variants: Array<{ name: string; width: number; height?: number }> = [
    { name: 'thumbnail', width: 150, height: 150 },
    { name: 'small', width: 320 },
    { name: 'medium', width: 640 },
    { name: 'large', width: 1280 },
  ];

  const results: Record<string, string> = {};

  for (const variant of variants) {
    const outputPath = path.join(outputDir, `${variant.name}.webp`);

    await sharp(inputPath)
      .resize(variant.width, variant.height, {
        fit: variant.height ? 'cover' : 'inside',
        withoutEnlargement: true,
      })
      .webp({ quality: 80 })
      .toFile(outputPath);

    results[variant.name] = outputPath;
  }

  // Create blur placeholder (encode as webp to match the data URL below)
  const placeholder = await sharp(inputPath)
    .resize(20)
    .blur()
    .webp()
    .toBuffer();

  return {
    original: inputPath,
    variants: results,
    placeholder: `data:image/webp;base64,${placeholder.toString('base64')}`,
    metadata: {
      width: metadata.width,
      height: metadata.height,
      format: metadata.format,
    },
  };
}
```
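The variants generated above map naturally onto a responsive `srcset`. A small sketch, assuming the variants have already been uploaded and their public URLs and widths are known (`buildSrcset` and its input shape are hypothetical helpers, not part of sharp):

```typescript
// Build an HTML srcset attribute value from image variants,
// sorted ascending by width as browsers expect.
function buildSrcset(variants: Array<{ url: string; width: number }>): string {
  return variants
    .slice()
    .sort((a, b) => a.width - b.width)
    .map((v) => `${v.url} ${v.width}w`)
    .join(', ');
}
```

The browser then picks the smallest variant that satisfies the display size, so the thumbnail/small/medium/large split above pays off directly in bandwidth.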

Streaming Large Files#

```typescript
// Stream upload to S3
import { Upload } from '@aws-sdk/lib-storage';
import { PassThrough, Readable } from 'stream';

app.post('/upload/stream', authenticate, async (req, res) => {
  const passThrough = new PassThrough();
  const key = `uploads/${req.user.id}/${Date.now()}`;

  const upload = new Upload({
    client: s3,
    params: {
      Bucket: process.env.S3_BUCKET,
      Key: key,
      Body: passThrough,
      ContentType: req.headers['content-type'],
    },
  });

  req.pipe(passThrough);

  try {
    const result = await upload.done();
    res.json({ key, location: result.Location });
  } catch (error) {
    res.status(500).json({ error: 'Upload failed' });
  }
});

// Stream download
app.get('/files/:id', authenticate, async (req, res) => {
  const file = await prisma.file.findUnique({
    where: { id: req.params.id },
  });

  if (!file || file.userId !== req.user.id) {
    return res.status(404).json({ error: 'Not found' });
  }

  const command = new GetObjectCommand({
    Bucket: file.bucket,
    Key: file.key,
  });

  const response = await s3.send(command);

  res.setHeader('Content-Type', file.contentType);
  res.setHeader('Content-Length', file.size);
  res.setHeader(
    'Content-Disposition',
    `attachment; filename="${file.originalName}"`
  );

  (response.Body as Readable).pipe(res);
});
```
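The download handler above always streams the whole object. To support resumable downloads or video scrubbing you also need to honor the `Range` request header. A minimal single-range parser might look like this (suffix and open-ended ranges per RFC 9110; multi-range requests are deliberately rejected):

```typescript
// Parse a single-range "Range: bytes=start-end" header against a known file size.
// Returns null for absent, malformed, or unsatisfiable ranges.
function parseRange(header: string | undefined, size: number): { start: number; end: number } | null {
  if (!header) return null;
  const m = /^bytes=(\d*)-(\d*)$/.exec(header.trim());
  if (!m || (m[1] === '' && m[2] === '')) return null;

  let start: number;
  let end: number;
  if (m[1] === '') {
    // Suffix range: "bytes=-500" means the last 500 bytes
    start = Math.max(0, size - Number(m[2]));
    end = size - 1;
  } else {
    start = Number(m[1]);
    end = m[2] === '' ? size - 1 : Math.min(Number(m[2]), size - 1);
  }
  if (start > end || start >= size) return null;
  return { start, end };
}
```

When a range parses, you can forward it via the `Range` parameter of `GetObjectCommand` and answer with status 206 plus a `Content-Range` header; S3 then streams only the requested bytes.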

Cleanup#

```typescript
import path from 'path';
import { promises as fs } from 'fs';

// Clean up temp files older than one hour
async function cleanupTempFiles(): Promise<void> {
  const tempDir = 'uploads/temp';
  const maxAge = 60 * 60 * 1000; // 1 hour

  const files = await fs.readdir(tempDir);
  const now = Date.now();

  for (const file of files) {
    const filePath = path.join(tempDir, file);
    const stats = await fs.stat(filePath);

    if (now - stats.mtimeMs > maxAge) {
      await fs.unlink(filePath);
      logger.info('Cleaned up temp file', { file });
    }
  }
}

// Run periodically
setInterval(cleanupTempFiles, 15 * 60 * 1000);
```

Best Practices#

Security:

✓ Validate file type by content, not extension
✓ Scan for malware
✓ Use presigned URLs for direct upload
✓ Never trust user-provided filenames
✓ Set appropriate file size limits

Performance:

✓ Use streaming for large files
✓ Process images asynchronously
✓ Generate variants/thumbnails
✓ Use CDN for serving files

Storage:

✓ Store files outside web root
✓ Use unique filenames (UUID)
✓ Organize by date/user
✓ Clean up temporary files
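The storage points above — unique filenames and organizing by date/user — combine into one small helper. `buildStorageKey` is a hypothetical name, and it assumes `crypto.randomUUID()` (available in Node 14.17+):

```typescript
import crypto from 'crypto';

// uploads/<user>/<yyyy>/<mm>/<uuid>.<ext> — user-scoped, date-partitioned,
// and collision-free regardless of the user-provided filename.
function buildStorageKey(userId: string, originalName: string, now = new Date()): string {
  const ext = (originalName.match(/\.([a-zA-Z0-9]+)$/)?.[1] ?? 'bin').toLowerCase();
  const yyyy = now.getUTCFullYear();
  const mm = String(now.getUTCMonth() + 1).padStart(2, '0');
  return `uploads/${userId}/${yyyy}/${mm}/${crypto.randomUUID()}.${ext}`;
}
```

Only the extension survives from the original name (and even that falls back to `.bin`), so path traversal attempts in the filename never reach the storage key.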

Conclusion#

File uploads require careful handling of security, storage, and performance. Validate thoroughly, store securely, and process asynchronously for the best user experience.

Never trust uploaded files—always validate content type and scan for threats.
