# Stop Running Puppeteer on Your Main Server: The Serverless Approach to Screenshots
2026-02-16
## Introduction

We've all been there. You're building a side project, maybe a link preview generator, an SEO tool, or a dashboard, and you think: "I just need to take a quick screenshot of this URL." So you `npm install puppeteer`, write ten lines of code, and it works locally. Great! Then you deploy it to production (Docker, Ubuntu, or Heroku), and all hell breaks loose.

I spent the last month fighting these battles while building a screenshot microservice. Here is what I learned about doing it the hard way, and why I eventually turned it into a dedicated API.

## The Trap: "It works on my machine"

Basic Puppeteer is deceptive. The ten-line script everyone starts with (the first snippet below) works fine for example.com. But run it against a modern Single Page Application (SPA) or a news site, and you will hit three major walls.

## Wall #1: The Lazy Loading Problem

Modern web performance relies on lazy loading: images only load when they enter the viewport. If you take a screenshot immediately after `page.goto`, you get a page full of placeholders.

The fix: simulate a user scrolling down, or wait for network activity to settle.

## Wall #2: The "Cookie Banner" Apocalypse

In 2026, the web is 50% content and 50% GDPR popups. A screenshot tool that captures the cookie banner is useless.

The fix: inject CSS or JS to remove these elements before the shutter clicks. But maintaining a list of selectors for every site on the internet? Impossible.

## Wall #3: Server Costs & Zombie Processes

Chromium is heavy, and running it in a standard container requires significant RAM. If your script crashes before `browser.close()` is called, you are left with "zombie" Chrome processes eating up your CPU until the server dies.

## The Solution: Going Serverless (AWS Lambda)

To solve the crashing and scaling issues, I moved the architecture to AWS Lambda, where every capture runs in its own short-lived environment. However, getting Puppeteer onto Lambda is tricky (binary sizes, font packages). I used `puppeteer-core` and `@sparticuz/chromium` to keep the package size under the 50MB limit.

## Introducing FlashCapture

After refining this architecture to handle ad-blocking, dark mode, and full-page stitching automatically, I realized it was too valuable to keep as a messy internal script. So I wrapped it into a clean, public API called FlashCapture, which handles all the edge cases mentioned above.

## Trying it out

If you are tired of maintaining your own Puppeteer instance, you can use the API directly via RapidAPI; there is a free tier for developers. Compared to 50 lines of hand-rolled Puppeteer, a capture is a single HTTP request (see the FlashCapture snippet below).

## Conclusion

If you are building a production app, think twice before running a headless browser on your primary web server. It's a resource hog that introduces security risks and stability issues. Whether you build your own microservice on AWS Lambda (like I did initially) or use a managed API like FlashCapture, decoupling this heavy task is the best architectural decision you can make.

👉 Check out FlashCapture on RapidAPI here
The basic script everyone starts with:

```javascript
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com');
  await page.screenshot({ path: 'example.png' });
  await browser.close();
})();
```
Waiting for network activity to settle (the lazy-loading fix):

```javascript
// Waiting for networkidle0 is reliable but SLOW (can take 10s+)
await page.goto(url, { waitUntil: 'networkidle0' });
```
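The other fix for lazy loading, simulating a user scrolling, can be sketched like this. The `autoScroll` helper, its step size, and its delay are illustrative assumptions, not code from the original service:

```javascript
// Hypothetical helper: scroll the page in steps so lazy-loaded
// images enter the viewport and start downloading.
async function autoScroll(page) {
  await page.evaluate(async () => {
    await new Promise((resolve) => {
      let scrolled = 0;
      const step = 250; // pixels per tick (tune per site)
      const timer = setInterval(() => {
        window.scrollBy(0, step);
        scrolled += step;
        if (scrolled >= document.body.scrollHeight) {
          clearInterval(timer);
          resolve();
        }
      }, 100); // ms between ticks
    });
  });
  // Return to the top so the capture starts at the header
  await page.evaluate(() => window.scrollTo(0, 0));
}
```

Call it after `page.goto` and before `page.screenshot`; it is faster than `networkidle0` on pages with long-polling connections that never go idle.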
Injecting CSS to hide common consent banners:

```javascript
await page.addStyleTag({
  content: '#onetrust-banner-sdk, .cookie-popup { display: none !important; }'
});
```
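For the zombie-process problem, the defensive pattern is to wrap the whole capture in `try`/`finally` so Chromium is always released. A minimal sketch; the injected `launchBrowser` parameter (e.g. `() => puppeteer.launch()`) is purely illustrative, not part of any real API:

```javascript
// Sketch: guarantee browser.close() runs even when navigation
// or the screenshot throws, so no zombie Chrome processes linger.
async function capture(launchBrowser, url) {
  const browser = await launchBrowser();
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: 'networkidle0' });
    return await page.screenshot({ fullPage: true });
  } finally {
    await browser.close(); // runs on success AND on error
  }
}
```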
Calling the FlashCapture API via RapidAPI:

```javascript
const axios = require('axios');

const options = {
  method: 'POST',
  url: 'https://flashcapture.p.rapidapi.com/capture',
  headers: {
    'content-type': 'application/json',
    'X-RapidAPI-Key': 'YOUR_API_KEY',
    'X-RapidAPI-Host': 'flashcapture.p.rapidapi.com'
  },
  data: {
    url: 'https://www.reddit.com',
    options: {
      fullPage: true,
      darkMode: true, // Automagically renders in dark mode
      width: 1920
    }
  }
};

try {
  const response = await axios.request(options);
  console.log("Job ID:", response.data.id);
  // Then just poll the /status endpoint to get your image!
} catch (error) {
  console.error(error);
}
```
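Because the capture runs asynchronously, the client then polls for the result. A sketch of that loop; the `fetchStatus` helper and the response shape (`status`, `imageUrl`) are my assumptions, not the documented FlashCapture contract:

```javascript
// Poll until the screenshot job finishes or we give up.
// `fetchStatus` is a hypothetical function that GETs the job's
// /status endpoint and returns the parsed JSON body.
async function waitForScreenshot(fetchStatus, { retries = 30, delayMs = 2000 } = {}) {
  for (let attempt = 0; attempt < retries; attempt++) {
    const job = await fetchStatus();
    if (job.status === 'completed') return job.imageUrl; // assumed field names
    if (job.status === 'failed') throw new Error('Screenshot job failed');
    await new Promise((resolve) => setTimeout(resolve, delayMs)); // wait before retrying
  }
  throw new Error('Timed out waiting for screenshot');
}
```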
What happens in production:

- The fonts are broken (rectangles instead of text).
- The memory usage spikes to 2GB and crashes your server.
- The target website shows a giant "Accept Cookies" banner covering the content.
- Half the images are missing because of lazy loading.

What the serverless setup ensures:

- Each screenshot gets a fresh, isolated environment.
- If it crashes, it doesn't take down my main server.
- I only pay when a screenshot is taken.

What FlashCapture handles:

- ✅ Smart Ad-Blocker: Automatically hides banners and trackers.
- ✅ Async Processing: No HTTP timeouts on large pages.
- ✅ Lazy Loading Support: We handle the wait logic.
Tags: how-to, tutorial, guide, dev.to, ai, ubuntu, server, network, docker