# How I Built a 24/7 Automation Bot That Runs on a $5 VPS (2026)

No cloud functions. No serverless complexity. Just a Node.js process that's been running for 6 months straight.

## The Problem

Like many developers, I had a list of repetitive tasks that needed doing every day:

- Check GitHub repos for new issues I could contribute to
- Monitor my pull requests for review comments
- Send daily status reports to my team
- Scan websites for price changes or content updates
- Push notifications when something important happened

I tried everything: cron jobs on my laptop (they only work while it's on), GitHub Actions (great, but limited to repo events), Zapier (expensive at scale), and even scheduled Lambda functions (overkill for simple tasks).

What I really wanted was a persistent background worker that could:

- Run 24/7 without me thinking about it
- Interact with web pages like a real browser
- Send messages to multiple platforms (Slack, Discord, email)
- Execute code and shell commands
- Cost less than a cup of coffee per month

Here's how I built it.

## The Stack

Total cost: ~$5/month (Vultr / DigitalOcean / Lightsail basic tier).

```text
┌─────────────────────────────────┐
│         $5 VPS (Ubuntu)         │
│                                 │
│  ┌───────────┐   ┌────────────┐ │
│  │  Node.js  │   │ Puppeteer/ │ │
│  │  Process  │──▶│ Playwright │ │
│  │ (Gateway) │   │ (Browser)  │ │
│  └─────┬─────┘   └────────────┘ │
│        │                        │
│  ┌─────▼──────────────────────┐ │
│  │       Cron Scheduler       │ │
│  │ (every 5min / 30min / 1h)  │ │
│  └─────┬──────────────────────┘ │
│        │                        │
│  ┌─────▼──────────────────────┐ │
│  │      Message Routing       │ │
│  │ → Feishu / Telegram / Slack│ │
│  └────────────────────────────┘ │
└─────────────────────────────────┘
```

## Step 1: The Core Loop

The heart of the system is a simple scheduler:

```js
const cron = require('node-cron');

// Run every 5 minutes — check for urgent stuff
cron.schedule('*/5 * * * *', async () => {
  await checkUrgentAlerts();
});

// Run every 30 minutes — routine monitoring
cron.schedule('*/30 * * * *', async () => {
  await monitorGitHubPRs();
  await scanForNewTasks();
});

// Run every hour — reports and summaries
cron.schedule('0 * * * *', async () => {
  await sendHourlyDigest();
});
```

Nothing fancy. node-cron is lightweight, battle-tested, and does exactly what it says.

Pro tip: use fs.writeFileSync to persist your last-run timestamp. If your process crashes and restarts, you don't want to re-fire jobs that already ran:

```js
const fs = require('fs');

function shouldRun(jobName, intervalMinutes) {
  const stateFile = `./cron-state/${jobName}.json`;
  const now = Date.now();
  try {
    const { lastRun } = JSON.parse(fs.readFileSync(stateFile));
    if (now - lastRun < intervalMinutes * 60 * 1000) return false;
  } catch { /* first run */ }
  fs.writeFileSync(stateFile, JSON.stringify({ lastRun: now }));
  return true;
}
```

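
One gotcha the schedule above doesn't cover: node-cron fires a job again even if the previous run is still in flight, which is easy to hit with slow browser tasks. Here's a small guard I'd sketch for that; `runExclusive` is a hypothetical helper of mine, not a node-cron API:

```javascript
// Sketch: skip a scheduled run while the previous run of the same job
// is still active. `runExclusive` is an assumed helper name, not part
// of node-cron; node-cron itself does not await async callbacks.
const running = new Set();

async function runExclusive(name, job) {
  if (running.has(name)) return false; // previous run still active: skip
  running.add(name);
  try {
    await job();
    return true;
  } finally {
    running.delete(name);
  }
}

// Usage with the scheduler above:
// cron.schedule('*/5 * * * *', () => runExclusive('alerts', checkUrgentAlerts));
```
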
## Step 2: Browser Automation

The real power comes from browser automation. I use Playwright (though Puppeteer works too):

```js
const { chromium } = require('playwright');

async function checkWebsite(url, selector) {
  const browser = await chromium.launch({ headless: true });
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: 'networkidle' });

    // Check if a specific element exists (price drop? new item?)
    const element = await page.$(selector);
    if (element) {
      const text = await element.innerText();
      await notify(`Found match on ${url}: ${text}`);
    }
    return !!element;
  } finally {
    await browser.close(); // always close, even when the check throws
  }
}
```

Why not just use fetch? Because half the modern web is JavaScript-rendered. If you need to click buttons, fill forms, or wait for lazy-loaded content, you need a real browser.

Memory tip: always await browser.close(). A leaked browser instance eats ~100-200MB of RAM. On a $5 VPS with 1GB of RAM, that adds up fast. I learned this the hard way after crashing my server 3 times in the first week.

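
Pages also fail for boring reasons (slow CDNs, transient 502s), so I'd wrap checks in a small retry helper. A minimal sketch; `withRetry` and its defaults are my own choices, not a Playwright feature:

```javascript
// Sketch: retry a flaky async check with exponential backoff.
// The attempt count and base delay are arbitrary defaults.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function withRetry(fn, attempts = 3, baseDelayMs = 1000) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (i < attempts - 1) await sleep(baseDelayMs * 2 ** i); // 1s, 2s, 4s...
    }
  }
  throw lastErr; // all attempts failed
}

// Usage: await withRetry(() => checkWebsite(url, '.price-drop'));
```
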
## Step 3: Multi-Platform Messaging

The bot needs to talk to humans. Here's my lightweight approach using webhooks:

```js
async function sendFeishuMessage(webhookUrl, text) {
  await fetch(webhookUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ msg_type: 'text', content: { text } })
  });
}

async function sendTelegramMessage(botToken, chatId, text) {
  await fetch(`https://api.telegram.org/bot${botToken}/sendMessage`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ chat_id: chatId, text, parse_mode: 'Markdown' })
  });
}
```

Each platform has its own webhook/API pattern, but they're all just HTTP POST requests under the hood.

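
The `notify()` used throughout is essentially a fan-out over senders like these. One possible sketch of it (the shape is my assumption, not shown in the original), with failure isolation so one dead webhook can't silence the other channels:

```javascript
// Sketch: fan a message out to every configured channel and count
// successful deliveries. The channel functions are stand-ins for
// senders like sendFeishuMessage / sendTelegramMessage above.
async function notifyAll(channels, text) {
  const results = await Promise.allSettled(channels.map((send) => send(text)));
  for (const r of results) {
    if (r.status === 'rejected') console.error('channel failed:', r.reason);
  }
  return results.filter((r) => r.status === 'fulfilled').length;
}

// Usage (tokens from the environment, per the lessons below):
// const channels = [
//   (text) => sendTelegramMessage(process.env.TG_TOKEN, process.env.TG_CHAT, text),
// ];
// await notifyAll(channels, 'deploy finished');
```
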
## Step 4: GitHub Integration

Since I do open source work, monitoring PRs is critical:

```js
async function monitorPRs() {
  // Using the GitHub REST API (no library needed)
  const response = await fetch(
    'https://api.github.com/repos/:owner/:repo/pulls?state=open&per_page=30',
    {
      headers: {
        'Authorization': `token ${process.env.GITHUB_TOKEN}`,
        'User-Agent': 'automation-bot'
      }
    }
  );
  const prs = await response.json();

  for (const pr of prs) {
    const lastCheck = getLastCheckTime(pr.id);

    // Check for new comments/reviews since last check
    const comments = await fetch(pr.comments_url, {
      headers: { 'Authorization': `token ${process.env.GITHUB_TOKEN}` }
    }).then(r => r.json());

    const newComments = comments.filter(c =>
      new Date(c.created_at) > new Date(lastCheck)
    );

    if (newComments.length > 0) {
      await notify(`PR #${pr.number} has ${newComments.length} new comment(s)!`);
    }

    updateCheckTime(pr.id);
  }
}
```

Rate limit warning: GitHub allows 5,000 authenticated requests/hour. Checking 30 PRs every 30 minutes is 48 runs a day, roughly 1,500 requests in total, so you won't hit limits unless you're doing something much bigger.

## Step 5: Making It Resilient

A 24/7 process WILL crash. Here's my survival kit:

### Auto-restart with systemd

```ini
# /etc/systemd/system/automation-bot.service
[Unit]
Description=Automation Bot
After=network.target

[Service]
Type=simple
User=ubuntu
WorkingDirectory=/home/ubuntu/bot
ExecStart=/usr/bin/node index.js
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
```

```bash
sudo systemctl enable automation-bot
sudo systemctl start automation-bot
# Now it auto-restarts on crash!
```

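
Since the tokens live in environment variables, systemd can inject them too: an `EnvironmentFile=` line in the `[Service]` section keeps secrets out of the unit file. A sketch; the path is an example:

```ini
# In the [Service] section of automation-bot.service
EnvironmentFile=/home/ubuntu/bot/.env
```
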
### Health Check Script

```bash
#!/bin/bash
# healthcheck.sh — run via system cron every 5 min
if ! pgrep -f "node index.js" > /dev/null; then
  echo "$(date): Bot is DOWN! Restarting..." >> /var/log/bot-health.log
  sudo systemctl restart automation-bot
fi

# Also check: is it actually responding?
curl -sf http://localhost:3000/health || {
  echo "$(date): Bot not responding!" >> /var/log/bot-health.log
  sudo systemctl restart automation-bot
}
```

### Memory Watchdog

```bash
#!/bin/bash
# Check memory usage, restart if > 80%
MEM_PCT=$(free | awk '/Mem/{printf "%.0f", $3/$2*100}')
if [ "$MEM_PCT" -gt 80 ]; then
  echo "$(date): Memory at ${MEM_PCT}%! Restarting..." >> /var/log/bot-health.log
  sudo systemctl restart automation-bot
fi
```

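
Both watchdog scripts run from the system crontab rather than from the bot itself, so they keep firing even when the bot is dead. Roughly, via `crontab -e` (the paths are examples for wherever you saved the scripts):

```
# crontab entries — every 5 minutes
*/5 * * * * /home/ubuntu/bot/healthcheck.sh
*/5 * * * * /home/ubuntu/bot/memcheck.sh
```
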
## What This Bot Actually Does For Me

After 6 months of running, it handles the whole list from "The Problem" automatically: issue triage, PR monitoring, status reports, site checks, and notifications. That's basically a full-time job's worth of work, automated for $5/month.

## Things I Learned the Hard Way

- **Log everything.** When something breaks at 3 AM, you need to know WHAT was happening, not just THAT it broke. Structured JSON logs > console.log.
- **Don't poll too aggressively.** Every API has rate limits. Start conservative (every 5-15 min) and speed up only when you need to.
- **Graceful shutdown.** Handle SIGTERM so you can save state before dying:

  ```js
  process.on('SIGTERM', async () => {
    console.log('Shutting down gracefully...');
    await saveState();
    process.exit(0);
  });
  ```

- **Secrets belong in environment variables.** Never hardcode tokens. Use a .env file (gitignored) or a secrets manager.
- **Start small.** My first version did ONE thing: check one GitHub repo every hour. Once that worked reliably for a week, I added another task. Rinse and repeat.
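
The "log everything" point in practice: a structured logger can start as a dozen lines. This sketch is my own shape (libraries like pino or winston work too if you want levels, transports, and redaction):

```javascript
// Sketch: minimal structured JSON logging, one object per line.
// Easy to grep at 3 AM, easy to ship to an aggregator later.
function logEvent(level, msg, fields = {}) {
  const entry = { ts: new Date().toISOString(), level, msg, ...fields };
  console.log(JSON.stringify(entry));
  return entry;
}

// logEvent('info', 'job finished', { job: 'monitorPRs', tookMs: 812 });
```
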

## The Bigger Picture

This isn't just about saving time. It's about building a personal automation infrastructure that grows with you. Once you have a 24/7 bot that can browse the web, call APIs, and send messages, you can layer on top of it:

- Content generation (auto-publish blog posts)
- Price monitoring (track products across sites)
- Competitor analysis (scrape and compare)
- Customer support (auto-respond to common queries)
- Data pipelines (collect → transform → store)

The $5 VPS is just the beginning.

What are you automating? Drop a comment below; I'd love to hear what other developers are building. If you found this useful, follow me on DEV for more posts about practical automation and building things that pay for themselves.
