# Tools: How I Built an Email-to-Google Sheets Automation Using GitHub Copilot CLI

Source: Dev.to

*This is a submission for the GitHub Copilot CLI Challenge.*

## What I Built

EmailToGSheets is an automated email processing system that reads emails from an IMAP server, extracts structured data using AI, and syncs it to Google Sheets in real time. It's a production-grade solution for organizations that need to convert email communications into actionable spreadsheet data with automatic SLA (Service Level Agreement) tracking.

## The Problem It Solves

Many organizations receive structured data through email (support requests, forms, notifications) but need this data in spreadsheets for analysis, tracking, or integration with other systems. Manually copying email data into spreadsheets is time-consuming and error-prone. EmailToGSheets automates this entire workflow, ensuring data is always up to date, properly categorized, and enriched with SLA metrics for support-team performance tracking.

Repository: wira-works/emailtogsheets

## My Experience with GitHub Copilot CLI

GitHub Copilot CLI transformed how I built this project. It wasn't just about writing code in the IDE: the `gh copilot` command became my go-to tool for generating complex shell commands, understanding Docker operations, and automating deployment workflows.
## Architecture Overview

```
Email Server (IMAP)
        ↓
EmailReader + AI Analyzer + SLA Calculator
        ↓
CSV Writer + Cache
        ↓
Google Sheets API (Batch Updates)
        ↓
Live Spreadsheet with SLA Tracking
```

## Quick Start Commands

```bash
# Local Operations
npm start              # Run email processor
npm run schedule       # Start scheduler (30-min interval)
npm run token:check    # Check OAuth token status
npm run token:refresh  # Force token refresh

# Health Monitoring
npm run health         # Complete system dashboard

# VPS Deployment
npm run sync:vps       # Bidirectional sync with VPS
npm run deploy:setup   # Interactive deployment wizard
```

## 1. Docker Compose with Health Checks & Resource Limits

What I asked GitHub Copilot CLI:

```bash
gh copilot suggest "create docker compose for nodejs app with health checks, resource limits, and persistent volumes for logs output and data"
```

What Copilot CLI generated:

```yaml
services:
  email-scheduler:
    image: email-to-gsheets:latest
    container_name: email-scheduler-app
    restart: unless-stopped
    volumes:
      - ./data:/app/data
      - ./output:/app/output
      - ./logs:/app/logs
      - ./token.json:/app/token.json
      - ./credentials.json:/app/credentials.json
    deploy:
      resources:
        limits:
          cpus: '1'
          memory: 512M
        reservations:
          cpus: '0.5'
          memory: 256M
    healthcheck:
      test: ["CMD", "node", "-e", "require('fs').accessSync('./token.json'); console.log('healthy')"]
      interval: 60s
      timeout: 10s
      retries: 3
      start_period: 30s
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "5"
```

Impact: Copilot CLI suggested production-grade patterns I hadn't considered: resource reservations, proper logging limits, and a clever health check that verifies the OAuth token file exists. This saved me hours of Docker documentation reading.
## 2. VPS File Synchronization with SSH

What I asked GitHub Copilot CLI:

```bash
gh copilot suggest "bidirectional file sync between local and remote server using scp with custom port 49222"
```

What Copilot CLI generated:

```bash
# Sync local to VPS
scp -P 49222 -r "$LOCAL_PATH/output/" "$VPS_USER@$VPS_HOST:$VPS_PATH/output/"
scp -P 49222 -r "$LOCAL_PATH/data/" "$VPS_USER@$VPS_HOST:$VPS_PATH/data/"
scp -P 49222 "$LOCAL_PATH/token.json" "$VPS_USER@$VPS_HOST:$VPS_PATH/token.json"

# Sync VPS to local
scp -P 49222 -r "$VPS_USER@$VPS_HOST:$VPS_PATH/output/" "$LOCAL_PATH/output/"
scp -P 49222 -r "$VPS_USER@$VPS_HOST:$VPS_PATH/data/" "$LOCAL_PATH/data/"
```

Impact: Perfect SSH file transfer syntax with custom ports on the first try. Copilot CLI understood I needed bidirectional sync and structured the script accordingly.

## 3. Docker Container Health Monitoring

What I asked GitHub Copilot CLI:

```bash
gh copilot suggest "check if docker container is running and show its status with custom format"
```

What Copilot CLI generated:

```bash
docker ps --filter 'name=email-scheduler' --format 'table {{.Names}}\t{{.Status}}\t{{.Mounts}}'
```

Then I needed to understand the cleanup command:

```bash
gh copilot explain "docker image prune -f --filter 'until=72h'"
```

Copilot CLI explained: this command removes unused Docker images created more than 72 hours ago. The `-f` flag forces removal without a confirmation prompt, and `--filter 'until=72h'` restricts pruning to images older than 72 hours.

Impact: Learned proper Docker filtering syntax and automated cleanup for production environments.
## 4. AI-Powered Email Categorization Logic

What I asked GitHub Copilot (IDE):

```javascript
// Create a function to categorize emails using Hugging Face API with fallback to pattern matching
```

What Copilot generated:

```javascript
async analyzeEmailContent(subject, textBody) {
  try {
    const emailContent = `Subject: ${subject}\n\nContent: ${textBody}`.substring(0, 1000);

    let analysis;
    try {
      // Try AI analysis first
      analysis = await this.generateAnalysis(emailContent, subject);
    } catch (aiError) {
      console.warn(`⚠️ AI Analysis failed, using enhanced pattern matching`);
      // Fallback to pattern-based analysis
      const detailedAnalysis = this.getDetailedAnalysis(subject, textBody);
      analysis = detailedAnalysis.analysis;
    }

    return analysis;
  } catch (error) {
    // Final fallback to simple keyword-based analysis
    return this.fallbackAnalysis(subject, textBody);
  }
}

fallbackCategorization(content) {
  const contentLower = content.toLowerCase();

  // MAINTENANCE SUPPORT - Technical activities
  if (
    contentLower.includes('defect') ||
    contentLower.includes('fixing') ||
    (contentLower.includes('migrasi') && contentLower.includes('dokumen')) ||
    contentLower.includes('migration') ||
    (contentLower.includes('inject') && contentLower.includes('data'))
  ) {
    return 'Maintenance Support';
  }
  // ... more categories
}
```

Impact: Copilot suggested a three-tier fallback strategy (AI → Pattern → Keywords) and even included Indonesian-language patterns for my business context.
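The keyword fallback is easy to exercise on its own. Here is a standalone version you can run directly; the maintenance rules come from the snippet above, while the `Meeting`/`Bugs` rules, the default bucket, and the sample subjects are illustrative assumptions of mine, not repo code:

```javascript
// Standalone sketch of the keyword-based fallback categorizer.
// Category names follow the article; rules past 'Maintenance Support'
// are assumed for illustration.
function fallbackCategorize(content) {
  const c = content.toLowerCase();
  if (
    c.includes('defect') ||
    c.includes('fixing') ||
    (c.includes('migrasi') && c.includes('dokumen')) ||
    c.includes('migration') ||
    (c.includes('inject') && c.includes('data'))
  ) {
    return 'Maintenance Support';
  }
  if (c.includes('meeting')) return 'Meeting';                 // assumed rule
  if (c.includes('bug') || c.includes('error')) return 'Bugs'; // assumed rule
  return 'Operation Support';                                  // assumed default
}

console.log(fallbackCategorize('Re: fixing defect on export module')); // 'Maintenance Support'
console.log(fallbackCategorize('Permintaan meeting minggu depan'));    // 'Meeting'
```

Keeping this tier as pure string logic means it works even when both the AI call and the detailed pattern analysis are unavailable.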
## 5. SLA Calculation with Email Threading

What I asked GitHub Copilot (IDE):

```javascript
// Calculate SLA response time between customer email and helpdesk reply, considering email threading with In-Reply-To headers
```

What Copilot generated:

```javascript
calculateSLA() {
  console.log('\n⏱️ Calculating SLA (Response Time)...');

  const customerEmails = this.emails.filter(email => !email.isHelpdeskReply);
  const helpdeskReplies = this.emails.filter(email => email.isHelpdeskReply);

  customerEmails.forEach(customerEmail => {
    // Find helpdesk reply using RFC 5322 headers
    const reply = helpdeskReplies.find(helpdeskEmail => {
      const referencesCustomerEmail =
        helpdeskEmail.inReplyTo === customerEmail.messageId ||
        (helpdeskEmail.references && helpdeskEmail.references.includes(customerEmail.messageId));

      const subjectMatch =
        (helpdeskEmail.subject.toLowerCase().includes('re:') ||
         helpdeskEmail.subject.toLowerCase().includes('bls:')) &&
        helpdeskEmail.subject.toLowerCase().includes(
          customerEmail.subject.toLowerCase().replace('re:', '').replace('bls:', '').trim()
        );

      return referencesCustomerEmail || subjectMatch;
    });

    if (reply) {
      const diffHours = (new Date(reply.date) - new Date(customerEmail.date)) / (1000 * 60 * 60);
      customerEmail.responseTimeHours = diffHours.toFixed(2);
      customerEmail.slaStatus = diffHours <= 24 ? 'Met (< 24h)' : 'Exceeded';
    }
  });
}
```

Impact: Copilot taught me about RFC 5322 email headers (In-Reply-To, References) and suggested bilingual subject matching (English "Re:" + Indonesian "Bls:"), which was critical for my use case.
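One refinement worth noting: the `replace('re:', '').replace('bls:', '')` calls above each strip only the first occurrence, so a stacked subject like "Re: Bls: ..." is only partially normalized. Pulled out as a helper (my refactor, not code from the repo), the normalization can strip prefixes repeatedly:

```javascript
// Normalize a subject for thread matching by repeatedly stripping
// reply prefixes: English "Re:" and Indonesian "Bls:".
function normalizeSubject(subject) {
  let s = subject.toLowerCase().trim();
  while (/^(re|bls)\s*:\s*/.test(s)) {
    s = s.replace(/^(re|bls)\s*:\s*/, '');
  }
  return s.trim();
}

// A reply matches when it carries a reply prefix and its normalized
// subject contains the normalized original subject.
function isReplyTo(replySubject, originalSubject) {
  const reply = replySubject.toLowerCase();
  const hasPrefix = reply.includes('re:') || reply.includes('bls:');
  return hasPrefix &&
    normalizeSubject(replySubject).includes(normalizeSubject(originalSubject));
}

console.log(normalizeSubject('Bls: Re: Error saat login')); // 'error saat login'
```

The header-based match (`In-Reply-To`/`References`) remains the reliable path; this only hardens the subject fallback.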
## 6. Google Sheets Batch Updates

What I asked GitHub Copilot (IDE):

```javascript
// Batch update specific columns in Google Sheets for SLA data without overwriting the entire row
```

What Copilot generated:

```javascript
async updateSLAColumnsInSheet(slaUpdates, headers) {
  const requests = [];
  const slaIndices = this.getSLAColumnIndices(headers);

  for (const update of slaUpdates) {
    const rowIndex = update.rowIndex;
    const colStartIndex = slaIndices.replyDate;
    const colEndIndex = slaIndices.slaStatus;

    // Convert column index to A1 notation
    const startCol = String.fromCharCode(65 + colStartIndex);
    const endCol = String.fromCharCode(65 + colEndIndex);
    const range = `${this.sheetName}!${startCol}${rowIndex}:${endCol}${rowIndex}`;

    const slaValues = [[
      update.replyDate,
      update.helpdeskRepliedBy,
      update.responseTime,
      update.responseTimeHours,
      update.slaStatus
    ]];

    requests.push({ range, values: slaValues });
  }

  // Batch update all at once
  await this.sheets.spreadsheets.values.batchUpdate({
    spreadsheetId: this.spreadsheetId,
    resource: {
      data: requests,
      valueInputOption: 'RAW'
    }
  });
}
```

Impact: Copilot suggested batch updates instead of individual API calls (10x faster) and generated the A1-notation conversion that I would have struggled with.
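A caveat on that A1 conversion: `String.fromCharCode(65 + index)` only covers columns A through Z. If the SLA columns ever sit past column Z, a general conversion is needed; here is one (my addition, not code from the repo):

```javascript
// Convert a 0-based column index to A1 column letters.
// Handles multi-letter columns: 0 -> 'A', 25 -> 'Z', 26 -> 'AA', ...
function columnToA1(index) {
  let n = index + 1; // work in 1-based "bijective base-26"
  let letters = '';
  while (n > 0) {
    const rem = (n - 1) % 26;
    letters = String.fromCharCode(65 + rem) + letters;
    n = Math.floor((n - 1) / 26);
  }
  return letters;
}

console.log(columnToA1(0));  // 'A'
console.log(columnToA1(25)); // 'Z'
console.log(columnToA1(26)); // 'AA'
```

Dropping this in for the `startCol`/`endCol` lines keeps the range building correct for any sheet width.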
## 7. Deployment Script with Automatic Rollback

What I asked GitHub Copilot CLI:

```bash
gh copilot suggest "bash script to pull docker image from ghcr, update docker-compose, start container, and rollback on failure"
```

What Copilot CLI generated:

```bash
# Backup current config
cp docker-compose.yml docker-compose.yml.backup

# Pull new image
docker pull "${IMAGE_NAME}:${TAG}"

# Update docker-compose.yml
sed -i "s|image: .*|image: ${IMAGE_NAME}:${TAG}|g" docker-compose.yml

# Start new container with error handling
if docker-compose up -d; then
    log_success "Container started"
    rm -f docker-compose.yml.backup
else
    log_error "Failed - Rolling back"
    mv docker-compose.yml.backup docker-compose.yml
    docker-compose up -d
    exit 1
fi

# Clean up old images (>72 hours)
docker image prune -f --filter "until=72h"
```

Impact: A production-grade deployment pattern with automatic rollback that prevented several outages during testing. The cleanup command keeps disk space under control.
## 8. Understanding Complex Shell Commands

Throughout the project, I frequently used `gh copilot explain` to understand commands:

```bash
gh copilot explain "ssh -q -p 49222 user@host 'docker ps -a --filter name=app' 2>/dev/null"
```

Copilot CLI explained: connects to the remote host via SSH on port 49222 in quiet mode, runs `docker ps` to list all containers filtered by name, and suppresses error output.

```bash
gh copilot explain "sed -i 's|image: .*|image: new-image:tag|g' docker-compose.yml"
```

Copilot CLI explained: in-place edit of docker-compose.yml, replacing the entire image line with the new image tag, using pipe delimiters.

Impact: Instead of googling syntax, I got instant, context-aware explanations right in the terminal.

## Overall Development Impact

- GitHub Copilot CLI for Commands: `gh copilot suggest` became my shell-command generator, eliminating constant documentation lookups and Stack Overflow searches
- GitHub Copilot IDE for Logic: perfect for complex business logic like SLA calculations, AI integration, and API interactions
- Learning Accelerator: taught me RFC 5322 email headers, Docker best practices, A1 notation, and production deployment patterns
- Context-Aware Suggestions: understood my business context (Indonesian language, SLA tracking, email threading) and suggested appropriate solutions
- Production Quality: suggested error handling, fallback strategies, resource limits, and rollback mechanisms I wouldn't have thought of

Would I build this without Copilot? Probably not. The combination of email protocol handling, AI integration, SLA calculations, and VPS deployment would have felt overwhelming. Copilot made it approachable by breaking down complexity into manageable, well-structured code.
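One recurring theme above is OAuth token freshness: the container health check, `npm run token:check`, and the periodic refresh all hinge on deciding when a token is about to expire. The repo's OAuth2TokenManager isn't shown in this post, but its core decision might look like this (the `expiry_date` field in milliseconds matches the googleapis token format; the 5-minute safety margin is my assumption):

```javascript
// Decide whether an OAuth2 token should be refreshed ahead of use.
// Refresh when the token is missing, has no expiry, or expires within
// the safety margin.
function tokenNeedsRefresh(token, now = Date.now(), marginMs = 5 * 60 * 1000) {
  if (!token || !token.expiry_date) return true; // no expiry info: refresh to be safe
  return token.expiry_date - now < marginMs;
}

const soon  = { expiry_date: Date.now() + 60 * 1000 };      // expires in 1 minute
const fresh = { expiry_date: Date.now() + 60 * 60 * 1000 }; // expires in 1 hour
console.log(tokenNeedsRefresh(soon));  // true
console.log(tokenNeedsRefresh(fresh)); // false
```

Refreshing ahead of expiry, rather than on failure, keeps the 30-minute scheduler runs from ever hitting an invalid-token error mid-batch.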
## Key Features

- Smart Email Processing: automatically fetches and filters emails from specific senders using IMAP
- AI-Powered Analysis: integrates Hugging Face AI to analyze and categorize email content into business categories (Bugs, Operation Support, Meeting, Maintenance Support)
- SLA Tracking: automatically calculates response times between customer emails and helpdesk replies with 24-hour SLA monitoring
- Google Sheets Integration: direct synchronization to Google Sheets with automatic OAuth2 token management and batch updates
- Smart Ticket System: automatic ticket ID generation with date-based sequencing and thread detection
- Automated Scheduling: built-in cron scheduler for regular processing (every 30 minutes)
- Fallback System: manual CSV processing mode when IMAP is blocked by firewalls
- Docker Deployment: containerized application with health checks and persistent volumes
- Token Management: automatic OAuth2 token refresh with a dedicated CLI tool for health checks
- VPS Deployment: CI/CD pipeline with GitHub Actions for automated deployment
- Smart Caching: email cache system to avoid reprocessing and optimize performance

## Key Components

- EmailReader (lib/emailReader.js): IMAP connection, email filtering, and SLA calculation
- AIAnalyzer (lib/aiAnalyzer.js): AI-powered content extraction with fallback pattern matching
- GoogleSheetsIntegration (lib/googleSheets.js): OAuth2 authentication, batch updates, and SLA column management
- EmailCache (lib/emailCache.js): ticket metadata tracking and SLA calculation caching
- OAuth2TokenManager (lib/oauth2TokenManager.js): automatic token refresh system
- Scheduler (scheduler.js): cron-based automation with configurable intervals

## Production Features

- Automated token refresh every 30 minutes
- Email caching to prevent duplicate processing
- SLA tracking with 24-hour threshold monitoring
- AI email categorization with a 3-tier fallback strategy
- Comprehensive logging system
- Health check dashboard
- Backup and disaster recovery scripts
- SSH-based VPS synchronization
- Docker deployment with GitHub Container Registry (GHCR)

## Key Takeaways

Most Valuable Copilot CLI Commands:

- `gh copilot suggest` - generated 90% of my shell scripts
- `gh copilot explain` - explained complex commands instantly
- Saved me from context-switching to browser documentation

Most Surprising Moment: when Copilot suggested using RFC 5322 email headers for thread detection. I had no idea these headers existed, but they solved my biggest problem perfectly.

Project Stats:

- Lines of Code: ~3,000+ (JavaScript + shell scripts)
- Docker: fully containerized
- Status: production ready
- Development Time: 8 hours (vs 29 hours estimated without Copilot)
- Key Dependencies: googleapis, node-cron, imap, axios, huggingface

Technologies: Node.js, JavaScript, Docker, Shell Scripting, Google APIs, IMAP, OAuth2, GitHub Actions, VPS Deployment, AI/ML (Hugging Face)