Supply Chain Security for Developers: Protecting Your CI/CD Pipeline in 2026
Contents:

- The Attack Surface
- 1. Lock Your Dependencies
  - Pin Everything, Hash Everything
  - Audit Dependencies Automatically
  - Block Typosquatting
- 2. Secure Your Build Pipeline
  - Hermetic Builds
  - Build Provenance with SLSA
  - Protect Build Secrets
- 3. Container Image Security
  - Sign and Verify Images
  - Enforce Image Policies
- 4. Runtime Integrity
- 5. Dependency Update Strategy
- The Minimum Viable Supply Chain Security
- The Cost of Ignoring This
The SolarWinds attack was the wake-up call. Log4Shell was the alarm. The XZ Utils backdoor was the fire drill. In 2026, supply chain attacks are the #1 vector for compromising software organizations, and most CI/CD pipelines are still wide open.

This isn't a theoretical risk. If an attacker compromises a single dependency in your build pipeline, they own every deployment downstream. Here's how to lock it down.

The Attack Surface

A typical CI/CD pipeline has more entry points than most developers realize:

```
Source Code  →  Build System  →  Dependencies  →  Container Images  →  Deployment
     ↑               ↑                ↑                  ↑                  ↑
Compromised     Build script     Typosquat          Base image        Stolen deploy
credentials      injection        packages           tampering         credentials
```

Each arrow is an attack vector. Let's secure them one by one; the code examples below walk through the fixes in the same order.

1. Lock Your Dependencies

The first line of defense: know exactly what you're running. Pin every dependency to an exact version and record its integrity hash. For JavaScript projects, the same rule applies: exact versions in package.json, the lockfile committed, and npm ci (never npm install) in CI. Then audit dependencies automatically on every pull request.

Typosquatting is the other half of the problem. The common attack is a near-miss name: reqeusts instead of requests, or colorsv2 mimicking colors.

2. Secure Your Build Pipeline

A hermetic build has no network access during compilation, which prevents build-time supply chain attacks. SLSA (Supply chain Levels for Software Artifacts) then provides a framework for build integrity; GitHub Actions supports generating SLSA Level 3 provenance. Finally, protect build secrets by authenticating with short-lived OIDC tokens instead of stored credentials.

3. Container Image Security

Sign your images, verify the signatures before deploying, and enforce the rule at admission time so unverified images can't run.

4. Runtime Integrity

Security doesn't end at deployment. Monitor for tampering at runtime.

5. Dependency Update Strategy

Keeping dependencies current is itself a security measure: stale dependencies accumulate vulnerabilities.

The Minimum Viable Supply Chain Security

Not ready for full SLSA compliance? Start with the checklist at the end of this post. Each step is incremental; you don't need to implement everything at once. But every step closes an attack vector that real adversaries are actively exploiting.

The Cost of Ignoring This

In 2025, the average cost of a supply chain compromise was $4.5M. The average time to detect: 277 days. The fixes described here cost nothing to implement and add minutes to your CI pipeline. The question isn't whether you can afford to secure your supply chain. It's whether you can afford not to.

What supply chain security measures has your team implemented? I'd love to hear about real-world experiences, especially the ones that caught actual attacks. Drop your stories in the comments.
```toml
# pyproject.toml — use exact versions + hashes
[project]
dependencies = [
    "fastapi==0.115.0",
    "uvicorn==0.32.0",
    "pydantic==2.10.0",
]

# Then lock with hashes:
# pip-compile --generate-hashes requirements.in -o requirements.txt
```
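Pinning only helps if nothing unpinned creeps back in. Here is a small CI guard (a sketch; the `unpinned` helper is hypothetical, not part of pip or pip-tools) that flags any requirement line lacking an exact `==` pin:

```python
import re


def unpinned(requirements: list[str]) -> list[str]:
    """Return requirement lines that are not exact '==' pins."""
    bad = []
    for line in requirements:
        line = line.split("#")[0].strip()  # drop comments and blank lines
        if not line:
            continue
        # name, optional extras, then an exact '==' pin
        if not re.match(r"^[A-Za-z0-9._-]+(\[[^\]]*\])?==\S+", line):
            bad.append(line)
    return bad
```

Fail the build whenever `unpinned(...)` returns a non-empty list.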
```shell
# Generate locked requirements with integrity hashes
pip-compile --generate-hashes requirements.in

# Output:
# fastapi==0.115.0 \
#     --hash=sha256:abc123... \
#     --hash=sha256:def456...
```
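When hashes are present, pip runs in hash-checking mode (you can force it with `pip install --require-hashes -r requirements.txt`) and rejects any download whose digest doesn't match. Conceptually the check is just this (a sketch, not pip's actual implementation):

```python
import hashlib


def matches_lockfile(artifact: bytes, allowed_sha256: set[str]) -> bool:
    """True if the artifact's SHA-256 digest appears in the lockfile.

    A package may legitimately have several hashes (one per wheel/sdist),
    so the lockfile records a set and any match is accepted.
    """
    return hashlib.sha256(artifact).hexdigest() in allowed_sha256
```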
```json
// package.json — use exact versions
{
  "dependencies": {
    "express": "4.21.0",
    "zod": "3.23.8"
  },
  "overrides": {}
}
```
```shell
# Always commit the lockfile.
# Use npm ci (not npm install) in CI — it respects the lockfile exactly.
npm ci --ignore-scripts   # --ignore-scripts prevents install-time RCE
```
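The same pinning discipline can be linted for package.json. A sketch (the `loose_deps` helper is hypothetical, not an npm feature) that flags range specifiers like `^` and `~`:

```python
import json

RANGE_PREFIXES = ("^", "~", ">", "<")


def loose_deps(package_json: str) -> list[str]:
    """Return dependencies in package.json that are not exact pins."""
    manifest = json.loads(package_json)
    loose = []
    for section in ("dependencies", "devDependencies"):
        for name, spec in manifest.get(section, {}).items():
            if spec.startswith(RANGE_PREFIXES) or spec in ("*", "latest"):
                loose.append(f"{name}@{spec}")
    return loose
```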
```yaml
# .github/workflows/supply-chain.yml
name: Supply Chain Security

on:
  pull_request:
  schedule:
    - cron: '0 8 * * 1'  # Weekly Monday audit

jobs:
  dependency-audit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Run Trivy vulnerability scan
        uses: aquasecurity/trivy-action@master
        with:
          scan-type: 'fs'
          scan-ref: '.'
          severity: 'HIGH,CRITICAL'
          exit-code: '1'  # Fail on findings

      - name: Check for known malicious packages
        run: |
          pip install pip-audit
          pip-audit --strict --desc on -r requirements.txt

      - name: SBOM generation
        uses: anchore/sbom-action@v0
        with:
          artifact-name: sbom.spdx.json
          output-file: sbom.spdx.json

      - name: Upload SBOM
        uses: actions/upload-artifact@v4
        with:
          name: sbom
          path: sbom.spdx.json
```
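If you want the audit step to produce a human-readable summary, pip-audit can also emit JSON (`pip-audit -f json`). A parsing sketch follows; the schema shown is an assumption, so check it against the output of your pip-audit version:

```python
import json


def failing_packages(audit_json: str) -> list[str]:
    """Summarize a `pip-audit -f json` report.

    Assumes the report is {"dependencies": [...]} with per-package "vulns"
    lists; older pip-audit releases emitted a bare list instead.
    """
    report = json.loads(audit_json)
    deps = report["dependencies"] if isinstance(report, dict) else report
    return [
        f"{d['name']}=={d['version']}: {len(d['vulns'])} known CVE(s)"
        for d in deps
        if d.get("vulns")
    ]
```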
```python
# scripts/check_typosquat.py
"""Check dependencies against known-good package names."""
import sys

KNOWN_TYPOSQUATS = {
    "reqeusts": "requests",
    "python-dateutil2": "python-dateutil",
    "beautifulsoup": "beautifulsoup4",
    "sklearn": "scikit-learn",
}


def check_requirements(req_file: str) -> list[str]:
    warnings = []
    with open(req_file) as f:
        for line in f:
            pkg = line.strip().split("==")[0].split(">=")[0].lower()
            if pkg in KNOWN_TYPOSQUATS:
                warnings.append(
                    f"TYPOSQUAT: '{pkg}' — did you mean '{KNOWN_TYPOSQUATS[pkg]}'?"
                )
    return warnings


if __name__ == "__main__":
    issues = check_requirements(sys.argv[1])
    for issue in issues:
        print(f"::error::{issue}")
    sys.exit(1 if issues else 0)
```
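The blocklist above only catches typosquats you already know about. A cheap extension is to flag unknown names that sit within edit distance 1 of a popular package (a sketch; the `POPULAR` set here is a stand-in for a real top-packages list):

```python
def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]


POPULAR = {"requests", "numpy", "pandas", "django", "flask"}


def near_misses(pkg: str, max_distance: int = 1) -> list[str]:
    """Popular packages suspiciously close to an unknown package name."""
    pkg = pkg.lower()
    return [p for p in POPULAR if p != pkg and edit_distance(pkg, p) <= max_distance]
```

Raising `max_distance` to 2 also catches transpositions like reqeusts, at the cost of more false positives.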
```dockerfile
# Dockerfile.build — multi-stage hermetic build

# Stage 1: Download dependencies (network allowed)
FROM python:3.12-slim AS deps
WORKDIR /app
COPY requirements.txt .
RUN pip download --no-cache-dir -r requirements.txt -d /wheels

# Stage 2: Build (NO network access)
FROM python:3.12-slim AS build
# Copy only pre-downloaded wheels
COPY --from=deps /wheels /wheels
COPY . /app
WORKDIR /app
# Install from local wheels only — no network needed
RUN pip install --no-index --find-links=/wheels -r requirements.txt
RUN python -m pytest tests/ -x

# Stage 3: Runtime (minimal image)
FROM python:3.12-slim AS runtime
COPY --from=build /app /app
COPY --from=deps /wheels /wheels
RUN pip install --no-index --find-links=/wheels -r /app/requirements.txt \
    && rm -rf /wheels
WORKDIR /app
USER nobody
CMD ["python", "-m", "uvicorn", "main:app", "--host", "0.0.0.0"]
```
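One way to sanity-check hermeticity (a sketch, not an established tool): fingerprint the pre-downloaded /wheels directory and compare it across builds. If the build is truly hermetic, identical inputs produce an identical fingerprint.

```python
import hashlib
from pathlib import Path


def fingerprint_dir(root: str) -> str:
    """One SHA-256 over every file (relative path + bytes) under root.

    Files are visited in sorted order, so the fingerprint changes if and
    only if some build input changed.
    """
    h = hashlib.sha256()
    for path in sorted(Path(root).rglob("*")):
        if path.is_file():
            h.update(str(path.relative_to(root)).encode())
            h.update(path.read_bytes())
    return h.hexdigest()
```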
```yaml
# .github/workflows/release.yml
name: Release with SLSA Provenance

on:
  push:
    tags: ['v*']

permissions:
  contents: write
  id-token: write   # For OIDC signing
  attestations: write

jobs:
  build:
    runs-on: ubuntu-latest
    outputs:
      digest: ${{ steps.hash.outputs.digest }}
    steps:
      - uses: actions/checkout@v4
      - name: Build artifact
        run: |
          python -m build
          sha256sum dist/*.whl > checksums.txt
      - name: Calculate digest
        id: hash
        run: |
          echo "digest=$(sha256sum dist/*.whl | base64 -w0)" >> "$GITHUB_OUTPUT"
      - uses: actions/upload-artifact@v4
        with:
          name: dist
          path: dist/

  provenance:
    needs: build
    uses: slsa-framework/slsa-github-generator/.github/workflows/[email protected]
    permissions:
      actions: read
      id-token: write
      contents: write
    with:
      base64-subjects: "${{ needs.build.outputs.digest }}"
      upload-assets: true
```
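The `base64-subjects` value in the workflow above is just `sha256sum` output, base64-encoded. A sketch of producing it in Python (the `slsa_subjects` helper is ours; the generator only cares about the encoded format):

```python
import base64
import hashlib


def slsa_subjects(files: dict[str, bytes]) -> str:
    """Base64-encode sha256sum-style lines ('<digest>  <name>').

    Mirrors `sha256sum dist/*.whl | base64 -w0` for in-memory artifacts.
    """
    lines = [
        f"{hashlib.sha256(data).hexdigest()}  {name}"
        for name, data in files.items()
    ]
    return base64.b64encode("\n".join(lines).encode()).decode()
```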
```yaml
# .github/workflows/deploy.yml
jobs:
  deploy:
    runs-on: ubuntu-latest
    environment: production   # Requires approval
    permissions:
      id-token: write         # OIDC — no long-lived secrets
    steps:
      - name: Authenticate to cloud (OIDC, no stored keys)
        uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: arn:aws:iam::123456789:role/deploy
          aws-region: us-east-1
        # No AWS_ACCESS_KEY_ID needed — uses GitHub OIDC token

      - name: Deploy
        run: |
          # Verify the artifact before deploying
          cosign verify-blob \
            --signature dist/app.sig \
            --certificate dist/app.crt \
            dist/app.whl
          # Only deploy verified artifacts
          aws s3 cp dist/app.whl s3://deploy-bucket/
```
```shell
# Sign your container images with cosign (Sigstore)

# Generate a keyless signature using OIDC
cosign sign --yes ghcr.io/myorg/myapp:v1.2.3

# Verify before deploying
cosign verify \
  --certificate-identity=https://github.com/myorg/myapp/.github/workflows/build.yml@refs/tags/v1.2.3 \
  --certificate-oidc-issuer=https://token.actions.githubusercontent.com \
  ghcr.io/myorg/myapp:v1.2.3
```
```yaml
# kubernetes/policy.yaml — only allow digest-pinned images from the
# trusted registry (ValidatingAdmissionPolicy lives in the
# admissionregistration API group, not policy/v1)
apiVersion: admissionregistration.k8s.io/v1
kind: ValidatingAdmissionPolicy
metadata:
  name: require-signed-images
spec:
  matchConstraints:
    resourceRules:
      - apiGroups: [""]
        apiVersions: ["v1"]
        operations: ["CREATE", "UPDATE"]
        resources: ["pods"]
  validations:
    - expression: |
        object.spec.containers.all(c,
          c.image.startsWith('ghcr.io/myorg/') &&
          c.image.contains('@sha256:')
        )
      message: "All images must be from ghcr.io/myorg/ with digest pinning"
```
```python
# integrity_checker.py
"""Runtime integrity monitoring."""
import hashlib
import json
import os
import sys
from pathlib import Path


class IntegrityChecker:
    def __init__(self, manifest_path: str):
        with open(manifest_path) as f:
            self.manifest = json.load(f)

    def verify(self) -> list[str]:
        """Check all files against known-good hashes."""
        violations = []
        for file_path, expected_hash in self.manifest["files"].items():
            if not os.path.exists(file_path):
                violations.append(f"MISSING: {file_path}")
                continue
            actual_hash = self._hash_file(file_path)
            if actual_hash != expected_hash:
                violations.append(
                    f"TAMPERED: {file_path} "
                    f"(expected {expected_hash[:16]}..., "
                    f"got {actual_hash[:16]}...)"
                )
        return violations

    @staticmethod
    def _hash_file(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    @classmethod
    def generate_manifest(cls, root_dir: str, output: str):
        """Generate integrity manifest for deployment."""
        files = {}
        for path in Path(root_dir).rglob("*.py"):
            files[str(path)] = cls._hash_file(str(path))
        manifest = {"version": "1.0", "files": files}
        with open(output, "w") as f:
            json.dump(manifest, f, indent=2)


# Run on startup
checker = IntegrityChecker("/app/integrity-manifest.json")
violations = checker.verify()
if violations:
    for v in violations:
        print(f"INTEGRITY VIOLATION: {v}", file=sys.stderr)
    # Alert, but don't crash — let the operator decide
    # send_alert(violations)
```
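Stripped to its essentials, the manifest round trip looks like this (a standalone sketch, independent of the IntegrityChecker class above):

```python
import hashlib
from pathlib import Path


def build_manifest(root: str) -> dict[str, str]:
    """Map each .py file under root to its SHA-256 digest."""
    return {
        str(p): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(Path(root).rglob("*.py"))
    }


def detect_tampering(manifest: dict[str, str]) -> list[str]:
    """Files whose current digest no longer matches the manifest."""
    changed = []
    for path, expected in manifest.items():
        p = Path(path)
        if not p.exists():
            changed.append(f"MISSING: {path}")
        elif hashlib.sha256(p.read_bytes()).hexdigest() != expected:
            changed.append(f"TAMPERED: {path}")
    return changed
```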
```yaml
# .github/dependabot.yml
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "weekly"
    reviewers:
      - "security-team"
    labels:
      - "dependencies"
      - "security"
    # Group minor/patch updates to reduce PR noise
    groups:
      minor-and-patch:
        update-types:
          - "minor"
          - "patch"
    # Only allow updates that pass security audit
    allow:
      - dependency-type: "direct"

  - package-ecosystem: "docker"
    directory: "/"
    schedule:
      interval: "weekly"

  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
    # Pin actions by SHA, not tag
```
- Pin dependencies with hashes — 10 minutes to set up, blocks most package tampering
- Run pip-audit or npm audit in CI — catches known vulnerabilities automatically
- Use --ignore-scripts for npm — prevents install-time code execution
- Pin GitHub Actions by SHA — uses: actions/checkout@abc123 not @v4
- Enable Dependabot — automated security updates
- Sign your releases — cosign sign takes 30 seconds
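The SHA-pinning item above can itself be checked mechanically. A sketch (hypothetical helper, not an existing tool) that flags any `uses:` reference whose ref is not a full 40-character commit SHA:

```python
import re

USES_LINE = re.compile(r"uses:\s*([\w./-]+)@([\w.-]+)")
FULL_SHA = re.compile(r"^[0-9a-f]{40}$")


def unpinned_actions(workflow_yaml: str) -> list[str]:
    """GitHub Actions referenced by a mutable tag or branch, not a SHA."""
    return [
        f"{action}@{ref}"
        for action, ref in USES_LINE.findall(workflow_yaml)
        if not FULL_SHA.match(ref)
    ]
```

Run it over every file in .github/workflows/ as a pre-commit hook or CI step.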