Software Supply Chain Intelligence Guardian

A multi-agent intelligence network that continuously monitors your entire dependency ecosystem for supply chain threats - detecting compromised packages, tracking vulnerabilities, and generating actionable remediation plans before a breach reaches production.

Tags: supply chain, security, sbom

The Software Supply Chain Intelligence Guardian demonstrates a continuously running multi-agent threat intelligence pipeline with persistent shared state - a living SBOM that all agents read from and write to, enabling emergent capabilities no single agent could achieve alone.

Software supply chain attacks have become one of the most dangerous and fastest-growing threat vectors entering 2026. The Shai-Hulud worm compromised 187+ npm packages, SANDWORM_MODE hit in February 2026, and PackageGate exposed simultaneous zero-days in npm, pnpm, vlt, and Bun. Traditional security tooling is reactive - scanners flag known CVEs after the fact, by which time production systems may already be compromised.

This blueprint deploys six specialised agents that form an autonomous intelligence pipeline:

  • Dependency Scout (hourly) - Reads project manifests, builds full dependency trees including transitive dependencies, and writes a normalised SBOM JSON to the shared workspace. The living SBOM is the foundation every other agent depends on.

  • Threat Harvester (hourly, offset by 30 minutes) - Fetches live intelligence from GitHub Security Advisories, the CISA Known Exploited Vulnerabilities feed, npm audit API, and web search for emerging attack reports. Maintains a structured threat registry with deduplication.

  • Risk Analyst (hourly, runs after harvester) - The correlation engine. Cross-references the SBOM against the threat registry to calculate per-service exposure scores. When critical or high findings are detected it immediately calls the Remediation Planner and Alert Coordinator to trigger the response chain.

  • Ecosystem Health Monitor (daily) - Catches the decay signals that precede supply chain attacks: unmaintained packages are prime targets for malicious takeover. Tracks maintainer activity, commit recency, download trends, and issue response rates. Flags "zombie packages" heading toward abandonment before they become the next attack vector.

  • Remediation Planner (triggered by high-risk findings) - Produces concrete, tested migration plans. Uses shell execution to verify proposed alternative packages actually work before recommending them. Includes effort estimates and a SPDX-lite SBOM export for EU Cyber Resilience Act compliance.

  • Alert Coordinator (triggered) - Translates raw intelligence into tiered Slack messages: P0 critical alerts to #security-alerts, daily risk digests to #security-daily, and weekly ecosystem health summaries to #engineering.
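The Risk Analyst's correlation step can be sketched as a simple join between the two shared files. This is an illustrative sketch only: the SBOM fields follow the Dependency Scout's schema, but the threat-registry field names (`ecosystem`, `package`, `severity`, `id`) and the severity weights are assumptions, not part of the blueprint.

```python
# Hypothetical sketch of the Risk Analyst's correlation engine.
SEVERITY_WEIGHT = {"critical": 10, "high": 7, "medium": 4, "low": 1}

def correlate(sbom: dict, threats: list[dict]) -> list[dict]:
    """Cross-reference SBOM packages against the threat registry and
    compute a per-project exposure score."""
    # Index threats by (ecosystem, package name) for O(1) lookup.
    by_pkg = {}
    for t in threats:
        by_pkg.setdefault((t["ecosystem"], t["package"]), []).append(t)

    findings = []
    for project in sbom["projects"]:
        score, hits = 0, []
        for dep in project["dependencies"]:
            for t in by_pkg.get((project["ecosystem"], dep["name"]), []):
                score += SEVERITY_WEIGHT.get(t["severity"], 1)
                hits.append({"package": dep["name"],
                             "version": dep["version"],
                             "severity": t["severity"],
                             "advisory": t["id"]})
        findings.append({"project": project["name"],
                         "exposure_score": score,
                         "hits": hits})
    return findings
```

A real implementation would also match advisory version ranges against the installed versions rather than joining on package names alone.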

Novel architectural patterns in this blueprint:

  • Living SBOM - Persistent file storage maintains an always-current, machine-readable inventory. All agents share this single source of truth without direct coupling, enabling a clean data pipeline architecture.

  • Threat correlation - The Risk Analyst joins two independently maintained data sources (SBOM + threat registry) to produce intelligence neither agent could generate alone.

  • Proactive abandonment detection - The Ecosystem Health Monitor catches decay signals months before a package becomes unmaintained and gets targeted by attackers - turning a reactive security posture into a proactive one.

  • Autonomous compatibility testing - The Remediation Planner uses sandboxed shell execution to actually verify proposed replacements work before recommending them, eliminating the guesswork from emergency migrations.
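The abandonment-detection idea above can be sketched as a score computed from a few decay signals. The thresholds, weights, and the healthy/at-risk/zombie labels below are illustrative assumptions, not values defined by the blueprint.

```python
def health_score(last_commit_days: int, issue_response_days: float,
                 download_trend_pct: float, maintainer_count: int):
    """Score a package's ecosystem health from 0 (abandoned) to 100
    (healthy). All thresholds and weights are illustrative."""
    score = 100
    if last_commit_days > 365:
        score -= 40          # no commits in a year: strong decay signal
    elif last_commit_days > 180:
        score -= 20
    if issue_response_days > 60:
        score -= 20          # issues going unanswered
    if download_trend_pct < -25:
        score -= 20          # users are migrating away
    if maintainer_count <= 1:
        score -= 20          # single-maintainer bus factor
    score = max(score, 0)
    label = "healthy" if score >= 70 else "at-risk" if score >= 40 else "zombie"
    return score, label
```

For example, a package with no commits for 400 days, 90-day issue response times, a 30% download decline, and a single maintainer loses every point and gets flagged as a zombie.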

The six scheduled and event-triggered agents run autonomously around the clock. During active supply chain events - which happen regularly in 2026 - the alerts deliver genuinely actionable intelligence. The persistent files fill up with structured data over time, demonstrating how shared state enables emergent multi-agent capabilities.

Backstory

Common information about the bot's experience, skills and personality. For more information, see the Backstory documentation.

You are the Dependency Scout - a precise, automated agent responsible for maintaining an always-current Software Bill of Materials (SBOM).

## Your Mission

Every time you run, you must:

1. Read common project manifest files from the workspace: /data/manifests/ (package.json, requirements.txt, Cargo.toml, go.mod, etc.)
2. Use shell commands to build full dependency trees:
   - Node.js: `npm ls --json --all 2>/dev/null` or parse package-lock.json
   - Python: `pip list --format=json` or parse requirements.txt
   - Rust: parse Cargo.lock
3. Normalize all discovered packages into a unified SBOM JSON structure
4. Write the updated SBOM to /data/sbom.json in the shared workspace

## SBOM JSON Schema

```json
{
  "generated_at": "<ISO timestamp>",
  "schema_version": "1.0",
  "projects": [
    {
      "name": "<project name>",
      "manifest": "<path to manifest>",
      "ecosystem": "npm|pip|cargo|go",
      "dependencies": [
        {
          "name": "<package name>",
          "version": "<version>",
          "type": "direct|transitive",
          "depth": 0,
          "dependents": ["<parent package>"]
        }
      ]
    }
  ],
  "summary": {
    "total_packages": 0,
    "direct_dependencies": 0,
    "transitive_dependencies": 0,
    "ecosystems": []
  }
}
```

## Rules

- Always overwrite /data/sbom.json with the latest complete snapshot
- If no manifests are found, write a minimal SBOM with an empty projects array and a note
- Be thorough with transitive dependencies - they are the primary attack surface
- Log your run summary at the end: packages found, ecosystems scanned, timestamp
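The normalization step in the mission above can be sketched in Python. The recursion mirrors the nested `dependencies` objects that `npm ls --json --all` emits, and the output follows the SBOM schema; the helper names themselves are hypothetical.

```python
from datetime import datetime, timezone

def flatten_npm_tree(tree, depth=0, parent=None):
    """Recursively flatten `npm ls --json --all` output into SBOM
    dependency entries (direct at depth 0, transitive below)."""
    for name, info in (tree.get("dependencies") or {}).items():
        yield {
            "name": name,
            "version": info.get("version", "unknown"),
            "type": "direct" if depth == 0 else "transitive",
            "depth": depth,
            "dependents": [parent] if parent else [],
        }
        yield from flatten_npm_tree(info, depth + 1, name)

def build_sbom(project_name, manifest, npm_tree):
    """Assemble a single-project SBOM document in the schema above."""
    deps = list(flatten_npm_tree(npm_tree))
    return {
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "schema_version": "1.0",
        "projects": [{"name": project_name, "manifest": manifest,
                      "ecosystem": "npm", "dependencies": deps}],
        "summary": {
            "total_packages": len(deps),
            "direct_dependencies": sum(d["type"] == "direct" for d in deps),
            "transitive_dependencies": sum(d["type"] == "transitive" for d in deps),
            "ecosystems": ["npm"],
        },
    }
```

The agent would then serialize this dict to /data/sbom.json, overwriting the previous snapshot as the rules require.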

Skillset

This example uses a dedicated Skillset. Skillsets are collections of abilities that can be used to create a bot with a specific set of functions and features it can perform.

  • bash - Execute a shell command or script

  • rw (Read/Write Space Storage File) - Read or write content to a file in the space storage

  • Fetch Web Page - Fetch the content of a web page using a URL and convert it to text

  • Search Web - Search the web for specific keywords

  • Call Bot - Call the Remediation Planner agent when high-risk findings are detected

  • Call Alert Coordinator - Call the Alert Coordinator agent to dispatch tiered Slack notifications

  • Send Slack Message - Send a message to a specific channel in Slack

Secrets

This example uses Secrets to store sensitive information such as API keys, passwords, and other credentials.

  • šŸ” Slack - A secret without description

Terraform Code

This blueprint can be deployed using Terraform, enabling infrastructure-as-code management of your ChatBotKit resources. Use the code below to recreate this example in your own environment.

Copy this Terraform configuration to deploy the blueprint resources:

Next steps:

  1. Save the code above to a file named `main.tf`
  2. Set your API key: `export CHATBOTKIT_API_KEY=your-api-key`
  3. Run `terraform init` to initialize
  4. Run `terraform plan` to preview changes
  5. Run `terraform apply` to deploy

Learn more about the Terraform provider

A dedicated team of experts is available to help you create your perfect chatbot. Reach out via chat for more information.