# Project Ploughshares - OSINT Intake Workflow Demo

This demo summarizes the initial implementation of a replacement intake workflow for OSINT signals related to international trade, intended to help detect and deter embargo-evasion patterns. The system focuses on capturing OSINT leads, triaging and approving them, and maintaining an auditable record suitable for collaboration with researchers and analysts.

## Goals

- Streamline OSINT intake: Provide a clear path to submit, normalize, review, and approve OSINT signals and related evidence.
- Create an auditable workflow: Every step - from ingestion to analyst approval - is persisted with status and metadata.
- Enable automation: Support ingestion of leads from crawlers and curated feeds to reduce manual toil.
- Improve quality: Bake in accessibility and quality checks (contrast, Lighthouse, basic headers) for the analyst-facing UI.

## What we built (repo tour)

- Web/API app: Flask application for UI and JSON endpoints (a minimal endpoint sketch follows this list)
  - docker/ploughshares/app.py, templates in docker/ploughshares/templates/, static assets in docker/ploughshares/static/
  - Database schema and setup in docker/ploughshares/schema.sql and docker/ploughshares/init_db.py
- Ingestion: Crawler components to fetch and submit OSINT leads to the API
  - docker/crawler/*.py and docker/crawler_dorks/*.py (e.g., write_to_api.py, marketline_crawler.py)
- Quality/Accessibility: Utility scripts and checks
  - check_accessibility.py, check_contrast.py, run_lighthouse_test.sh
- Tests and code quality: tests/ (API behavior, dependencies, and basic quality gates)
- Runtime: Docker Compose files for local development and deployment stacks
  - docker-compose.dev.yml, docker-compose.yml, stack.staging.yml, stack.production.yml
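
To make the Web/API item concrete, here is a minimal sketch of what a pending-transaction intake endpoint could look like. The route name matches the sequence diagram below, but the field names, validation, and the echoed response are illustrative assumptions; the real handler lives in docker/ploughshares/app.py and persists to the database defined in schema.sql.

```python
# Hypothetical sketch of the intake endpoint; field names and validation
# are illustrative assumptions, not the actual app.py implementation.
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/transactions", methods=["POST"])
def create_transaction():
    payload = request.get_json(silent=True) or {}
    # Minimal validation: an OSINT signal needs at least a source and a summary.
    missing = [f for f in ("source", "summary") if f not in payload]
    if missing:
        return jsonify({"error": f"missing fields: {', '.join(missing)}"}), 400

    record = {
        "source": payload["source"],
        "summary": payload["summary"],
        "status": "pending",  # awaits analyst approval
        "ingested_at": datetime.now(timezone.utc).isoformat(),  # audit field
    }
    # The real app would INSERT this via the schema.sql tables;
    # here we simply echo the stored record back.
    return jsonify(record), 201
```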

## High-level workflow (sequence)

```mermaid
sequenceDiagram
    participant Analyst
    participant WebUI as Web UI (Flask)
    participant API as API (Flask/JSON)
    participant DB as Database
    participant Crawler

    Analyst->>WebUI: Submit OSINT signal for review
    WebUI->>API: POST /transactions (signal + metadata)
    API->>DB: Insert pending transaction with audit fields

    Crawler->>API: POST /ingest (scraped leads)
    API->>DB: Upsert leads and link evidence

    Analyst->>WebUI: Review pending approvals
    WebUI->>API: PATCH /transactions/:id approve/deny
    API->>DB: Update status and record decision
```
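
To ground the Crawler->>API step, a submission client might look like the sketch below. The /ingest path comes from the diagram above, but the payload shape, base URL, and dedupe-by-URL convention are assumptions rather than the actual write_to_api.py contract.

```python
# Hypothetical crawler submission sketch; payload fields and the dedupe
# key are assumptions, not the write_to_api.py contract.
import requests

API_BASE = "http://localhost:5000"  # assumed local dev address

def submit_lead(url: str, title: str, source: str) -> dict:
    lead = {
        "url": url,      # a natural upsert/dedupe key for scraped leads
        "title": title,
        "source": source,
    }
    resp = requests.post(f"{API_BASE}/ingest", json=lead, timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(submit_lead(
        url="https://example.com/notice/123",
        title="Sample trade notice",
        source="demo-crawler",
    ))
```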

## Quality gates and accessibility checks (sequence)

```mermaid
sequenceDiagram
    participant Dev
    participant CI as CI/Local Checks
    participant LHT as Lighthouse
    participant A11y as Accessibility

    Dev->>CI: Push or run checks locally
    CI->>LHT: run_lighthouse_test.sh (performance & a11y)
    CI->>A11y: check_accessibility.py / check_contrast.py
    CI->>Dev: Reports and action items
```
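
The contrast check in particular rests on a standard calculation: WCAG 2.x defines contrast as the ratio of the two colours' relative luminances. The sketch below implements that formula; assuming check_contrast.py does something equivalent is an inference, not a reading of its source.

```python
# WCAG 2.x contrast ratio between two hex colours. The formula is the
# standard one from the spec; tying it to check_contrast.py is an assumption.
def _luminance(hex_color: str) -> float:
    """Relative luminance of an sRGB colour like '#1a2b3c'."""
    rgb = [int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    # Linearize each channel per the sRGB transfer function.
    lin = [c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4 for c in rgb]
    return 0.2126 * lin[0] + 0.7152 * lin[1] + 0.0722 * lin[2]

def contrast_ratio(fg: str, bg: str) -> float:
    l1, l2 = sorted((_luminance(fg), _luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# WCAG AA requires at least 4.5:1 for normal text; black on white is 21:1.
assert abs(contrast_ratio("#000000", "#ffffff") - 21.0) < 1e-9
```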

## How this supports Project Ploughshares

- Mission-aligned intake: The workflow is designed to help researchers and analysts surface potential embargo-evasion signals in international trade.
- Traceability: Signals and decisions are stored with clear statuses for collaboration and later analysis.
- Extensible feeds: Crawlers and curated feeds can be extended as new sources become relevant.

## Next steps (priority first)

- Implement vector search for intake to perform semantic matching beyond literal keywords and synonyms, surfacing adjacent signals and concepts (see the sketch after this list).
- Author and test a Disaster Recovery Plan (DRP) and Business Continuity Plan (BCP), with RTO/RPO targets, backups, failover, and runbooks.
- Expand API docs and sample payloads.
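
To illustrate the vector-search item, the sketch below shows only the retrieval mechanics: embed texts, normalize, and rank by cosine similarity. The hashed bag-of-words "embedder" is a deliberate toy stand-in; a real implementation would swap in a sentence-embedding model and a vector index.

```python
# Toy sketch of vector-based matching for intake. The hashed bag-of-words
# "embedding" stands in for a real sentence-embedding model; only the
# cosine-similarity ranking mechanics are the point here.
import math
from collections import Counter

DIM = 256  # assumed vector size for the toy hashing embedder

def embed(text: str) -> list[float]:
    vec = [0.0] * DIM
    for token, count in Counter(text.lower().split()).items():
        # hash() is salted per process, which is fine for a single-run demo.
        vec[hash(token) % DIM] += count
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]  # unit-normalize so dot product = cosine

def rank(query: str, leads: list[str]) -> list[tuple[float, str]]:
    q = embed(query)
    scored = [(sum(a * b for a, b in zip(q, embed(lead))), lead) for lead in leads]
    return sorted(scored, reverse=True)

leads = [
    "dual-use machine tools re-exported via third country",
    "shipping notice for agricultural equipment",
]
for score, lead in rank("machine tool transshipment", leads):
    print(f"{score:.2f}  {lead}")
```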