
# Project Ploughshares — OSINT Intake Workflow

OSINT intake for international trade signals to help detect embargo evasion.


## Problem

- Fragmented intake
- Missed signals at scale
- Limited auditability

## Approach

- Unified intake + approval (Flask UI + JSON API); a minimal endpoint sketch follows this list
- Automated ingestion (crawlers, curated feeds)
- Quality & accessibility checks in dev
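
To make the intake + approval idea concrete, here is a minimal sketch of what the intake endpoint could look like. It assumes Flask and a hypothetical payload shape; the actual implementation lives in docker/ploughshares/app.py and may differ.

```python
# Minimal sketch of a unified intake endpoint (illustrative only; the real app
# is docker/ploughshares/app.py and its payload shape may differ).
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory stand-in for the real database-backed approval queue.
PENDING = []

@app.post("/ingest")
def ingest():
    """Accept a signal from a crawler or analyst and queue it for review."""
    payload = request.get_json(force=True)
    record = {
        "id": len(PENDING) + 1,
        "source": payload.get("source", "unknown"),
        "summary": payload.get("summary", ""),
        "status": "pending",  # every new signal waits for analyst approval
    }
    PENDING.append(record)
    return jsonify(record), 201
```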

## Architecture

```mermaid
flowchart LR
    Crawler[Crawlers & Feeds] -->|POST /ingest| API
    Analyst -->|Submit| WebUI
    WebUI -->|REST| API
    API --> DB[(DB)]
    API --> Queue[Approval Queue]
    Queue -->|Approve/Deny| API
```
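
As a rough illustration of the crawler-to-API edge above, the sketch below posts a lead to the `/ingest` endpoint. The base URL and payload fields are assumptions for the example, not the documented API contract.

```python
# Illustrative crawler-side ingestion call; the base URL and payload fields
# are assumptions, not the documented API contract.
import requests

lead = {
    "source": "curated-feed-example",
    "summary": "Possible dual-use shipment flagged in trade bulletin",
}

resp = requests.post("http://localhost:5000/ingest", json=lead, timeout=10)
resp.raise_for_status()
print(resp.json())  # expected to include a status of "pending"
```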

## End-to-End

```mermaid
sequenceDiagram
    participant Analyst
    participant WebUI as Web UI
    participant API
    participant DB as DB
    participant Crawler

    Analyst->>WebUI: Submit signal
    WebUI->>API: Create (pending)
    API->>DB: Persist
    Crawler->>API: Ingest leads
    API->>DB: Upsert
    Analyst->>WebUI: Approve/Deny
    WebUI->>API: Decision
    API->>DB: Finalize
```
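
The approve/deny step at the end of the sequence could be driven by a call like the one below. The route name, decision values, and response shape are assumptions for illustration; the real decision endpoint is defined in the Flask app.

```python
# Illustrative approval/denial call for a pending signal; the route name and
# decision values are assumptions, not the app's documented API.
import requests

def decide(signal_id: int, decision: str) -> dict:
    """Record an analyst decision ("approve" or "deny") for a pending signal."""
    resp = requests.post(
        f"http://localhost:5000/signals/{signal_id}/decision",
        json={"decision": decision},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# Example: approve signal 1, which the API would then finalize in the DB.
print(decide(1, "approve"))
```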

## Implemented

- Flask app (docker/ploughshares/app.py)
- Schema/init (docker/ploughshares/schema.sql, init_db.py)
- Crawlers/feeds (docker/crawler/*, docker/crawler_dorks/*)
- Quality checks (check_accessibility.py, check_contrast.py, run_lighthouse_test.sh)
- Tests (tests/); see the test sketch after this list
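
As a rough idea of how the intake route can be exercised in tests/, the sketch below uses Flask's test client. It assumes the app object is importable as `app` and that an `/ingest` route returning a pending record exists; both are assumptions for illustration, not the existing test suite.

```python
# Sketch of a test against the intake route using Flask's built-in test
# client; assumes the Flask app object is importable as `app` and that an
# /ingest route exists (both assumptions for this example).
from app import app  # hypothetical import path

def test_ingest_creates_pending_signal():
    client = app.test_client()
    resp = client.post("/ingest", json={"source": "unit-test", "summary": "demo"})
    assert resp.status_code == 201
    assert resp.get_json()["status"] == "pending"
```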

## Next

- Implement vector search for intake to perform semantic matching beyond literal keywords and synonyms, surfacing adjacent signals and concepts (see the sketch after this list).
- Author and test a Disaster Recovery Plan (DRP) and Business Continuity Plan (BCP), with RTO/RPO targets, backups, failover, and runbooks.
- Expand API docs and sample payloads.
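
As a starting point for the vector-search item, the sketch below ranks stored signal summaries by cosine similarity to a query embedding. The `embed` function is a placeholder for whatever embedding model is chosen later; everything here is an assumption about a future design, not existing code.

```python
# Sketch of semantic matching over intake records via cosine similarity.
# embed() is a placeholder for an embedding model chosen later; nothing here
# exists in the repo yet.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder embedding: replace with a real sentence-embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)

def rank_by_similarity(query: str, summaries: list[str]) -> list[tuple[float, str]]:
    """Return summaries ordered by cosine similarity to the query."""
    q = embed(query)
    q /= np.linalg.norm(q)
    scored = []
    for text in summaries:
        v = embed(text)
        scored.append((float(np.dot(q, v / np.linalg.norm(v))), text))
    return sorted(scored, reverse=True)

# Example: surface signals conceptually related to the query, even without
# literal keyword overlap.
print(rank_by_similarity("CNC machine exports to embargoed region",
                         ["shipment of 5-axis milling equipment",
                          "agricultural fertilizer tender"]))
```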