Proof

Platform Validation

The system verifies itself. Three engines tested against the platform's own corpus. One real bug found and fixed. Every number from ops.db.

Summary

All three echology engines (decompose at L0, Aletheia and Daedalus at L1) were run against echology's own corpus. Every system functioned end-to-end on real data, and one real bug was found and fixed.

"Perception over generation: the system must perceive itself before it can claim to perceive anything else."

decompose: Deterministic Classification

Metric                     Value
Documents processed        1,861
Processing time            9.2s (203 docs/sec)
Semantic units extracted   7,711
Mandatory units            2,233
Irreducible units          1,633
Entities detected          371
Errors                     0

decompose ran against every document in the echology corpus. Zero errors. The deterministic layer classified 7,711 semantic units without invoking any LLM.
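As a rough illustration of what a deterministic, no-LLM classification pass looks like, here is a minimal sketch. The unit types ("mandatory", "irreducible") come from the table above; the trigger rules and function names below are hypothetical placeholders, not decompose's actual logic.

```python
import re

# Hypothetical marker rules -- illustrative only, not decompose's real ruleset.
MANDATORY_MARKERS = re.compile(r"\b(shall|must|required)\b", re.IGNORECASE)

def classify_units(sentences: list[str]) -> list[dict]:
    """Deterministically tag each sentence; no model call, so output is repeatable."""
    units = []
    for text in sentences:
        unit = {"text": text, "mandatory": bool(MANDATORY_MARKERS.search(text))}
        # Treat a unit as "irreducible" here if it has no clause to split off.
        unit["irreducible"] = "," not in text and ";" not in text
        units.append(unit)
    return units

units = classify_units([
    "The contractor shall submit drawings for review.",
    "Concrete cures in 28 days.",
])
```

Because every rule is a pure function of the text, the same corpus always yields the same counts, which is what makes a zero-error full-corpus run meaningful.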

Aletheia: Schema Validation and Provenance

Metric                     Value
Documents validated        3,333
Gold certification         3,179 (95.4%)
Silver certification       80
Bronze certification       60
None                       14
Quality score (avg)        0.987
Quality score (min)        0.722
Unique sources in ledger   2,203
Hash chain integrity       VALID
Chain errors               0

Every corpus document was wrapped in a Vanta envelope and validated through SchemaValidator. Results recorded in the AuditLedger with hash-chained provenance. The chain verified clean: zero tampering, full traceability.
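The tamper-evidence property rests on a standard technique: each ledger entry stores the hash of its predecessor, so altering any entry breaks every hash after it. A minimal sketch of that technique follows; the field names are illustrative, and the real AuditLedger schema may differ.

```python
import hashlib
import json

def entry_hash(body: dict) -> str:
    """Hash a canonical JSON encoding so the digest is order-independent."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append(chain: list[dict], record: dict) -> None:
    """Link each new entry to the hash of the previous one (genesis: all zeros)."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    entry = {"record": record, "prev": prev}
    entry["hash"] = entry_hash({"record": record, "prev": prev})
    chain.append(entry)

def verify(chain: list[dict]) -> bool:
    """Walk the chain, recomputing every hash; any edit breaks the walk."""
    prev = "0" * 64
    for e in chain:
        if e["prev"] != prev:
            return False
        if e["hash"] != entry_hash({"record": e["record"], "prev": e["prev"]}):
            return False
        prev = e["hash"]
    return True

ledger: list[dict] = []
append(ledger, {"doc": "FOUNDATION.md", "cert": "gold"})
append(ledger, {"doc": "aletheia_ledger.py", "cert": "gold"})
assert verify(ledger)                      # intact chain verifies clean
ledger[0]["record"]["cert"] = "silver"     # tamper with the first record
assert not verify(ledger)                  # verification now fails
```

"Hash chain integrity VALID, chain errors 0" in the table above is the corpus-scale version of that final check passing across 2,203 sources.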

Daedalus: Semantic Retrieval

Metric              Value
Qdrant points       15,004
Queries executed    10
Hits returned       30/30 (100%)
Score range         0.621-0.825
Vector dimensions   768 (nomic-embed-text, normalized)

Query Results

Ten queries spanned all echology domains, and every one returned relevant results from the correct domain:

Query                                       Score   Top Result
deterministic before probabilistic          0.716   Vanta pipeline documentation
Logos Structure founding thesis             0.664   FOUNDATION.md
decompose semantic units authority risk     0.735   decompose classification output
Aletheia audit ledger provenance            0.761   aletheia_ledger.py
Daedalus retrieval engine vector search     0.776   Daedalus module docs
AEC document intelligence specification     0.721   AECai site content
insurance policy QC comparison              0.657   RBS Policy QC docs
GEO citation optimization AI                0.825   Geode GEO methodology
ops initiative tracking system              0.647   Ops operational intelligence
Scripture cross-reference knowledge graph   0.820   Open Scripture knowledge graph

Bug Found and Fixed

Embedding API mismatch between ingest and retrieval

ops/ingest.py was calling Ollama's /api/embeddings endpoint, which returns unnormalized vectors, while Daedalus was calling /api/embed, which returns normalized vectors. Both use the same model (nomic-embed-text, 768 dimensions), but the two endpoints produce vectors at different scales.

The Qdrant collection is configured for cosine similarity, which requires normalized vectors. Result: Daedalus queries returned zero results because the query vectors and stored vectors were in incompatible spaces.
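The failure mode can be shown in a few lines. Cosine similarity reduces to a plain dot product only when both vectors are unit-length; mix an unnormalized vector into a space of normalized ones and every score leaves the cosine scale. This toy example uses 3-dimensional vectors rather than the real 768, purely for readability.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    """Scale a vector to unit length."""
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

stored = normalize([1.0, 2.0, 2.0])   # ingested normalized, as /api/embed returns
query_raw = [3.0, 6.0, 6.0]           # same direction, but unnormalized scale

# Matched spaces: dot product of two unit vectors IS the cosine (here exactly 1.0).
matched = dot(stored, normalize(query_raw))

# Mismatched spaces: the "cosine" score blows past 1.0, so thresholds and
# rankings computed against it are meaningless.
mismatched = dot(stored, query_raw)
```

Here `matched` is 1.0 while `mismatched` is 9.0 for vectors pointing in the identical direction, which is why query and stored vectors must come from the same endpoint.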

Fix:

  1. Updated _embed_batch() in ops/ingest.py to use /api/embed with normalized output
  2. Updated search_semantic() in ops/ingest.py to match
  3. Batched embedding calls (64 chunks per request instead of 1)
  4. Re-embedded full corpus: 3,195 docs, 15,004 points, 38 minutes
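The batching in step 3 is a small, self-contained change. A sketch of the chunking logic is below; the batch size of 64 comes from the fix description, while the function names are illustrative rather than the actual ops/ingest.py code.

```python
def batched(items: list[str], size: int = 64):
    """Yield successive slices of at most `size` items for one request each."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Each batch would be sent as a single POST to the embedding endpoint,
# turning N requests into ceil(N / 64).
chunks = [f"chunk-{i}" for i in range(150)]
batches = list(batched(chunks))
# 150 chunks -> 3 requests (64 + 64 + 22) instead of 150
```

At corpus scale (15,004 points), cutting the request count by roughly 64x is what made the 38-minute full re-embed practical.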

Daedalus was already correct. The foundation was wrong. We fixed the foundation.

What This Proves

FOUNDATION.md states: "The inside and the outside of echology are the same."

This validation demonstrates that claim is operational, not aspirational:

  • decompose classifies echology's own documents with the same deterministic pipeline it applies to any domain
  • Aletheia validates and certifies echology's own corpus with the same schema and ledger it applies to client documents
  • Daedalus retrieves from echology's own knowledge base with the same semantic search it provides to any vertical

The system perceives itself. The thesis holds on the system that embodies it.

Provenance

All results recorded in ops.db as pipeline_events (source: "self-validation"). Aletheia ledger at data/aletheia_ledger.db with hash-chained entries. Initiative #51 tracked all steps with actual effort.

echology.io, 2026-03-18

See the system run on your data

The same engines that validated themselves are ready to perceive your organization. One week. Your data. Your hardware.

Request an Audit