
The Missing Layer in Your AI Stack

Context Relay Protocol™ gives every LLM unbounded context, unbounded generation, and full provenance — with zero in-window overhead. The model never knows CRP exists.

11.8x More Content
6.1% Protocol Overhead
0 Tokens Wasted
1,537 Tests Passing
33/35 EU AI Act Controls

The Problem

Every LLM has a finite output window. Ask for a 30-section document and the model stops at section 8. Context from previous calls is gone. There's no audit trail. There's no quality score.

RAG retrieves documents but doesn't manage output. MemGPT pages memory but burns tokens on self-management. MCP gives agents tools but not context. Nothing manages the complete context lifecycle.

CRP does.


Before & After

Without CRP

Task: "Write a 30-section K8s guide"
→ Output: 592 words, 8/30 sections
→ Truncated mid-sentence
→ No continuation
→ No quality score
→ Context lost between calls

With CRP

Task: "Write a 30-section K8s guide"
→ Output: 6,993 words, 25/30 sections
→ Automatic multi-window continuation
→ Quality tier: A
→ Full provenance DAG
→ Knowledge persists in CKF

Same model. Same hardware. Same task. CRP produces 11.8x more content at identical throughput (4.9 words/sec). The difference: CRP finishes the task.


How It Works

graph LR
    A[Your Data] -->|"ingest()"| B["6-Stage<br/>Extraction"]
    B --> C["Scored<br/>Fact Graph"]
    C -->|"envelope<br/>packing"| D["Context<br/>Envelope"]
    D -->|"dispatch()"| E["Any LLM"]
    E -->|output| F["Quality<br/>Assessment"]
    F -->|"wall hit"| G["Continuation<br/>Engine"]
    G -->|"re-extract"| B
    F -->|"complete"| H["Final<br/>Output"]

    style A fill:#7c4dff,color:#fff,stroke:none
    style E fill:#448aff,color:#fff,stroke:none
    style H fill:#43a047,color:#fff,stroke:none
  • 1. Ingest

    Feed any text. The 6-stage pipeline extracts atomic facts with scores, types, and provenance — not chunks, not embeddings.

  • 2. Pack

    Fill the context envelope to maximum saturation: $E = C - S - T - G$. Every token earns its place. Priority-ranked by relevance.

  • 3. Dispatch

    Send to any LLM — OpenAI, Anthropic, Ollama, LM Studio. 9 strategies: PUSH, PULL, agentic, streaming, batch, and more.

  • 4. Continue

    Hit the output wall? CRP detects it, extracts facts from partial output, repacks the envelope, and resumes in a fresh window. Automatically.

  • 5. Assess

    S/A/B/C/D quality tiers. 4-signal completion detection — fact flow, structure, vocabulary novelty, and content scoring.

  • 6. Audit

    Full provenance DAG from output → facts → source. HMAC-SHA256 session binding. 33/35 EU AI Act controls. Every claim is traceable.
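
The HMAC-SHA256 session binding mentioned under step 6 can be illustrated with Python's standard library. This is a minimal sketch: the key handling, session-ID format, and message layout here are assumptions for illustration, not CRP's actual wire format.

```python
import hmac
import hashlib

def bind_session(session_key: bytes, session_id: str, output: str) -> str:
    """Tag an output with an HMAC-SHA256 over (session_id, output)."""
    # NUL separator prevents ambiguity between session_id and output bytes.
    msg = session_id.encode() + b"\x00" + output.encode()
    return hmac.new(session_key, msg, hashlib.sha256).hexdigest()

def verify_binding(session_key: bytes, session_id: str, output: str, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = bind_session(session_key, session_id, output)
    return hmac.compare_digest(expected, tag)

key = b"per-session secret"  # hypothetical per-session key
tag = bind_session(key, "sess-42", "Kubernetes uses etcd...")
assert verify_binding(key, "sess-42", "Kubernetes uses etcd...", tag)
assert not verify_binding(key, "sess-42", "tampered output", tag)
```

Binding outputs to a session key means a provenance record cannot be silently swapped between sessions without the verification failing.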

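The saturation formula from step 2, $E = C - S - T - G$, works out to a few lines of arithmetic. Below is a hedged sketch: the variable names follow the formula (C context window, S system-prompt tokens, T task tokens, G generation reserve), while the greedy packing heuristic is an assumption for illustration, not CRP's actual algorithm.

```python
def envelope_budget(context_window: int, system_tokens: int,
                    task_tokens: int, generation_reserve: int) -> int:
    """E = C - S - T - G: tokens left over for packed facts."""
    budget = context_window - system_tokens - task_tokens - generation_reserve
    if budget <= 0:
        raise ValueError("prompt + generation reserve exceed the context window")
    return budget

def pack_envelope(facts, budget):
    """Greedily fill the envelope with the highest-scored facts that fit.

    `facts` is a list of (score, token_count, text) tuples.
    """
    packed, used = [], 0
    for score, tokens, text in sorted(facts, reverse=True):
        if used + tokens <= budget:
            packed.append(text)
            used += tokens
    return packed, used

# Example: 8k window, 200-token system prompt, 300-token task, 2k reserve
budget = envelope_budget(8192, 200, 300, 2048)  # 5644 tokens for facts
```

The point of the formula is that every token not spent on prompt scaffolding or generation headroom goes to ranked facts, which is what "maximum saturation" means above.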

Quick Start

pip install crprotocol[full]

import crp

client = crp.Client(model="gpt-4o-mini")

# Ingest domain knowledge
client.ingest("Kubernetes uses etcd as its distributed key-value store...")

# Dispatch with automatic continuation
output, report = client.dispatch(
    system_prompt="You are a senior infrastructure architect.",
    task_input="Write a comprehensive guide to Kubernetes networking.",
)

print(f"Words: {len(output.split()):,}")       # 6,993
print(f"Quality: {report.quality_tier}")        # A
print(f"Windows used: {report.continuation_windows}")  # 5
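
The multi-window continuation behind dispatch() (step 4 above) can be approximated as a loop. In this sketch, generate, extract_facts, and repack are hypothetical stand-ins for the real engine, not CRP API names:

```python
def dispatch_with_continuation(generate, extract_facts, repack, task,
                               max_windows=5):
    """Keep resuming in fresh windows until the model finishes or windows run out."""
    output, windows = "", 0
    prompt = task
    while windows < max_windows:
        windows += 1
        chunk, hit_wall = generate(prompt)   # one model call = one output window
        output += chunk
        if not hit_wall:                     # 4-signal completion check goes here
            break
        facts = extract_facts(output)        # re-extract from the partial output
        prompt = repack(task, facts)         # fresh envelope for the next window
    return output, windows
```

The essential move is that nothing is carried between windows except extracted facts: each resumption starts from a freshly packed envelope rather than a growing transcript.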

CRP in the AI Stack

┌────────────────────────────────────────────┐
│  A2A   — Agent-to-Agent Communication      │
├────────────────────────────────────────────┤
│  MCP   — Model Context Protocol (Tools)    │
├────────────────────────────────────────────┤
│  CRP   — Context Relay Protocol ◀ YOU ARE  │
│          Unbounded context & generation    │
│          Provenance · Quality · Compliance │
└────────────────────────────────────────────┘

MCP gives agents tools. A2A lets agents communicate. CRP is the foundation layer that gives every agent the context it needs to reason effectively — a layer neither MCP nor A2A provides.


Products Built on CRP

  • CRP Comply


    AI Governance & EU AI Act Compliance

    Drop-in compliance proxy for any OpenAI-compatible API. Change one URL — every LLM call is automatically PII-scanned, risk-classified, and written to a tamper-evident audit trail.


  • CRP Scribe


    AI-Powered Document Generation

    Generate long-form, structured documents — RFPs, compliance reports, technical manuals — with automatic multi-window continuation and quality-assessed output.


View all products
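
The one-URL change that CRP Comply describes amounts to swapping the base URL of an OpenAI-compatible request. Here is a standard-library sketch; the proxy URL is hypothetical, and the request is built but never sent:

```python
import json
import urllib.request

COMPLY_BASE = "https://comply.example.com/v1"  # hypothetical CRP Comply proxy URL

def chat_request(base_url: str, api_key: str, payload: dict) -> urllib.request.Request:
    """Build an OpenAI-compatible chat request; only base_url decides the route."""
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

# Same payload either way; routing through the proxy is what adds PII scanning,
# risk classification, and the audit trail.
req = chat_request(COMPLY_BASE, "sk-...", {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Summarize our DPIA."}],
})
```

Because the payload is unchanged, existing OpenAI SDK code only needs its base URL repointed at the proxy.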


  • Why CRP?

    How CRP differs from RAG, MemGPT, LangChain, MCP, and A2A. The 10 axioms. The 9 innovations.

  • Getting Started

    Install CRP and run your first dispatch in 5 minutes.

  • Protocol Specification

    Deep-dive into the architecture: envelope, extraction, continuation, CKF.

  • Compliance

    EU AI Act, ISO 42001, GDPR, and NIST AI RMF coverage.

  • API Reference

    Complete API documentation for the Python SDK.

  • Benchmarks

    Real numbers: 11.8x content, 6.1% overhead, zero throughput loss.


Contact