The cognition layer for AI agents.
One gateway. Seven planes. Up to 91% fewer tokens.
Why standard tools fail agents.
Agents process language natively. Forcing them to communicate via dense infrastructure JSON floods their context window and cripples their autonomy.
Context Waste
Agents waste 55-105K tokens per session navigating tool schemas and repetitive JSON structures.
No Memory
Every session starts completely empty. Zero persistence of yesterday's architecture decisions or working context.
No Trust Layer
Invisible boundaries. No cryptographic proof connecting what the agent actually did to what the human reviewer was told.
Up to 91% Compression on the Wire
Bigger Context Windows Are Compounding Debt.
CTX is Compounding Value.
Without CTX
More window → more overhead → more cost → diminishing returns
With CTX
Same window → less overhead → more reasoning space → compounding returns
Restore the Context Window
Tokens spent on infra syntax are tokens stolen from agent reasoning.
The Operating System for AI
POSIX gave processes a common interface to hardware. CTX gives agents a common interface to cognition. The analogy runs through every layer of the stack.
The ISA (Grammar)
Instead of x86 instructions, AgentCTX defines a deterministic operational grammar (CTX). Models emit raw syntax strings that are guaranteed to compile, eliminating JSON-RPC hallucinations and validation loops.
The Kernel (Gateway)
The centralized routing engine. It parses CTX streams, enforces zero-trust memory segmentation, and orchestrates agent-to-agent Inter-Process Communication (IPC) via internal memory graphs.
The Process (Agent)
LLM agents operate as isolated runtimes. They can spawn child processes (sub-agents) and pass file descriptors (memory pointers), while the Kernel handles state management and fault tolerance.
Memory Paging (CTX/CAS)
The system never transmits redundant context. Using Content-Addressable Storage (CAS) and the CTX grammar, the OS pages state in and out of the context window with 99% efficiency.
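The content-addressable idea can be sketched in a few lines (the class and digest scheme here are illustrative, not AgentCTX's actual API): state blobs are stored under the hash of their own content, so a block of context that already exists in the store never needs to cross the wire again.

```python
import hashlib

class ContentStore:
    """Toy content-addressable store: blobs are keyed by their SHA-256 digest."""

    def __init__(self):
        self._blobs: dict[str, bytes] = {}

    def put(self, blob: bytes) -> str:
        # The address IS the content hash, so identical blobs share one entry.
        digest = hashlib.sha256(blob).hexdigest()
        self._blobs[digest] = blob
        return digest

    def get(self, digest: str) -> bytes:
        return self._blobs[digest]

store = ContentStore()
a = store.put(b"system prompt: you are a helpful coding agent")
b = store.put(b"system prompt: you are a helpful coding agent")  # duplicate blob
assert a == b and len(store._blobs) == 1  # stored once; only the short digest is re-sent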
The Sidecar Compiler
AgentCTX is a compiler, not an LLM.
Instead of relying on unpredictable LLM middleware, AgentCTX uses a deterministic Rust compiler (the sidecar) to parse agent commands into 7 distinct target formats, including:
- SurrealQL: For vector search, memory, and analytics
- REST & GraphQL: For legacy API integration
- MCP JSON-RPC: For native Model Context Protocol tools
- CTXB: Binary machine-to-machine transport encoding
Every translation is cryptographically signed using Ed25519, creating an immutable, verifiable audit trail of what the agent actually saw and did.
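As a rough sketch of that pipeline shape (the command syntax below is invented for illustration, not the real CTX grammar, and the Ed25519 step uses the widely available `cryptography` package rather than AgentCTX's sidecar): deterministic parse, canonical target format, signed receipt.

```python
import json
from cryptography.hazmat.primitives.asymmetric import ed25519

def ctx_to_jsonrpc(command: str, request_id: int = 1) -> bytes:
    """Translate a toy 'verb arg=value ...' command into an MCP-style
    JSON-RPC tools/call. (Hypothetical grammar, not actual CTX syntax.)"""
    verb, *pairs = command.split()
    params = dict(p.split("=", 1) for p in pairs)
    payload = {"jsonrpc": "2.0", "id": request_id,
               "method": "tools/call",
               "params": {"name": verb, "arguments": params}}
    # Canonical serialization keeps the translation deterministic:
    # the same command always yields byte-identical wire output.
    return json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()

signing_key = ed25519.Ed25519PrivateKey.generate()

wire = ctx_to_jsonrpc("memory.search tag=architecture limit=5")
receipt = signing_key.sign(wire)  # 64-byte Ed25519 signature over the exact bytes sent

# Any reviewer holding the public key can verify what the agent actually saw and did;
# verify() raises InvalidSignature if either the wire bytes or the receipt were altered.
signing_key.public_key().verify(receipt, wire)
```

Because the serialization is canonical, the signature binds the receipt to one unambiguous byte sequence, which is what makes the audit trail verifiable after the fact.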
The Architecture
6 tiers. Zero bloat. Every layer earns its place.
Full-Stack Compression
Bigger context windows are compounding debt. CTX is compounding value.
Comprehensive Control
View All 100+ Features & Roadmap →
Seven Context Planes
Tools, knowledge, memory, skills, agents, inspection, LLM — one unified cryptographic gateway.
5.7× Token Compression
Measured across 12 operation types, not theoretical limits.
Persistent Memory
Cross-session, tag-filtered, with sub-millisecond lookup that stays flat up to 50,000 entries.
MCP-Native Proxy
Lazily spawned, role-scoped, namespaced. Zero vendor lock-in.
8-Layer Security
From WASM parse-time validation to distributed threshold cryptography.
Agent-to-Agent
15 tokens per protocol message vs 2,000+ tokens in prose manifests.
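To get a rough feel for that gap (the compact message grammar below is invented, and the count uses a naive whitespace split rather than a real tokenizer), compare a terse protocol line against the prose manifest it replaces:

```python
# Hypothetical CTX-style agent-to-agent message (not the real protocol syntax).
compact = "@planner !delegate review pr=482 scope=security"

verbose = """Agent planner, please delegate a task to the review agent.
The task is to review pull request number 482. The scope of the
review should be limited to security concerns. Please confirm
receipt of this request and report back when the review is done."""

def rough_tokens(text: str) -> int:
    return len(text.split())  # crude whitespace proxy for LLM tokens

assert rough_tokens(compact) < rough_tokens(verbose)
```

Real tokenizer counts will differ, but the order-of-magnitude gap between a fixed grammar and free prose is what the 15-vs-2,000 figure is pointing at.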
Signed Translations
Every action carries a deterministic Ed25519 cryptographic receipt.
Rust Multi-Lang Core
TypeScript foundation + Rust parser (Node.js, WASM, Python, native binaries).
Convergent Deduplication
O(1) storage scaling across large fleets. 100 agents securely discover and deduplicate identical semantic insights.
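A minimal sketch of the convergence idea (the crude lowercase/whitespace canonicalization below stands in for whatever semantic normalization the real system applies, and the function names are invented): when agents hash a canonical form of an insight, a fleet of 100 converges on one stored copy, so storage grows with unique insights rather than fleet size.

```python
import hashlib

def canonical(insight: str) -> str:
    # Crude canonicalization as a stand-in for semantic normalization.
    return " ".join(insight.lower().split())

fleet_store: dict[str, str] = {}

def record_insight(agent_id: int, insight: str) -> str:
    """Each agent writes independently; identical canonical forms collide on purpose."""
    key = hashlib.sha256(canonical(insight).encode()).hexdigest()
    fleet_store.setdefault(key, canonical(insight))  # first writer wins; O(1) insert
    return key

# 100 agents independently phrase the same finding two different ways.
phrasings = [
    "Retry logic must back off exponentially.",
    "retry  logic MUST back off exponentially.",
]
keys = {record_insight(i, phrasings[i % 2]) for i in range(100)}
assert len(keys) == 1 and len(fleet_store) == 1  # one copy for the whole fleet
```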
Automation Engine
Run scheduled background tasks, stateless cron agents, and complex DAG workflows without human-in-the-loop dependencies.
Sentinel Alignment
Strictly enforce groundedness with pre/post execution hooks. Detect divergence, anomalies, and sandbagging across the fleet in real time.
Deployment Tiers
From local open-source testing to enterprise-scale autonomous swarms.
Free (Local Open Source)
Full Apache-2.0 CLI, local encrypted credential storage (BYOK + 1Password), L1 compaction, Ed25519 signing, and local SurrealDB. 1-agent limit enforced via WASM compilation boundary.
Perfect for local scripting and personal AI assistants.
Terminal Velocity
Configure the entire context stack in five commands:
npm create agentctx@latest
Ready to restore agent autonomy?
Start saving tokens and gaining cryptographic trust in your agent workflows today.