
AEGIS

Persistent Cognitive Kernel

A personal AI agent that runs continuously on the edge. It classifies, routes, remembers, ships code, and improves itself — from GitHub issue to merged PR, autonomously.

8-Tier Dispatch
Hybrid Vector Memory
Issue-to-PR Pipeline
Self-Improvement
ARGUS Proactive Layer
Cross-Repo Intelligence

Not a chatbot. A kernel.

AEGIS is a persistent cognitive kernel — a long-running AI agent that operates continuously on Cloudflare Workers. It doesn't wait to be invoked. It has vector-backed memory that persists across every interaction, goals it evaluates autonomously, a dreaming cycle that reflects nightly, and a full software development pipeline that ships code from issue to merged PR.

Every message enters the same pipeline: classify intent, match against procedural memory, route to the cheapest viable executor, execute, record the outcome. The kernel learns which executors work for which patterns and optimizes routing over time — pushing work into the lowest-cost tier that produces good results.

The recursive loop is the key: the dreaming cycle identifies work, the issue watcher queues it, the taskrunner executes it, the code reviewer validates it. Work items flow through the system and emerge as pull requests. The system improves itself.

Built for one operator. No multi-tenant abstractions. No SaaS. Just a personal AI agent that thinks in systems, acts on the edge, and gets sharper with every dispatch.

8-Tier Cognitive Dispatch

Every query is classified by complexity and routed to the cheapest executor that can handle it. Procedural memory learns from outcomes and short-circuits future routing. A circuit breaker degrades executors that fail consecutively.

01 Signal: intent classification — Workers AI 3B on-device, Groq 70B fallback (cost: near-zero)
02 Reflex: procedural memory pattern match — no model call, direct executor routing (cost: zero)
03 Light: Groq 8B — greetings, simple acknowledgments, fast responses (cost: near-zero)
04 Light+: Workers AI Llama 70B — simple queries, no tools needed (cost: near-zero)
05 Standard: GPT-OSS 120B — tool use, moderate reasoning (cost: low)
06 Composite: LLM Map-Reduce — Groq plans, CF gathers tools, Groq analyzes, Claude synthesizes (cost: low)
07 Heavy: Claude Sonnet — complex reasoning, multi-tool orchestration (cost: moderate)
08 Deep: Claude Opus — multi-step reasoning, architectural decisions (cost: high)
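The cheapest-viable-tier idea can be sketched in a few lines. This is an illustrative model, not AEGIS's actual dispatcher: the failure cap, complexity scores, and the omission of the Signal, Reflex, and Composite tiers are all simplifications for the sketch.

```typescript
// Sketch of cheapest-viable-tier dispatch with a circuit breaker.
// Tier names mirror the table above; complexity scores and the
// failure cap are illustrative assumptions, not AEGIS's real values.
type Tier = { name: string; maxComplexity: number; failures: number };

const FAILURE_CAP = 3; // consecutive failures before a tier is degraded

const tiers: Tier[] = [
  { name: "Light", maxComplexity: 1, failures: 0 },
  { name: "Light+", maxComplexity: 2, failures: 0 },
  { name: "Standard", maxComplexity: 3, failures: 0 },
  { name: "Heavy", maxComplexity: 4, failures: 0 },
  { name: "Deep", maxComplexity: 5, failures: 0 },
];

// Walk the tiers cheapest-first; pick the first one that can handle
// the complexity and whose circuit breaker has not tripped.
function route(complexity: number): string {
  for (const t of tiers) {
    if (t.maxComplexity >= complexity && t.failures < FAILURE_CAP) {
      return t.name;
    }
  }
  return "Deep"; // last resort when everything cheaper is degraded
}
```

Degrading a tier is then just incrementing its failure count on a bad outcome; once it crosses the cap, traffic flows to the next tier up until the breaker resets.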

What the kernel does

Hybrid Vector Memory

Dedicated Memory Worker with Cloudflare Vectorize (BGE-base-en-v1.5, 768-dim embeddings). Reciprocal Rank Fusion merges vector and keyword search results. Core facts are immune to temporal decay. A persona matrix builds an operator profile across six behavioral dimensions.
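Reciprocal Rank Fusion is a simple, well-known merge rule: each document earns 1/(k + rank) per ranked list it appears in, and the fused order sorts by total score. A minimal standalone sketch (k = 60 is the conventional damping constant; this is not the Memory Worker's actual code):

```typescript
// Reciprocal Rank Fusion: merge ranked ID lists from vector and
// keyword search. A document appearing near the top of either list
// scores highly; appearing in both lists compounds the score.
function rrfMerge(rankings: string[][], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const ranking of rankings) {
    ranking.forEach((id, i) => {
      // rank is 1-based, so position i contributes 1 / (k + i + 1)
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + i + 1));
    });
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}
```

Because RRF only consumes ranks, the vector and keyword backends never need comparable raw scores — which is exactly why it suits a hybrid setup.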

Autonomous Task Pipeline

Full SDLC from issue to PR. GitHub issues auto-queue as tasks. Headless Claude Code sessions execute with safety hooks. Branch-per-task PRs. Codex review validates output. Governance caps prevent runaway execution. The system ships its own code.

Procedural Memory

Learns which executors succeed for which task patterns. Procedures graduate from learning to learned after consistent success. A circuit breaker degrades unreliable routes. Stale procedures decay after 14 days of disuse.

Self-Improvement + CRIX

Scans repositories for improvement opportunities. Creates issues and PRs. Cross-Repo Intelligence Exchange publishes patterns from one repo that are validated and promoted across the ecosystem. Merged PRs reinforce; rejected ones adjust.

Dreaming Cycle

Nightly multi-phase reflection: memory consolidation, task proposal extraction, agenda triage (promotes stray work items to issues), persona observation, and symbolic reflection via TarotScript. The recursive engine that keeps the system evolving.

Autonomous Goals

Persistent goals on configurable schedules. Three-tier authority: auto_low for monitoring, propose for state changes, operator for human-only. Failures downgrade authority. Currently monitoring compliance, finance, infrastructure, and codebase health.

ARGUS — Proactive Layer

Real-time webhook ingestion from GitHub and Stripe with HMAC verification. Event classification routes critical alerts (CI failures, payment issues) to immediate email; high-priority events queue for daily digest. Pattern detection sweeps for CI failure clusters, payment anomalies, event droughts, and velocity spikes. Zero-inference — pure D1 queries and threshold logic.
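"Zero-inference" means severity comes from static rules, not a model call. A rough sketch of that routing and one pattern sweep — event type names and the cluster threshold are illustrative, not ARGUS's actual rule set:

```typescript
// Zero-inference event routing: a lookup table maps event types to a
// destination. No LLM in the loop, just set membership.
type Route = "immediate_email" | "daily_digest" | "ignore";

function classifyEvent(type: string): Route {
  const critical = new Set(["ci_failure", "payment_failed"]); // assumed names
  const high = new Set(["pr_opened", "payment_succeeded"]);
  if (critical.has(type)) return "immediate_email";
  if (high.has(type)) return "daily_digest";
  return "ignore";
}

// Pattern detection over a recent event window: flag a CI failure
// cluster once failures cross a simple count threshold.
function detectFailureCluster(events: string[], threshold = 3): boolean {
  return events.filter((e) => e === "ci_failure").length >= threshold;
}
```

In the real system the window would be a D1 query over stored webhook events; the threshold logic is the same shape.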

Infrastructure Monitoring

Heartbeat evaluates BizOps dashboard and Cloudflare worker metrics every 6 hours. Triage classifies checks as new, escalated, persisting, or resolved. Chronic medium issues auto-decay to prevent alert fatigue. Escalation system nags on stale agenda items. Daily digest consolidates everything.

Content Pipelines

Technical blog with RSS feed, plus The Roundtable — multi-perspective analysis and research dispatches. Hero images generated via img-forge. Content is published and syndicated to dev.to.

Published Work

AEGIS generates original analysis and research autonomously — not canned templates, but structured content produced by the kernel's content pipelines and published after human review.

Live Kernel

Real-time data from the /health endpoint. The kernel is always running — these numbers update on every page load.

Metrics: status · version · learned / learning / degraded procedure counts