Arachnia is an AI-driven multi-model development environment — a native, GPU-accelerated application built in Rust with direct integration across all major LLM inference providers. It is designed as a unified workspace where AI models have full awareness of your codebase, can collaborate alongside you in real time, and can propose, validate, and evolve code autonomously under your explicit review.
The AI layer is the core of the product. Silk Cortex provides streaming multi-provider chat with persistent memory context injected from the Resonance Weave recall ledger. Hivemind broadcasts a single prompt to N providers simultaneously for side-by-side comparative scoring and model evaluation. Every provider — Anthropic, OpenAI, xAI, Gemini, or a local Ollama model — routes through a single interface with no context switching.
Supporting intelligence is provided by the Spiderling analysis layer — concurrent per-file workers that surface severity-ranked findings and feed them directly into AI context — and the Venom Suite, which combines AI-driven code transformation (Venom Lattice), adversarial security payload generation (Venom), tree-sitter structural editing (AST Surgeon), and a multi-AI sandboxed build tournament (Evolution Engine).
Arachnia Live Session is the collaborative layer — a two-party real-time co-development protocol that synchronizes editor state, ghost cursor, selection range, open tabs, typed AI memory entries, and peer chat through the relay at arachnia.stream:8473. Two developers share one codebase, one AI context, and one conversation thread simultaneously, with every spiderling finding and terminal event crossing to the peer as structured memory.
Arachnia is structured as cooperating intelligence layers. Each layer is independent at runtime, communicating through shared application state rather than direct coupling. The rendering pipeline is single-threaded egui; all AI calls and analysis workers run on async or OS thread pools outside the render loop. Idle CPU usage is 0%.
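The shared-state pattern described above can be sketched roughly as follows. The types, field names, and the `run_worker` function are illustrative, not Arachnia's actual internals; the point is that workers write through shared state off the render thread, and the render loop only reads:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Hypothetical shared state: analysis workers write findings here,
// and the single-threaded render loop only reads them each frame.
#[derive(Debug, Clone, PartialEq)]
pub struct Finding {
    pub file: String,
    pub message: String,
}

#[derive(Default)]
pub struct SharedState {
    pub findings: Mutex<Vec<Finding>>,
}

// A worker runs on an OS thread outside the render loop and publishes
// results through the shared state rather than calling into other modules.
pub fn run_worker(state: Arc<SharedState>, file: &str) -> thread::JoinHandle<()> {
    let file = file.to_string();
    thread::spawn(move || {
        let finding = Finding { file, message: "unused variable".into() };
        state.findings.lock().unwrap().push(finding);
    })
}
```

The render loop would lock the same `findings` vector each frame; because workers hold the lock only for a push, the UI never blocks on analysis.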
Core capabilities are organized by surface area below. Every feature is compiled into the binary — no extensions, no plugin system, no network calls at startup.
| Area | Capability |
|---|---|
| AI Chat | Streaming multi-provider chat: Anthropic, OpenAI, xAI, Gemini, and Ollama local models unified in one interface. Resonance memory context injected automatically. Configurable temperature, max-tokens, and system prompt per session. Markdown response rendering. |
| Multi-model | Hivemind: broadcast a single prompt to N providers concurrently, responses rendered side-by-side with hybrid comparative scoring. Latency, quality tier, and model-specific telemetry tracked per response with optimization rules that evolve as patterns emerge. |
| Editor | Multi-tab with syntect syntax highlighting, per-tab undo tree (60 snapshots), indent rainbow tints, error lens inline diagnostics, ghost cursor badge + translucent ghost selection overlay for the collab peer, Ctrl+K inline AI palette at cursor. |
| Analysis | Concurrent Spiderling workers per file, severity-ranked findings (error / warning / hint / info), live sync to collab peer on analysis completion, Amber embedding-based semantic search and grounded recall across the full workspace. |
| Mutations | Venom Lattice: AI-driven transformation proposals as structured diffs, per-change accept/reject. AST Surgeon: tree-sitter structural edits (rename symbol, extract function, inline variable) operating on the parsed AST, not raw text. |
| Evolution | Evolution Engine: multi-AI sandboxed build tournament. Candidates compiled in isolated Cargo workspace (symlinked target/ — no full recompile), scored on three tiers (build gate / static delta / AI judge), ranked in Evolution Ledger. sled-backed cross-session snapshot memory. Human review required before any change merges. |
| Security | Venom adversarial security suite: generates concrete attack payloads in a network-blocked, 13-layer constrained sandbox. Execution is fully isolated from the live workspace. All payloads are surfaced as reviewable artifacts, never auto-applied. |
| Memory | Resonance Weave extracts typed insights from conversations into a persistent recall ledger injected as grounded system context on future queries. Memory entries are shared across collab sessions via relay. Amber semantic index enables embedding-based recall. |
| Collaboration | Arachnia Live Session: bidirectional editor sync, ghost cursor and selection range, open tab list, peer chat cross-talk, 6-kind typed memory sync (insight / chat_message / terminal / spiderling / edit_pattern / warning), FlushGate serialized push/pull, 60-second heartbeat auto-disconnect. |
| Terminal | Integrated terminal with command monitoring. Exit codes and last-40-lines output automatically create collab memory entries (terminal kind) on command completion, injected into both peers' AI context. |
| Visualization | NeuralWeave: force-directed live dependency graph with AI-annotated weak strands (high fan-out, circular deps, dead code), manual Cast Web trigger (zero idle CPU), four sub-tabs. Consciousness health sphere. VibeCanvas freeform node workspace. |
| Planning | AI-driven task decomposition. Goal → executable step tree with sub-tasks, dependencies, and status tracking. Plans persist per workspace in the project config directory. |
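The Evolution Engine's three-tier scoring can be sketched as a ranking key. The struct, the weighting, and the 0–100 judge scale are assumptions for illustration; the source specifies only the three tiers (build gate, static delta, AI judge) and that candidates are ranked in the Evolution Ledger:

```rust
// Hypothetical candidate mirroring the three scoring tiers:
// a hard build gate, a static-analysis delta, and an AI-judge score.
#[derive(Debug, Clone, Copy)]
pub struct Candidate {
    pub builds: bool,      // tier 1: build gate (hard fail)
    pub static_delta: i32, // tier 2: findings removed minus findings added
    pub judge_score: u32,  // tier 3: AI judge, assumed 0..=100 scale
}

// Build failures rank below everything; among buildable candidates, a better
// static delta dominates, with the judge score breaking ties. The weights
// here are illustrative, not taken from the source.
pub fn rank_key(c: &Candidate) -> i64 {
    if !c.builds {
        return i64::MIN;
    }
    c.static_delta as i64 * 1_000 + c.judge_score as i64
}
```

Sorting a ledger by `std::cmp::Reverse(rank_key(c))` then yields best-first order, with unbuildable candidates last regardless of their other scores.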
Each module is a Rust source file that owns its rendering, state, and logic. Modules communicate through shared application state — no inter-module method calls across boundaries.
| Binding | Action |
|---|---|
| Ctrl+K | AI command palette — inline prompt at cursor position |
| Ctrl+Shift+Z | Semantic undo history — AI-aware snapshot time-travel |
| Ctrl+P | Quick open file — fuzzy search across workspace |
| Ctrl+S | Save current file |
| Ctrl+, | Open settings panel |
| Ctrl+= / Ctrl+- | Zoom in / zoom out |
| Ctrl+0 | Reset zoom to 100% |
| Ctrl+Shift+I | Switch to IDE Mode |
| Ctrl+Shift+V | Switch to Vibe Mode |
| Ctrl+Shift+J | Toggle Zen sub-mode (Vibe Mode only) |
Configure API keys in settings. All routing is local — keys are never transmitted to the relay.
| Provider | Models | Endpoint |
|---|---|---|
| Anthropic | claude-3-5-sonnet claude-3-7-sonnet claude-3-opus claude-3-haiku | api.anthropic.com |
| OpenAI | gpt-4o gpt-4-turbo o1-preview | api.openai.com |
| xAI | grok-beta | api.x.ai |
| Gemini | gemini-1.5-pro gemini-1.5-flash | generativelanguage.googleapis.com |
| Ollama | any local model | localhost:11434 |
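The single-interface routing implied by the table can be sketched as a plain enum. The endpoints come from the table; the enum and method names are illustrative, not Arachnia's actual types:

```rust
// One enum covers every provider; the rest of the app routes through it
// without caring which backend is selected.
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum Provider {
    Anthropic,
    OpenAi,
    XAi,
    Gemini,
    Ollama,
}

impl Provider {
    // Endpoint hosts as listed in the provider table above.
    pub fn endpoint(self) -> &'static str {
        match self {
            Provider::Anthropic => "api.anthropic.com",
            Provider::OpenAi => "api.openai.com",
            Provider::XAi => "api.x.ai",
            Provider::Gemini => "generativelanguage.googleapis.com",
            Provider::Ollama => "localhost:11434",
        }
    }
}
```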
Ollama models are discovered automatically by querying localhost:11434/api/tags. The model list populates without manual configuration, and no API key is required.
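A minimal sketch of that discovery step: extracting model names from an Ollama `/api/tags` response body. A real client would use an HTTP and JSON library; this dependency-free string scan only illustrates the shape of the payload, and the function name is hypothetical:

```rust
// Pull every "name" field out of a /api/tags JSON body, e.g.
// {"models":[{"name":"llama3:latest"},{"name":"mistral:7b"}]}.
// A naive scan, not a real JSON parser; sufficient for illustration only.
pub fn model_names(tags_json: &str) -> Vec<String> {
    let mut names = Vec::new();
    let mut rest = tags_json;
    while let Some(i) = rest.find("\"name\":\"") {
        rest = &rest[i + 8..]; // skip past `"name":"`
        if let Some(j) = rest.find('"') {
            names.push(rest[..j].to_string());
            rest = &rest[j + 1..];
        } else {
            break;
        }
    }
    names
}
```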
Arachnia Live Session is a two-party collaboration protocol built on the public relay at arachnia.stream:8473. One peer hosts by generating a token; the second joins by entering it. The relay carries all sync payloads in memory — no data is written to disk. Sessions auto-expire after 90 seconds of inactivity. No account, authentication, or configuration is required beyond the token.
Session Lifecycle
The sync loop runs on a 2-second cycle. A FlushGate serializes push and pull operations — if a push is in-flight, the next pull is deferred until the push completes. This prevents cache corruption when the same author writes and reads within the same cycle window. Both roles (Live and Joined) push and pull symmetrically.
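The FlushGate's push-before-pull discipline can be sketched with a single atomic flag. The type and method names here are illustrative; the real gate is internal to Arachnia:

```rust
use std::sync::atomic::{AtomicBool, Ordering};

// While a push is in-flight, pulls are deferred so the same author never
// reads back a half-written cycle.
#[derive(Default)]
pub struct FlushGate {
    push_in_flight: AtomicBool,
}

impl FlushGate {
    /// Returns true if the push slot was acquired.
    pub fn begin_push(&self) -> bool {
        self.push_in_flight
            .compare_exchange(false, true, Ordering::AcqRel, Ordering::Acquire)
            .is_ok()
    }

    pub fn end_push(&self) {
        self.push_in_flight.store(false, Ordering::Release);
    }

    /// A pull is allowed only when no push is in-flight; otherwise the
    /// caller defers it to the next cycle.
    pub fn try_pull(&self) -> bool {
        !self.push_in_flight.load(Ordering::Acquire)
    }
}
```

Because both roles push and pull symmetrically, the same gate logic runs on the Live and Joined sides.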
Editor State Fields
Synchronized each cycle via /session/push and /session/pull:
| Field | Description |
|---|---|
| file_path | Active file path — peer editor navigates to matching file automatically on receipt |
| content | Full file content — applied as a non-destructive content patch if the file is already open |
| cursor_line, cursor_col | Cursor position — rendered as a labelled ghost cursor badge in the peer's editor gutter |
| sel_start_line, sel_start_col | Selection start — lower bound of the ghost selection overlay rendered in peer's editor |
| sel_end_line, sel_end_col | Selection end — upper bound, rendered as a translucent tinted block over the peer's text |
| tabs | Open tab filenames — dim ○ in tab bar when peer has file open, ● when peer is actively editing it |
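The synchronized fields above map naturally onto a plain struct. Field names follow the protocol table; the apply function is an illustrative sketch of the "non-destructive content patch" behavior, not Arachnia's actual implementation:

```rust
// One cycle's worth of synchronized editor state.
#[derive(Debug, Clone, Default, PartialEq)]
pub struct EditorSync {
    pub file_path: String,
    pub content: String,
    pub cursor_line: usize,
    pub cursor_col: usize,
    pub sel_start_line: usize,
    pub sel_start_col: usize,
    pub sel_end_line: usize,
    pub sel_end_col: usize,
    pub tabs: Vec<String>,
}

// On receipt, the peer patches the buffer if the file is already open;
// otherwise it opens (navigates to) the file first.
pub fn apply_peer_state(open_files: &mut Vec<(String, String)>, sync: &EditorSync) {
    match open_files.iter_mut().find(|(p, _)| *p == sync.file_path) {
        Some((_, buf)) => *buf = sync.content.clone(), // content patch
        None => open_files.push((sync.file_path.clone(), sync.content.clone())),
    }
}
```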
Memory Entry Kinds
Memory entries are pushed via /session/memory/push and pulled via /session/memory/pull, which returns entries merged from all authors, sorted ascending by timestamp. Each entry carries a typed kind field that controls how it is handled on receipt:
| kind | Source | Effect on peer |
|---|---|---|
| insight | Resonance Weave extraction | Injected into AI system context as grounded workspace knowledge on the next query |
| chat_message | Collab overlay input field | Appears as cross-talk in peer's collab panel; displayed with author attribution and timestamp |
| terminal | Terminal command completion | Last 40 lines of output + exit code injected as AI context; shown as system message in chat view |
| spiderling | Analysis worker finding | Severity-ranked finding injected as AI context; visible in peer's findings panel with source location |
| edit_pattern | Accepted Venom mutation | Pattern recorded and used to influence framing of future Venom Lattice proposals for this codebase |
| warning | Venom Lattice alert | High-severity warning surfaced in peer's notification overlay immediately on next pull cycle |
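The six kinds above can be modeled as an enum with a parse step on receipt. The enum and helper names are illustrative; `injects_ai_context` reflects the table's "injected as AI context" column for the insight, terminal, and spiderling kinds:

```rust
// Typed memory-entry kinds carried in each entry's `kind` field.
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum MemoryKind {
    Insight,
    ChatMessage,
    Terminal,
    Spiderling,
    EditPattern,
    Warning,
}

// Kinds that the table says are injected into the peer's AI context.
pub fn injects_ai_context(kind: MemoryKind) -> bool {
    matches!(
        kind,
        MemoryKind::Insight | MemoryKind::Terminal | MemoryKind::Spiderling
    )
}

// Parse the wire-format kind string; unknown kinds are rejected rather
// than guessed at.
pub fn parse_kind(s: &str) -> Option<MemoryKind> {
    Some(match s {
        "insight" => MemoryKind::Insight,
        "chat_message" => MemoryKind::ChatMessage,
        "terminal" => MemoryKind::Terminal,
        "spiderling" => MemoryKind::Spiderling,
        "edit_pattern" => MemoryKind::EditPattern,
        "warning" => MemoryKind::Warning,
        _ => return None,
    })
}
```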
Auto-push Triggers
Memory is pushed on a 30-second cooldown, but certain workspace events bypass the cooldown and trigger an immediate push.
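A minimal sketch of that cooldown-with-bypass scheduling follows. Which events count as bypass triggers is defined by the product; here that decision is reduced to a boolean flag, and the type name is hypothetical:

```rust
use std::time::{Duration, Instant};

// Tracks the 30-second auto-push cooldown. Bypass events push immediately
// and reset the cooldown window.
pub struct PushScheduler {
    last_push: Option<Instant>,
    cooldown: Duration,
}

impl PushScheduler {
    pub fn new() -> Self {
        Self { last_push: None, cooldown: Duration::from_secs(30) }
    }

    /// Returns true if a push should fire now; records the push time if so.
    pub fn should_push(&mut self, now: Instant, bypass: bool) -> bool {
        let due = match self.last_push {
            None => true,
            Some(t) => now.duration_since(t) >= self.cooldown,
        };
        if due || bypass {
            self.last_push = Some(now);
            true
        } else {
            false
        }
    }
}
```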
Rate Limits and Caps
| Limit | Value | Notes |
|---|---|---|
| Sync cycle | 2 seconds | Both roles, full push+pull each cycle |
| Memory auto-push cooldown | 30 seconds | Bypassed by the 4 trigger events above |
| Session creates | 5 / min / IP | Enforced at relay, returns 429 on excess |
| Memory entries per author | 500 | Oldest entries evicted when cap reached |
| Session idle expiry | 90 seconds | Measured from last push or heartbeat |
| Heartbeat timeout | 60 seconds | Client-side: auto-disconnect and reset to Idle |
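The relay-side create limit (5 per minute per IP, 429 on excess) can be sketched as a sliding-window counter. This is an assumed enforcement strategy for illustration; the relay's actual mechanism is not documented here, and a real implementation would also prune stale IP entries:

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

// Sliding-window limiter: at most `max` session creates per `window` per IP.
pub struct CreateLimiter {
    window: Duration,
    max: usize,
    hits: HashMap<String, Vec<Instant>>,
}

impl CreateLimiter {
    pub fn new() -> Self {
        Self { window: Duration::from_secs(60), max: 5, hits: HashMap::new() }
    }

    /// Returns true if the create is allowed; false would map to HTTP 429.
    pub fn allow(&mut self, ip: &str, now: Instant) -> bool {
        let hits = self.hits.entry(ip.to_string()).or_default();
        hits.retain(|t| now.duration_since(*t) < self.window);
        if hits.len() < self.max {
            hits.push(now);
            true
        } else {
            false
        }
    }
}
```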
The relay speaks JSON over HTTP. Tokens are normalized to uppercase. Sessions expire after 90 seconds idle; session creates are rate-limited to 5 per minute per IP. Request bodies take the form {"token":"ABC"} and successful calls return {"ok":true}; the session-count response is {"ok":true,"sessions":N}.
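Token normalization is the only client-side transform mentioned, and it can be sketched in one line. Trimming surrounding whitespace is an added assumption; the source states only that tokens are uppercased:

```rust
// Normalize a session token before sending it to the relay. Tokens are
// compared case-insensitively, so both sides uppercase first.
// (Whitespace trimming is assumed, not documented.)
pub fn normalize_token(raw: &str) -> String {
    raw.trim().to_ascii_uppercase()
}
```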
WebRTC peer connections, a TURN relay at turn.arachnia.stream:3478, and an SSH gateway are scoped for a future release. This page will document the tunnel client, port forwarding, and connection commands.
Build target: x86_64-pc-windows-msvc. Size budget: <12 MB binary.
Source remains private until the initial public release.