SanctumOS Documentation
SanctumOS is a modular, self-hosted agentic operating system for AI agent communication and management. It provides a neuro-inspired cognitive architecture where software components map to brain functions, giving agents memory consolidation, task orchestration, multi-platform communication, and tool access through the Model Context Protocol (MCP).
Quick Overview
- Self-Hosted: Your data stays on your infrastructure - no third-party cloud dependencies
- Modular: Mix and match components to fit your needs
- Neuro-Inspired: Components map to cognitive functions (Thalamus, Broca, Basal Ganglia, etc.)
- MCP-Native: Tool integration through the Model Context Protocol standard
- Extensible: Plugin-based system across the entire stack
Architecture
SanctumOS follows a neuro-inspired architecture that maps software components to cognitive functions. For a complete understanding of the system architecture, see the Architecture Overview.
Core Modules
| Module | Role | Repo |
|---|---|---|
| Sanctum Letta MCP (SMCP) | Plugin-based MCP server - the tool layer | sanctumos/smcp |
| Broca-2 | Message processing middleware - the speech center | sanctumos/broca |
| Sanctum Web Chat | Web interface bridge for browser-based agent chat | sanctumos/broca-web-client |
| Sanctum Router | Self-hosted OpenAI-compatible inference proxy | sanctumos/sanctum-router |
| Cochlea | Audio input pipeline (research release) | sanctumos/cochlea |
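Because the Sanctum Router exposes an OpenAI-compatible API, any OpenAI client library or plain HTTP can talk to it. The sketch below builds such a request with only the standard library; the URL, port, model name, and API key are placeholder assumptions, not values taken from the Sanctum Router documentation - check your deployment's configuration for the real ones.

```python
import json
import urllib.request

# Hypothetical local endpoint for the Sanctum Router; the actual host,
# port, and path depend on how your instance is deployed.
ROUTER_URL = "http://localhost:8000/v1/chat/completions"

# Build an OpenAI-compatible chat completion request body.
payload = {
    "model": "local-model",  # model name is configured in the router
    "messages": [{"role": "user", "content": "Hello, agent."}],
}
request = urllib.request.Request(
    ROUTER_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-local",  # placeholder key
    },
    method="POST",
)
# urllib.request.urlopen(request) would send it; omitted here so the
# sketch runs without a live router.
```

The point of an OpenAI-compatible proxy is exactly this: existing clients keep working when you swap the base URL from a cloud provider to your own infrastructure.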
SMCP Plugin Ecosystem
SMCP's plugin architecture has grown into a rich ecosystem. Each plugin exposes domain-specific tools to any MCP-connected agent. See the full SMCP Plugins overview for details.
| Plugin | What it does |
|---|---|
| smcp-cursor-cli | Run Cursor CLI agent sessions in headless mode |
| smcp-doc-manager | Letta sources/folders admin, markdown → PDF/DOCX export |
| smcp-image-analysis | Image interpretation via Venice AI vision API |
| smcp-gmail | Gmail as MCP tools (list, read, send, label) |
| smcp-plugin-github | GitHub CLI and Git wrappers for MCP |
| smcp-moltbook | Social network for AI agents (Moltbook) |
| smcp-bitlaunch.io | BitLaunch cloud server provisioning |
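The plugin pattern above can be pictured as a registry of named tools that an MCP-connected agent discovers and calls. The sketch below is a pure-Python stand-in for illustration only: the class, method, and tool names are hypothetical and do not reflect the actual SMCP plugin API.

```python
from typing import Callable, Dict, List

class ToolRegistry:
    """Illustrative stand-in for an MCP tool layer: maps tool names to callables."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable[..., str]] = {}

    def register(self, name: str, fn: Callable[..., str]) -> None:
        # A plugin registers each tool it exposes under a stable name.
        self._tools[name] = fn

    def list_tools(self) -> List[str]:
        # An agent enumerates available tools before deciding what to call.
        return sorted(self._tools)

    def call(self, name: str, **kwargs: str) -> str:
        # The agent invokes a tool by name with keyword arguments.
        return self._tools[name](**kwargs)

registry = ToolRegistry()
# A toy "gmail" plugin registering one tool, loosely mirroring smcp-gmail's role.
registry.register("gmail.send", lambda to, body: f"sent to {to}: {body}")
```

The design choice this illustrates is why the ecosystem composes: every plugin speaks the same tool interface, so adding a domain (email, GitHub, cloud provisioning) never changes the agent side.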
Getting Started
Prerequisites: Install the Docker instance of Letta to provide the foundation for your AI agent infrastructure.
> Tip: Run each module (including Letta) in separate screen sessions to keep them running in the background and easily accessible.
- Sanctum Installer (early alpha) - automated setup for the full Sanctum stack on WSL, Ubuntu, and Raspbian
- Individual Module Installation â follow the installation instructions in each module's repository:
- Sanctum Letta MCP
- Broca-2
- Sanctum Web Chat
- Sanctum Router
Ecosystem & Applications
Beyond core infrastructure, SanctumOS includes reference applications and utilities built on the stack. See the Ecosystem Overview for the full catalog.
| Project | Description |
|---|---|
| Sanctum CRM | AI-first, MCP-native CRM reference architecture |
| Clawed Road | Marketplace stack: EVM payments, PHP + Python, agent API |
| Code Buddy | GitHub webhook processor + MCP server for agent dev awareness |
| Origin Conversation | ChatGPT history search/export + MCP server |
| Venice Billing Monitor | Venice AI billing → Letta agent memory block updates |
| Sanctum Social | Social media management and automation |
| Sanctum DMS | Dealer Management System (API-first, PHP + SQLite) |
Configuration
- CMS JSON API (pages, profiles) - API keys, rate limits, and /api/endpoints for this site's CMS
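Assuming the CMS JSON API authenticates requests with an API key header, a client call might be assembled as below. The endpoint path, host, and header name (`X-API-Key`) are assumptions for illustration - verify them against the CMS API Quick Reference before use.

```python
import urllib.request

API_KEY = "your-api-key"  # placeholder; issued by the CMS

# Hypothetical pages endpoint; the real path is documented in the
# CMS API Quick Reference.
req = urllib.request.Request(
    "http://localhost/api/pages",
    headers={"X-API-Key": API_KEY},
)
# urllib.request.urlopen(req) would perform the GET; omitted so the
# sketch runs without a live CMS instance.
```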
Why Self-Hosted AI Matters
SanctumOS represents a fundamental shift from cloud-based AI to end-to-end self-hosted AI systems.
- Complete Data Sovereignty - your conversations, documents, and AI interactions never leave your infrastructure.
- Zero External Dependencies - no reliance on third-party APIs for core functionality. Your AI agents work entirely within your environment.
- Transparent Operations - every decision, every response, every piece of data processing happens on your hardware with full visibility.
- Privacy by Design - no data collection, no telemetry, no usage tracking.
- Customization Without Limits - modify, extend, and customize every aspect of your AI system without restrictions.
- Cost Predictability - if you host your own inference, there's no per-API-call pricing, no usage limits, no surprise bills.
- Regulatory Compliance - meet data residency requirements, industry regulations, and organizational policies without compromise.
Documentation
Getting Started
- Quick Start - your first steps with SanctumOS
- Installation Guide - detailed setup instructions
Architecture & Components
- Architecture Overview - complete system architecture
- Naming Rubric - consistent terminology guidelines
- Components - detailed component documentation (Thalamus, Basal Ganglia, Dream Agent)
Modules & Plugins
- Modules Overview - all SanctumOS modules
- SMCP Plugins - the MCP plugin ecosystem
- Sanctum Router - inference proxy
- Cochlea - audio input pipeline
Ecosystem
- Ecosystem Overview - reference applications and utilities
Reference
- CMS API Quick Reference - JSON API for this site's CMS
- Troubleshooting - common issues and solutions
- Contributing - how to contribute to SanctumOS
Community & Support
- GitHub: github.com/sanctumos
- Email: sanctumos@rizzn.com
License
- Code: AGPLv3 - GNU Affero General Public License v3.0
- Documentation: CC-BY-SA 4.0 - Creative Commons Attribution-ShareAlike 4.0
SanctumOS - Empowering AI agents with secure, modular communication.