SanctumOS

The Modular, Self-Hosted Agentic Operating System

Quick Start Guide

Get up and running with SanctumOS in minutes! This guide will help you set up a basic SanctumOS deployment and start communicating with AI agents.

đŸŽ¯ What You'll Build

By the end of this guide, you'll have:

  • A running Sanctum Letta MCP server
  • A Broca-2 instance connected to an AI agent
  • A web chat interface for agent communication
  • Basic understanding of the SanctumOS architecture

⚡ 5-Minute Setup

Step 1: Install Sanctum Letta MCP

# Clone and install SMCP
git clone https://github.com/sanctumos/smcp.git
cd smcp
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt

# Start the MCP server
python smcp/mcp_server.py



The server will start on http://localhost:8000. You should see:


INFO:smcp.mcp_server:Starting MCP server on http://0.0.0.0:8000
INFO:smcp.mcp_server:Discovered plugin: botfather
INFO:smcp.mcp_server:Discovered plugin: devops


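If you prefer to verify the server from code rather than a browser, you can send the same JSON-RPC health call shown in Step 5. Below is a minimal Python sketch using the requests library (an extra dependency: pip install requests); the URL and payload mirror the curl example later in this guide:

# check_mcp.py - sketch only; mirrors the Step 5 curl health check
import requests

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "health", "arguments": {}},
}

# POST the JSON-RPC request to the local MCP server and print the raw reply
response = requests.post("http://localhost:8000/messages/", json=payload, timeout=10)
print(response.status_code)
print(response.text)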

Step 2: Install Broca-2

Open a new terminal and install Broca-2:


# Clone and install Broca-2
git clone https://github.com/sanctumos/broca.git
cd broca
pip install -r requirements.txt

# Configure environment
cp .env.example .env
# Edit .env with your agent settings (see Step 3)

# Start Broca-2
python main.py



Step 3: Configure Your AI Agent

Edit the .env file in the Broca-2 directory:


# .env
AGENT_ENDPOINT=https://your-agent-api.com
AGENT_API_KEY=your-agent-api-key
MESSAGE_MODE=live
DEBUG_MODE=true



Note: You'll need access to a Letta agent or compatible AI agent API. If you don't have one, you can use the "echo" mode for testing:


# For testing without an agent
MESSAGE_MODE=echo
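If you want to sanity-check your .env values before starting Broca-2, a small script can help. The sketch below is not Broca-2's own configuration loader, just an illustration; it assumes the python-dotenv package (pip install python-dotenv) and the variable names shown above:

# check_env.py - illustration only, not Broca-2's actual config loader
import os
from dotenv import load_dotenv  # pip install python-dotenv

# Read key/value pairs from .env in the current directory into os.environ
load_dotenv()

# Variable names taken from the example .env above
for key in ("AGENT_ENDPOINT", "AGENT_API_KEY", "MESSAGE_MODE", "DEBUG_MODE"):
    value = os.getenv(key)
    print(f"{key} = {value if value else 'NOT SET'}")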



Step 4: Install Sanctum Web Chat

Open another terminal and set up the web chat:


# Clone the web chat bridge
git clone https://github.com/sanctumos/broca-web-client.git
cd broca-web-client

# For PHP implementation
cd php/public
php -S localhost:8080

# Or for Flask implementation
cd python
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
python init_db.py
python run.py
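Before moving on, you can confirm that the web client is actually serving pages. A rough reachability check in Python, assuming the http://localhost:8080 address used in the commands above:

# check_webchat.py - rough reachability check, assumes the default port above
import requests

try:
    response = requests.get("http://localhost:8080", timeout=5)
    print(f"Web chat reachable, HTTP {response.status_code}")
except requests.exceptions.ConnectionError:
    print("Web chat is not reachable - is the PHP or Flask server running?")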



Step 5: Test Your Setup

  1. Test the MCP server:
   curl -X POST http://localhost:8000/messages/ \
     -H "Content-Type: application/json" \
     -d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"health","arguments":{}}}'



2. Test the web chat:
   - Open http://localhost:8080 in your browser
   - Send a test message
   - Check if it appears in the Broca-2 logs

3. Test Broca-2:
   # Check queue status
   python -m cli.btool queue list
   
   # Check system health
   python -m cli.btool health check

🎉 Congratulations!

You now have a basic SanctumOS deployment running! Here's what's happening:

  • Sanctum Letta MCP is providing tools and capabilities to AI agents
  • Broca-2 is processing messages and communicating with your AI agent
  • Sanctum Web Chat is providing a web interface for users to chat with your agent

🔧 Next Steps

Explore the System

  1. Check available tools (a Python version of this call appears after the list):
   curl -X POST http://localhost:8000/messages/ \
     -H "Content-Type: application/json" \
     -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'


  2. View message queue:
   cd broca
   python -m cli.btool queue list


  3. Monitor web chat sessions:
   - Open http://localhost:8080/web/admin.php (PHP) or http://localhost:8080/admin (Flask)
   - Enter the admin key when prompted
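As referenced in item 1 above, here is a Python sketch of the tools/list call. The endpoint and JSON-RPC method come from the curl example; the exact shape of the response body may vary between SMCP versions, so the parsing below is defensive and falls back to printing the raw reply:

# list_tools.py - sketch of the tools/list call from item 1 above
import requests

payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
response = requests.post("http://localhost:8000/messages/", json=payload, timeout=10)

# The response shape can vary, so fall back to printing the raw body
try:
    data = response.json()
    tools = data.get("result", {}).get("tools", [])
    for tool in tools:
        print(tool.get("name", tool))
except (ValueError, AttributeError):
    print(response.text)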

Customize Your Setup

  1. Add custom plugins to Sanctum Letta MCP
  2. Configure message processing in Broca-2
  3. Customize the web chat interface
  4. Set up multiple agents for different use cases

Production Deployment

  1. Set up proper authentication and API keys
  2. Configure SSL/TLS for secure communication
  3. Set up monitoring and logging
  4. Configure backup and recovery procedures

🚨 Troubleshooting

Common Issues

Port Conflicts

If you get port conflicts:


# Check what's using the ports
lsof -i :8000
lsof -i :8080

# Use different ports
python smcp/mcp_server.py --port 9000
php -S localhost:8081



Permission Issues

Fix file permissions:


chmod -R 755 ~/sanctum
chown -R $USER:$USER ~/sanctum



Agent Connection Issues

If Broca-2 can't connect to your agent:

  1. Check the AGENT_ENDPOINT and AGENT_API_KEY in .env
  2. Verify your agent is running and accessible (see the reachability sketch after this list)
  3. Use MESSAGE_MODE=echo for testing without an agent
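For point 2, a quick reachability probe can rule out basic networking problems. The sketch below only checks that the endpoint answers HTTP at all; it deliberately sends no credentials, since the authentication scheme depends on your agent API (even a 401/403 response proves the host is reachable):

# check_agent.py - reachability probe only; no auth, since that depends on your agent API
import os
import requests

# Uses the same variable name as the Broca-2 .env file; export it in your shell
# or paste the URL directly in place of the placeholder default
endpoint = os.getenv("AGENT_ENDPOINT", "https://your-agent-api.com")

try:
    response = requests.get(endpoint, timeout=5)
    print(f"Agent endpoint reachable, HTTP {response.status_code}")
except requests.exceptions.RequestException as exc:
    print(f"Could not reach {endpoint}: {exc}")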

Getting Help

  • Check the Troubleshooting Guide
  • Review module-specific documentation
  • Check GitHub issues for known problems
  • Join community discussions

📚 Learn More

Now that you have SanctumOS running, dig into the module-specific documentation for Sanctum Letta MCP, Broca-2, and the web chat bridge to go deeper.

đŸŽ¯ What's Next?

You're ready to start building with SanctumOS! Here are some ideas:

  1. Create a custom plugin for your specific use case
  2. Set up multiple agents for different tasks
  3. Integrate with external services using the MCP protocol
  4. Build a custom web interface for your agents
  5. Deploy to production with proper security and monitoring

SanctumOS Quick Start Guide - Get your AI agent communication platform running in minutes!