Letta on Your Phone
Yes, really. The sanctumos/installer now includes documentation and scripts for running Letta on Android via Termux and proot. Letta is the agentic AI framework that grew out of the MemGPT research line: it gives LLMs persistent memory, tool calling, and orchestration instead of stateless chat completions. It's not fast on a phone, and it's not for production, but it proves that your personal AI agent can live in your pocket without depending on cloud infrastructure.
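The general shape of a Termux-plus-proot setup looks something like the following. This is an illustrative sketch, not the installer's actual script: the package names and commands are the standard Termux/proot-distro ones, and the assumption that Letta installs cleanly via `pip install letta` inside a proot Debian guest is mine.

```shell
# Sketch only: steps assumed, not taken verbatim from sanctumos/installer.
# On the Termux side, install a proot-managed Linux distribution:
pkg update && pkg install proot-distro
proot-distro install debian
proot-distro login debian

# Inside the proot Debian guest, set up a virtualenv and install Letta:
apt update && apt install -y python3 python3-venv
python3 -m venv ~/letta-venv
. ~/letta-venv/bin/activate
pip install letta
letta server   # expect slow startup and inference on phone hardware
```

proot avoids the need for root or a real container runtime, which is exactly why this works on stock Android where Docker cannot.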
Zero-Docker Local Bootstrap
The bigger story is the new local Letta bootstrap script. No Docker, no containers: just a Python virtualenv, a Cloudflare Tunnel for HTTPS, and a Venice AI-compatible .env file. Venice is a privacy-first AI inference provider (it doesn't retain your prompts and supports open-weight models), which makes it a natural pairing with a self-hosted agent stack. The bootstrap:
- Creates a clean `.env` with Venice AI API endpoints (no dummy keys that would confuse first-time users).
- Installs `cloudflared` and sets up a systemd service for the tunnel.
- Configures CORS headers for the Agent Development Environment (ADE).
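Concretely, the first two steps produce files along these lines. Both fragments are hypothetical sketches: the `.env` variable names, the endpoint URL, and the tunnel name `letta` are placeholders I've assumed, not values copied from the bootstrap script.

```ini
# Hypothetical .env fragment for a Venice AI-compatible setup.
# Variable names and URL are illustrative assumptions.
VENICE_API_KEY=your-key-here
VENICE_BASE_URL=https://api.venice.ai/api/v1
```

And a minimal systemd unit of the kind the bootstrap would install for the tunnel:

```ini
# Hypothetical unit file, e.g. /etc/systemd/system/cloudflared-letta.service.
# "letta" is a placeholder tunnel name.
[Unit]
Description=Cloudflare Tunnel for Letta
After=network-online.target

[Service]
ExecStart=/usr/local/bin/cloudflared tunnel run letta
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

The systemd dependency is why this path targets conventional Linux hosts; on the Termux/proot route above there is no systemd, and the tunnel would have to be run by hand.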
This matters because Docker is a non-starter on many edge devices. If SanctumOS is going to be a personal AI operating system, it needs to run where people actually compute (laptops, phones, Raspberry Pis), not just in containerized cloud environments.