Quickstart · FAQ · GitHub · Community
If Delphi.ai is the product, OpenDelphi is the self-hosted version you actually own
OpenDelphi lets experts — teachers, professors, coaches, founders — turn their real knowledge into a public digital mind. Feed it your meetings, emails, lectures, and documents. It builds a compounding knowledge base that sounds like you. Share it with students, clients, or the world.
It looks like a Delphi profile page — but under the hood it's powered locally by Hermes Agent, an LLM-compiling wiki, and Vercel AI SDK streaming. Your knowledge stays on your machine until you choose to publish.
Build your mind locally. Share it globally.
| Step | Stage | What happens |
|---|---|---|
| 01 | Feed your mind | Drop in lecture notes, office hour recordings, emails, meeting transcripts. |
| 02 | It learns | Hermes Agent reads, synthesizes, and builds a compounding knowledge wiki. |
| 03 | Share with the world | One command deploys your public Delphi-style profile to Vercel. Students chat with your mind 24/7. |
For students: Add your professor's mind as an MCP tool in Claude or Cursor — their knowledge becomes a native tool in your AI sessions.
**Powered by:** Hermes Agent · Vercel AI SDK · Neon Postgres · Next.js · MCP
If your agent can read markdown, it already knows how to use OpenDelphi.
- ✅ You are a teacher or professor who wants students to get answers at 2am without emailing you
- ✅ You want a self-hosted alternative to Delphi.ai that you fully own
- ✅ You have years of knowledge in meetings, emails, and lectures that no one can access
- ✅ You want your expertise to compound over time — not just answer one-off questions
- ✅ You want students to plug your knowledge into Claude or Cursor as a native MCP tool
- ✅ You want a Delphi-style public profile without the $79–$2,499/month subscription
- ✅ You want your knowledge to stay on your machine until you choose to publish it
- **Not just RAG.** Hermes Agent builds a persistent wiki that gets smarter every time you add something — cross-references, contradictions flagged, synthesis updated.
- **Pixel-faithful public profile page.** Squircle avatar, streaming chat, pinned questions, topic chips, MCP button. Students feel at home immediately.
- **Native MCP tools.** Every deployed mind exposes an MCP server. Students add one line to Claude Code or Cursor and the professor's knowledge becomes a native tool in their AI sessions.
- **Local-first.** Runs entirely on your machine with PGLite — no server, no account, no subscription. Publish to Vercel when you're ready. Your data stays yours.
- **Two-minute setup.** `npx opendelphi start` sets up locally in 2 minutes. One click deploys to Vercel with Neon auto-provisioned. Custom domain ready.
- **Ingest anything.** PDFs, Word docs, audio, video, URLs, meeting transcripts, Google Calendar, Gmail. Drop it in and Hermes handles the rest.
- **Cited answers.** Every answer links back to the source — office hours transcript, lecture notes, email thread. Students trust it because they can verify it.
- **Private preview.** Chat with your own mind privately before publishing. Review what it knows. Edit compiled knowledge directly. You control what goes public.
| Without OpenDelphi | With OpenDelphi |
|---|---|
| ❌ Students email the same questions at midnight. You answer them again tomorrow morning. | ✅ Your digital mind answers 24/7. You wake up to insights, not a flooded inbox. |
| ❌ Years of office hours, lectures, and emails are locked in your head or scattered across files. | ✅ Everything compiles into a searchable, citable knowledge base that improves over time. |
| ❌ Delphi.ai costs $79–$2,499/month and owns your data on their servers. | ✅ OpenDelphi is free, MIT-licensed, self-hosted. Your knowledge, your server. |
| ❌ Students using Claude or Cursor have no access to your actual teaching approach. | ✅ One MCP endpoint and your expertise is a native tool in their AI sessions. |
| ❌ A Q&A chatbot answers from stale chunks. It can't synthesize across everything you know. | ✅ The LLM-Wiki pattern compiles knowledge once and keeps it current — cross-references already built. |
| ❌ You'd need to hire engineers to build and maintain a custom solution. | ✅ npx opendelphi start and you're running. Deploy to Vercel in 2 minutes. |
| What it's not | What it is instead |
|---|---|
| Not a generic chatbot. | It's your mind specifically — grounded in your actual content, citing your actual sources. |
| Not another RAG wrapper. | The LLM-Wiki pattern compiles knowledge into a persistent, cross-referenced wiki. Not chunk retrieval. |
| Not a CMS. | You don't write the wiki. Hermes Agent writes it from your real sources. You just add things. |
| Not a multi-tenant SaaS. | One deployment per expert. You own the server, the data, the domain. |
| Not a course platform. | No video hosting, no enrolment, no payments. Just your knowledge, accessible. |
| Not for teams (yet). | Single expert per deployment for now. Multi-mind is on the roadmap. |
Open source. Self-hosted. No account required.
Option 1 — Install script (recommended)
```bash
curl -fsSL https://opendelphi-opal.vercel.app/install.sh | bash
```

Checks dependencies, prompts for your name + API key, writes `.env.local`, and starts the server. Opens your browser automatically.
Option 2 — Docker
```bash
git clone https://github.com/moralespanitz/opendelphi.git
cd opendelphi
cp .env.example .env.docker
# Edit .env.docker — add your GOOGLE_GENERATIVE_AI_API_KEY and profile
docker compose up
```

Open http://localhost:3000.
Option 3 — Manual
```bash
git clone https://github.com/moralespanitz/opendelphi.git
cd opendelphi
bun install
cp .env.example .env.local  # add your API key and profile
bun dev
```

Deploy to Vercel for a public URL in 2 minutes.
Requirements: Node.js 18+, Bun 1.x (auto-installed by install script), Google AI Studio API key (free)
Run OpenDelphi in a container — no local Node/Bun setup required.
```bash
# 1. Clone
git clone https://github.com/moralespanitz/opendelphi.git
cd opendelphi

# 2. Configure
cp .env.example .env.docker
# → set GOOGLE_GENERATIVE_AI_API_KEY
# → set OPENDELPHI_NAME, OPENDELPHI_TITLE, OPENDELPHI_USERNAME

# 3. Run
docker compose up
```

The app starts at http://localhost:3000. Your wiki is persisted in a named Docker volume (`opendelphi_data`) — data survives container restarts.
```bash
docker compose down                          # stop
docker compose up --build                    # rebuild after code changes
docker volume rm opendelphi_opendelphi_data  # wipe local database
```

Claude Code:

```bash
claude mcp add prof-chen -t http https://chen.opendelphi.app/api/mcp
```

Cursor (`settings.json`):

```json
{ "prof-chen": { "url": "https://chen.opendelphi.app/api/mcp" } }
```

Tools available: `search_mind`, `ask_mind`, `get_page` — the professor's knowledge is now a native tool in your AI session.
**How is this different from Delphi.ai?** Delphi.ai is a closed SaaS that costs up to $2,499/month and hosts your data on their servers. OpenDelphi is MIT-licensed, self-hosted, and free. You own everything.

**Do I need Hermes Agent?** Hermes Agent is the recommended local brain engine — it handles ingestion, the LLM-Wiki skill, and dream cycles. A built-in fallback agent (Vercel AI SDK) is included for users who can't run Hermes.

**What is the LLM-Wiki pattern?** Instead of retrieving chunks at query time (standard RAG), Hermes compiles your sources into a persistent wiki — entity pages, concept pages, cross-references, compiled truth + timeline. Knowledge compounds. See Karpathy's LLM-Wiki for the original pattern.
**How does the MCP endpoint work?** Every deployed OpenDelphi instance exposes `/api/mcp` — a stateless MCP HTTP server built with @modelcontextprotocol/server. Students add it to any MCP-compatible client (Claude, Cursor, Windsurf) and the professor's knowledge becomes a native tool.
**Is voice supported?** No. Voice is out of scope for the current version.

**Can I use a custom domain?** Yes. Point any domain at your Vercel deployment.

**Is my data private?** Everything stays on your local machine (PGLite) until you explicitly click "Update public page." The public Vercel deployment only receives what you publish.
```bash
bun dev          # Full dev (web + cli, watch mode)
bun dev:web      # Next.js only
bun build        # Build all packages
bun typecheck    # Type checking
bun test         # Run tests
bun db:generate  # Generate Drizzle migration
bun db:migrate   # Apply migrations
bun sync         # Sync local PGLite → Neon
```

Repo structure:
```
opendelphi/
├── apps/
│   ├── web/             Next.js app (local + Vercel, same code)
│   └── cli/             npx opendelphi CLI
├── packages/
│   ├── wiki-engine/     DB abstraction (PGLite ↔ Neon)
│   └── hermes-bridge/   Hermes Agent stdio MCP wrapper
└── docs/
```
- ⚪ Core wiki-engine (PGLite + Neon, Drizzle, pgvector)
- ⚪ Hermes Agent bridge (stdio MCP locally, FastAPI remotely)
- ⚪ Local Next.js dashboard (normie-first, 3-step setup)
- ⚪ Public profile page (Delphi UI clone)
- ⚪ Streaming chat (`/api/chat` via Vercel AI SDK)
- ⚪ MCP server endpoint (`/api/mcp`)
- ⚪ Publish / sync flow (PGLite → Neon diff + SSE progress)
- ⚪ Source ingestion UI (drag-drop, URL, integrations)
- ⚪ Google Calendar integration
- ⚪ Gmail integration
- ⚪ Meeting transcript ingestion (Zoom, Circleback)
- ⚪ Multi-mind per deployment
- ⚪ Desktop app (Electron wrapper)
- ⚪ Public discover directory
- ⚪ Twitter/X integration
- ⚪ Monetization / paywalled sessions
We welcome contributions — issues, PRs, and feature requests on GitHub.
- GitHub Issues — bugs and feature requests
- GitHub Discussions — ideas and RFCs
MIT © 2026 OpenDelphi
| Variable | Required | Description |
|---|---|---|
| `ANTHROPIC_API_KEY` | ✓ | Anthropic API key — powers chat and query expansion |
| `OPENAI_API_KEY` | recommended | OpenAI API key — powers semantic search embeddings |
| `OPENDELPHI_NAME` | ✓ | Your display name, e.g. Dr. Jane Chen |
| `OPENDELPHI_TITLE` | recommended | Your title, e.g. Professor of Computer Science |
| `DATABASE_URL` | ✓ (Vercel) | Neon Postgres connection string |
| `NEON_DATABASE_URL` | ✓ (Vercel) | Same as `DATABASE_URL`, for publish sync |
| `BLOB_READ_WRITE_TOKEN` | ✓ (Vercel) | Vercel Blob token for file uploads |
| `OPENDELPHI_CLASS_PASSCODE` | optional | Require a passcode before students can chat |
| `OPENDELPHI_CLASS_PASSCODE_HINT` | optional | Hint shown on the passcode prompt |
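As an example, a minimal `.env.local` for local development might look like this — all values are placeholders:

```bash
# Minimal local setup — placeholder values only
ANTHROPIC_API_KEY=sk-ant-your-key-here
OPENAI_API_KEY=sk-your-key-here
OPENDELPHI_NAME="Dr. Jane Chen"
OPENDELPHI_TITLE="Professor of Computer Science"

# Optional: gate the public chat behind a passcode
OPENDELPHI_CLASS_PASSCODE=cs101
OPENDELPHI_CLASS_PASSCODE_HINT="Hint: the course code"
```

The Vercel-only variables (`DATABASE_URL`, `NEON_DATABASE_URL`, `BLOB_READ_WRITE_TOKEN`) are not needed until you publish.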