$ cd ../integrations/
⚡ Productivity · Offline-first · v1.5+
$ cat obsidian-integration.md

openclaw.integrate('obsidian')

/** "What did I write about RAG last month?" — searches 2000 notes in 2 seconds */

// Obsidian vs Notion — when to choose Obsidian
Choose Obsidian if:
• You want 100% offline notes
• You have an existing large vault (1000+ notes)
• You care about data ownership (plain Markdown)
• You want backlinks and graph view queries
Choose Notion if:
• You need team collaboration
• You rely on databases and views
• You want web-based access
how_it_works.md

πŸ— Architecture: Local Vector Search

1. OpenClaw indexes your vault on startup — reads all .md files, generates embeddings, and stores them in a local vector DB (no cloud).
2. Query via Telegram or any chat interface — semantic search finds the most relevant notes for your question.
3. The LLM synthesizes an answer with citations — it references the source notes and links back to your vault.
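The index-then-search loop above can be sketched in a few lines. The real integration uses the `nomic-embed-text` model and a persistent vector DB; here a toy hashed bag-of-words embedding and an in-memory list stand in so the example runs with no dependencies, and the note titles and contents are invented for illustration.

```python
# Minimal sketch of "index on startup, then semantic search per query".
# Toy embedding (hashed bag of words) replaces the real embedding model.
import math
import re
import zlib

DIM = 512

def embed(text: str) -> list[float]:
    """Toy embedding: hash each token into a fixed-size, L2-normalized vector."""
    vec = [0.0] * DIM
    for token in re.findall(r"[a-z0-9]+", text.lower()):
        vec[zlib.crc32(token.encode()) % DIM] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

# 1. Index: read notes, embed title + body, store (a list as the "vector DB").
notes = {
    "RAG vs Fine-tuning.md": "RAG is better for factual recall than fine-tuning",
    "Vector DB comparison.md": "Chroma vs Qdrant vs Weaviate benchmarks",
    "Gardening log.md": "planted tomatoes and basil in the raised bed",
}
index = [(name, embed(f"{name} {body}")) for name, body in notes.items()]

# 2. Query: embed the question and rank notes by cosine similarity.
def search(query: str, top_k: int = 2) -> list[str]:
    qv = embed(query)
    ranked = sorted(index, key=lambda item: cosine(qv, item[1]), reverse=True)
    return [name for name, _ in ranked[:top_k]]

print(search("What did I write about RAG?"))
```

Swapping the toy `embed` for a real embedding model and the list for a vector DB gives the architecture described above; the query path stays the same.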
config.yaml

βš™οΈ Configuration

# openclaw/config.yaml
integrations:
  obsidian:
    enabled: true
    vault_path: "~/Documents/MyVault"
    index_on_startup: true
    watch_changes: true
    embedding_model: "nomic-embed-text"
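One detail worth noting about the config: `vault_path` uses `~`, which must be expanded before the indexer can open the vault. A hypothetical loader might handle it like this; the keys come from the fragment above, but the validation rules are illustrative assumptions, not OpenClaw's actual loader.

```python
# Sketch of loading the obsidian section of config.yaml (keys as shown above).
import os

raw = {
    "enabled": True,
    "vault_path": "~/Documents/MyVault",
    "index_on_startup": True,
    "watch_changes": True,
    "embedding_model": "nomic-embed-text",
}

def load_obsidian_config(section: dict) -> dict:
    cfg = dict(section)
    missing = {"enabled", "vault_path"} - cfg.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    # Expand "~" so the indexer gets an absolute path it can actually read.
    cfg["vault_path"] = os.path.expanduser(cfg["vault_path"])
    return cfg

print(load_obsidian_config(raw)["vault_path"])
```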
example.log

📱 Example Queries

You
What did I write about RAG architectures?
OpenClaw
Found 4 relevant notes:
• RAG vs Fine-tuning (2025-11-12) — argues RAG is better for factual recall
• Vector DB comparison (2025-10-03) — Chroma vs Qdrant vs Weaviate
• My RAG implementation notes (2025-12-01) — code snippets included
• LLM limitations for knowledge (2026-01-15)
You
Create a new note: 'OpenClaw Obsidian demo', link it to RAG vs Fine-tuning
OpenClaw
✓ Created: OpenClaw Obsidian demo.md
Added [[RAG vs Fine-tuning]] backlink

❓ FAQ

Q1. Does it modify my vault files?

Only if you enable write mode. By default, it reads your vault in read-only mode for search and context.

Q2. Does it work with Obsidian plugins?

Yes. It integrates via the Local REST API plugin. Dataview queries are also supported.
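For reference, the Local REST API plugin exposes the vault over a local HTTP server (by default HTTPS on port 27124, authenticated with a bearer API key from the plugin's settings). The sketch below builds a `PUT /vault/<path>` request, which the plugin uses to create or replace a note; the port, endpoint shape, API key, and note path here are assumed defaults and placeholders, so check your plugin settings before sending anything.

```python
# Sketch: build a request against the Obsidian Local REST API plugin.
# API_KEY is a placeholder; the real key is shown in the plugin's settings.
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"
BASE = "https://127.0.0.1:27124"  # plugin's default HTTPS port (assumption)

def build_put(note_path: str, body: str) -> urllib.request.Request:
    # PUT /vault/<path> creates or replaces the note at that vault path.
    return urllib.request.Request(
        f"{BASE}/vault/{urllib.parse.quote(note_path)}",
        data=body.encode("utf-8"),
        method="PUT",
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "text/markdown",
        },
    )

req = build_put("OpenClaw Obsidian demo.md", "# Demo\n\n[[RAG vs Fine-tuning]]\n")
print(req.get_method(), req.full_url)
# Send with urllib.request.urlopen(req) while Obsidian is running.
```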

Q3. Is my vault data sent to the cloud?

Never. OpenClaw reads your local vault files directly. With local LLMs, nothing leaves your machine.