$ cd ../integrations/
Integration · 20 min read

VS Code / Cursor Integration

Bring your powerful local OpenClaw orchestration engine directly into your coding environment. Full workspace context, autonomous file editing, and terminal execution, all without the latency of cloud APIs.

why_ide.md

Why Integrate OpenClaw into your IDE?

While chatting with OpenClaw in a separate web UI is helpful for general tasks, context switching kills developer flow. By piping your local OpenClaw instance directly into VS Code or Cursor, the LLM gains instant, deep access to your workspace. It can read open tabs, scan your project directory, execute tests in the integrated terminal, and directly suggest inline code diffs that you can apply with a single click. This transforms the AI from a chatbot into an active pair programmer.

install_bridge.sh

Method 1: The Official OpenClaw Bridge (Recommended)

The most seamless native experience is via the official OpenClaw Bridge extension on the VS Code Marketplace. It supports both VS Code and VSCodium.

  1. Open VS Code and navigate to the Extensions view (Ctrl+Shift+X or Cmd+Shift+X).
  2. Search for 'OpenClaw Bridge' in the marketplace.
  3. Click Install. The extension will automatically detect whether the OpenClaw daemon is running on localhost:11434.
  4. Click the crab icon in the Activity Bar to open the chat panel and start coding.
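If the extension reports that it cannot find the daemon, you can confirm the port is open yourself before digging into extension settings. A minimal sketch, assuming the default daemon port 11434 mentioned in step 3 (the helper function is illustrative, not part of the extension):

```python
import socket

def daemon_reachable(host="127.0.0.1", port=11434, timeout=1.0):
    """Return True if something is accepting TCP connections on the daemon port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused or timed out: nothing is listening.
        return False

if not daemon_reachable():
    print("Daemon not reachable; start it with `openclaw start`.")
```

A TCP connect only proves a process is listening; the extension performs its own handshake on top of this.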
configure_cline.yaml

Method 2: Using Cline / RooCode (OpenAI Proxy)

If you prefer popular agent extensions like Cline, RooCode, or Continue.dev, you can easily point them to your OpenClaw instance by utilizing its built-in OpenAI-compatible API proxy.

  • API Provider: Select 'OpenAI Compatible' or 'Custom'
  • Base URL: http://127.0.0.1:11434/v1 (The local OpenClaw router port)
  • Model Name: openclaw-coordinator (or map directly to your local model, e.g., qwen2.5-coder:32b)
  • API Key: openclaw-local-dev (Authentication is disabled for localhost by default)
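To sanity-check these settings outside the IDE, you can target the proxy's OpenAI-compatible endpoint directly. A minimal sketch using only the Python standard library, reusing the base URL, model name, and placeholder key from the list above; the `/chat/completions` path follows the standard OpenAI API convention and is an assumption, not something this page specifies:

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:11434/v1"   # local OpenClaw router port
API_KEY = "openclaw-local-dev"           # placeholder; localhost auth is disabled

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request (constructed, not sent)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )

req = build_chat_request("openclaw-coordinator", "Summarize this repo.")
# Send with urllib.request.urlopen(req) once the daemon is running.
```

Extensions like Cline and Continue.dev emit request bodies shaped exactly like this, which is why the generic 'OpenAI Compatible' provider setting works against the local proxy.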
filesystem_mcp.json

Deep Context via File System MCP

To give the coding agent true autonomy, it needs to read and write files reliably. Ensure the FileSystem Model Context Protocol (MCP) server is enabled and mapped to your workspace in your ~/.openclaw/config.yaml; the mcpServers block uses the standard JSON-style MCP format:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/workspace"]
    }
  }
}
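A malformed mcpServers entry often fails silently, so it is worth checking that the block parses and the workspace path exists before restarting the daemon. A minimal sketch mirroring the key names from the snippet above (inlined here for illustration; point it at your real config file):

```python
import json
import os

# The mcpServers block from the config above, inlined for illustration.
config = json.loads("""
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/workspace"]
    }
  }
}
""")

fs = config["mcpServers"]["filesystem"]
workspace = fs["args"][-1]  # by convention, the last arg is the workspace root
assert fs["command"] == "npx", "filesystem server should launch via npx"
if not os.path.isdir(workspace):
    print(f"Warning: workspace path {workspace!r} does not exist yet.")
```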
best_practices.md

Best Practices & Optimization

  • ★ Use a .openclawignore file: Essential for performance. Prevent the agent from getting lost reading massive node_modules, .git, or build/ directories, which will exhaust the context window immediately.
  • ★ Pin contexts: When initiating a major cross-file refactor, explicitly mention the 3-4 relevant files. This prevents the agent from falling back to a broad, slow RAG search.
  • ★ Enable terminal access: Grant the extension permission to run commands. The agent can then automatically run `npm run build` or `pytest` to self-verify its code before presenting it to you.
  • ★ System prompts: Customize the system prompt in the extension settings to enforce your project's specific coding guidelines (e.g., 'Always use functional components and strict TypeScript').
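A starting point for the ignore file recommended above might look like this (gitignore-style glob patterns are an assumption; check the extension docs for the exact syntax it supports):

```
# .openclawignore - keep bulky, low-signal paths out of the context window
node_modules/
.git/
build/
dist/
coverage/
*.lock
```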
troubleshooting.log

Troubleshooting Connection Issues

ERROR: ECONNREFUSED 127.0.0.1:11434
$ The OpenClaw daemon is not running. Start it in your terminal via `openclaw start`.
ERROR: Model XYZ not found
$ You referenced a model in the extension that hasn't been pulled locally. Run `openclaw pull <model_name>`.
ERROR: Agent overwrites entire files
$ Ensure you are using an extension that supports diff-based editing (like Cline) rather than full-file replacement, and that your local model is a coding-specific variant (e.g., qwen2.5-coder).

❓ FAQ

Q1. Is it free?

Yes. The VS Code extension is free and open source. You provide your own LLM (local or API).

Q2. Does it work offline?

Yes, with local models via Ollama. No internet required for code assistance.

Q3. How does it compare to GitHub Copilot?

OpenClaw is self-hosted and uses your choice of model. It offers more customization but requires setup. Copilot is plug-and-play.