AI Editor Setup

Connect your AI coding assistant to the OneSource documentation using MCP (Model Context Protocol). This gives your AI direct access to the full OneSource Web3 API reference, so it can answer questions and generate accurate code without you copy-pasting docs.

Prerequisites

MCP documentation serving is powered by mcpdoc, which requires uv (a fast Python package manager). Install it if you don't have it:

macOS / Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

Windows (PowerShell):

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

You do not need to install mcpdoc manually — your editor will run it automatically via uvx.
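To confirm the prerequisite is in place, you can check that uv is on your PATH (uvx is installed alongside it):

```shell
# Check for uv; uvx ships with it. Prints the version if found.
if command -v uv >/dev/null 2>&1; then
  uv --version
else
  echo "uv not found - run the install command for your platform above"
fi
```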

Claude Code

If you've cloned the OneSource developer docs repo, MCP is already configured. The .mcp.json file in the project root registers the server automatically — no setup needed.

User-level

To make the OneSource docs available in all Claude Code sessions, run:

claude mcp add-json onesource-docs '{"type":"stdio","command":"uvx","args":["--from","mcpdoc","mcpdoc","--urls","OneSourceDocs:https://docs.onesource.io/llms.txt","--follow-redirects"]}'
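After running the command, you can confirm the server was registered. This sketch assumes the claude CLI is on your PATH and falls back to a note if it isn't:

```shell
# List registered MCP servers; "onesource-docs" should appear.
if command -v claude >/dev/null 2>&1; then
  claude mcp list
else
  echo "claude CLI not found - is Claude Code installed?"
fi
```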

Cursor

Add the following to .cursor/mcp.json in your project root (create the file if it doesn't exist):

{
  "mcpServers": {
    "onesource-docs": {
      "command": "uvx",
      "args": [
        "--from", "mcpdoc", "mcpdoc",
        "--urls", "OneSourceDocs:https://docs.onesource.io/llms.txt",
        "--follow-redirects"
      ]
    }
  }
}
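If Cursor doesn't pick up the server, a malformed config file is the usual culprit. The snippet below writes the file from the shell and validates it with Python's standard-library JSON parser (python3 on your PATH is assumed):

```shell
# Create .cursor/mcp.json with the config above, then verify it parses.
mkdir -p .cursor
cat > .cursor/mcp.json <<'EOF'
{
  "mcpServers": {
    "onesource-docs": {
      "command": "uvx",
      "args": [
        "--from", "mcpdoc", "mcpdoc",
        "--urls", "OneSourceDocs:https://docs.onesource.io/llms.txt",
        "--follow-redirects"
      ]
    }
  }
}
EOF
python3 -m json.tool .cursor/mcp.json > /dev/null && echo "mcp.json is valid JSON"
```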

VS Code / Continue

Add the following to your .vscode/mcp.json (create the file if it doesn't exist):

{
  "servers": {
    "onesource-docs": {
      "type": "stdio",
      "command": "uvx",
      "args": [
        "--from", "mcpdoc", "mcpdoc",
        "--urls", "OneSourceDocs:https://docs.onesource.io/llms.txt",
        "--follow-redirects"
      ]
    }
  }
}

Windsurf

Add the following to ~/.codeium/windsurf/mcp_config.json (create the file if it doesn't exist):

{
  "mcpServers": {
    "onesource-docs": {
      "command": "uvx",
      "args": [
        "--from", "mcpdoc", "mcpdoc",
        "--urls", "OneSourceDocs:https://docs.onesource.io/llms.txt",
        "--follow-redirects"
      ]
    }
  }
}

Direct access

If your tool doesn't support MCP, you can access the documentation directly:

  • llms.txt (https://docs.onesource.io/llms.txt) — Index of all documentation pages (lightweight, suitable for discovery)
  • llms-full.txt — Complete documentation in a single file (suitable for ingestion into context windows)

Paste either URL into your AI tool's context or download the file for local use.
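For local use, both files can be fetched with curl. The llms.txt URL matches the one used in the configs above; the llms-full.txt URL is an assumption that it follows the same pattern on the same host:

```shell
# Download the docs for offline use; fall back to a note if the
# host is unreachable.
for f in llms.txt llms-full.txt; do
  curl -LsSf -o "$f" "https://docs.onesource.io/$f" \
    && echo "fetched $f" \
    || echo "could not fetch $f - check network access"
done
```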