
# Nanobot - Build MCP Agents

> [!WARNING]
> This project is under heavy development and is moving away from its original design and intent. Expect significant breaking changes, architectural shifts, and evolving APIs.
Nanobot enables building agents with MCP and MCP-UI by providing a flexible MCP host.
While existing applications like VSCode, Claude, Cursor, ChatGPT, and Goose all include an MCP host,
Nanobot is designed to be a standalone, open-source MCP host that can be easily deployed or integrated into
your applications. You can use Nanobot to create your own dedicated MCP and MCP-UI powered chatbot.
## What is an MCP Host?

An MCP host is the service that combines MCP servers with an LLM and context to present an agent experience to a consumer. The primary experience today is a chat interface, but it can be many other interfaces such as voice, SMS, e-mail, AR/VR, Slack, MCP, or any other interface that can be used to interact with an agent.

## Examples

Here are some examples of Nanobots in action:
## Installation

Nanobot can be installed via Homebrew:

```shell
brew install obot-platform/tap/nanobot
```

This will give you the `nanobot` CLI, which you can use to run and manage your MCP host.
## Getting Started

### Configuration

Nanobot supports two configuration formats:

- **Single File Configuration** - a `nanobot.yaml` file
- **Directory-Based Configuration** - a directory with `.md` agent files
### Single File Configuration

Nanobot supports the following providers:

- OpenAI (e.g. `gpt-4`)
- Anthropic (e.g. `claude-3`)

To use them, set the corresponding API key:

```shell
# For OpenAI models
export OPENAI_API_KEY=sk-...
# For Anthropic models
export ANTHROPIC_API_KEY=sk-ant-...
```
Nanobot automatically selects the correct provider based on the model specified.
Create a configuration file (e.g. `nanobot.yaml`) that defines your agents and MCP servers. Example:

```yaml
agents:
  dealer:
    name: Blackjack Dealer
    model: gpt-4.1
    mcpServers: blackjackmcp
mcpServers:
  blackjackmcp:
    url: https://blackjack.nanobot.ai/mcp
```
Start Nanobot with:

```shell
nanobot run ./nanobot.yaml
```

The UI will be available at http://localhost:8080.
### Directory-Based Configuration
You can organize your configuration as a directory with a top-level nanobot.yaml for shared settings and individual .md files for each agent.
Directory structure:

```
my-config/
├── nanobot.yaml   # Shared config: mcpServers, llmProviders, env, etc.
└── agents/        # Agent definitions directory
    ├── main.md    # Main agent (auto-set as entrypoint)
    └── helper.md  # Additional agent
```
### LLM Providers

`openai` and `anthropic` are built-in providers — set `OPENAI_API_KEY` or `ANTHROPIC_API_KEY` and they work with no additional config. Use the `{provider}/{model}` format in the `model` field to select a provider.
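For example, with the built-in providers enabled via their environment variables, an agent can pin a specific provider in its `model` field. A minimal sketch (the agent names and model IDs here are illustrative):

```yaml
agents:
  researcher:
    name: Researcher
    # "anthropic/" prefix routes to the built-in Anthropic provider
    model: anthropic/claude-3-7-sonnet-latest
  summarizer:
    name: Summarizer
    # "openai/" prefix routes to the built-in OpenAI provider
    model: openai/gpt-4.1
```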
Additional providers (Azure, Bedrock, Ollama, etc.) can be configured in `nanobot.yaml` under `llmProviders`. Each entry specifies a `dialect` (the API protocol), credentials, and base URL. The `dialect` field accepts:
| Dialect | Description |
|---------|-------------|
| `OpenAIResponses` | OpenAI Responses API (default) |
| `OpenAIChatCompletions` | OpenAI Chat Completions API |
| `AnthropicMessages` | Anthropic Messages API |
| `OpenResponses` | Generic OpenResponses-compatible endpoint |
Example `llmProviders` config:

```yaml
llmProviders:
  # Built-in providers — shown here for reference or to override baseURL/headers
  openai:
    dialect: OpenAIResponses
    apiKey: ${OPENAI_API_KEY}
    baseURL: ${OPENAI_BASE_URL} # optional, default: https://api.openai.com/v1
  anthropic:
    dialect: AnthropicMessages
    apiKey: ${ANTHROPIC_API_KEY}
    baseURL: ${ANTHROPIC_BASE_URL} # optional, default: https://api.anthropic.com/v1
  # Custom providers pointing at any compatible endpoint
  azureOpenAI:
    dialect: OpenAIResponses
    apiKey: ${AZURE_API_KEY}
    baseURL: https://<your-resource>.cognitiveservices.azure.com/openai/v1
  azureAnthropic:
    dialect: AnthropicMessages
    apiKey: ${AZURE_ANTHROPIC_KEY}
    baseURL: https://<your-resource>.services.ai.azure.com/anthropic/v1
  bedrockOpenAI:
    dialect: OpenAIResponses
    apiKey: ${BEDROCK_API_KEY}
    baseURL: https://bedrock-mantle.us-east-1.api.aws/v1
  ollama:
    dialect: OpenResponses
    baseURL: http://localhost:11434/v1
```
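Once a custom provider is defined, agents select it with the same `{provider}/{model}` syntax, where the provider part is the key under `llmProviders`. A sketch assuming the `ollama` entry above and a locally pulled model (the `llama3` model name is an assumption):

```yaml
agents:
  localAssistant:
    name: Local Assistant
    # Routes requests through the "ollama" entry in llmProviders
    model: ollama/llama3
```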
`nanobot.yaml` defines shared resources available to all agents:

```yaml
llmProviders:
  ollama:
    dialect: OpenResponses
    baseURL: http://localhost:11434/v1
mcpServers:
  store:
    url: https://example.com/mcp
    headers:
      Authorization: Bearer ${MY_TOKEN}
```
Agent File Format (agents/main.md):
---
name: Shopping Assistant
model: anthropic/claude-3-7-sonnet-latest
mcpServers:
- store
temperature: 0.7
---
You are a helpful shopping assistant.
Help users find products and answer their questions.
The YAML front-matter supports all agent configuration fields (model, name, mcpServers, tools, temperature, etc.), and the markdown body becomes the agent's instructions. Markdown agents take precedence over any agents defined in nanobot.yaml with the same name.
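A second agent file follows the same pattern. A hypothetical `agents/helper.md` matching the directory layout above (all field values here are illustrative):

```markdown
---
name: Product Expert
model: openai/gpt-4.1
mcpServers:
  - store
---
You answer detailed product questions handed off by the main agent.
```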
Usage:

```shell
nanobot run ./my-config/
```

The default config path is `.nanobot/`, so `nanobot run` with no arguments will use that directory if it exists.
Features:

- **Agent directory**: All agent `.md` files must be in the `agents/` subdirectory
- **Auto-entrypoint**: If `agents/main.md` exists, it's automatically set as the default agent
- **Markdown precedence**: Agents defined in `agents/*.md` override same-named agents in `nanobot.yaml`
- **README.md**: Automatically ignored in the `agents/` directory (use it for documentation)
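To illustrate the precedence rule with a hypothetical agent named `helper`: if `nanobot.yaml` and `agents/helper.md` both define it, the markdown version is used (the values below are illustrative):

```yaml
# my-config/nanobot.yaml
agents:
  helper:
    name: Helper (YAML)  # ignored at runtime: agents/helper.md defines "helper" too
    model: openai/gpt-4.1
```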
**Deprecated:** `mcp-servers.yaml` and `mcp-servers.json` are still supported for backwards compatibility but are deprecated. Define `mcpServers` in `nanobot.yaml` instead.
See the directory-config example for a complete working example.
## Development & Contribution

Contributions are welcome! Nanobot is still in alpha, so expect active development and rapid changes.

### Build from Source

```shell
make
```
### Working on the UI

The Nanobot UI lives in the `./ui` directory. To develop against it:

1. Remove the old build artifacts:

   ```shell
   rm -rf ./ui/dist
   ```

2. Rebuild the Nanobot binary:

   ```shell
   make
   ```

3. Start the UI in development mode:

   ```shell
   cd ui
   npm run dev
   ```

The UI must be served from port 5173. Nanobot runs on port 8080 and will forward UI requests to :5173.
## Features & Roadmap

Nanobot aims to be a fully compliant MCP Host and support all MCP + MCP-UI features.

| Feature Category | Feature | Status |
|------------------|---------|--------|
| MCP Core | TODO | ✅ Implemented |
| | TODO | 🚧 Partial |
| | TODO | ❌ Not yet |
| | TODO | ✅ Implemented |
| MCP-UI | TODO | 🚧 Partial |
| | TODO | ✅ Implemented |
| | TODO | ❌ Not yet |

✅ = Implemented 🚧 = Partial / WIP ❌ = Not yet ⏳ = Planned
### Roadmap
- Full MCP + MCP-UI compliance
- More robust multi-agent support
- Production-ready UI
- Expanded model provider support
- Expanded authentication and security features
- Frontend integrations (Slack, SMS, email, embedded web agents)
- Easy embedding into existing apps and websites
## License

Nanobot is licensed under the Apache 2.0 License.
## Links