OpenCode School

Lesson 9

Tools

Extend OpenCode with external tools and MCP servers.

When you type a message to an AI, it generates text. That’s it — on its own, it can only produce words. But words aren’t always enough. Sometimes you need the AI to actually do something: look up today’s weather, search the web, read a file, or call an API.

That’s what tools are for. A tool is a function the AI can call when it decides one is needed — it recognizes the situation, picks the right tool, runs it, and uses the result to give you a better answer. This is sometimes called “function calling” or “tool use.”
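Conceptually, the loop is simple: the model emits a structured request naming a tool and its arguments, the host runs the matching function, and the result goes back into the conversation. Here is a minimal sketch in Python — the `get_weather` function, the request shape, and the registry are made-up illustrations of the pattern, not OpenCode's actual API:

```python
# A sketch of the "function calling" loop. Real AI providers wrap
# this pattern in their APIs; the shapes here are illustrative only.

def get_weather(city: str) -> str:
    """A tool: an ordinary function the model can ask the host to run."""
    return f"Sunny in {city}"  # stand-in for a real weather API call

# The host keeps a registry of tools it has told the model about.
TOOLS = {"get_weather": get_weather}

def handle_model_request(request: dict) -> str:
    """Run the tool the model asked for and return the result to it."""
    name = request["tool"]
    args = request["arguments"]
    return TOOLS[name](**args)

# The model decides a tool is needed and emits a structured request
# instead of plain text:
model_request = {"tool": "get_weather", "arguments": {"city": "Tokyo"}}
result = handle_model_request(model_request)
print(result)  # the host feeds this back into the model's context
```

The key point is that the model never executes anything itself — it only describes the call it wants, and the host (OpenCode) does the running.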

OpenCode comes with built-in tools for things like reading files, editing code, and running shell commands. But you can extend it with external tools too, using a standard called MCP (Model Context Protocol). An MCP server is a small program that exposes tools to the AI — things like searching your calendar, querying a database, or fetching live data from the web. You connect an MCP server to OpenCode once, and those tools become available in every session.

Two kinds of MCP servers

Local servers run as a process on your own machine. OpenCode starts them automatically on launch using a command you specify in your config — typically npx followed by a package name. Local servers are common for tools that need access to your filesystem or local environment.

Remote servers run in the cloud. You connect to them by URL — no local process required. Many services publish remote MCP servers so you can give your AI agent access to their platform without installing anything.
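A remote server entry in your config is just a name, a type, and a URL — something like this sketch (the server name and URL are placeholders, not a real service):

```jsonc
"mcp": {
  "my-remote-server": {
    "type": "remote",
    "url": "https://example.com/mcp"
  }
}
```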

Public vs. authenticated

Public servers require no credentials. You add them to your config and they work immediately.

Token-authenticated servers require an API key. You add the key to your config, usually via an environment variable so it’s not stored in plain text.
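For a remote server that expects a key, the config can pass it as a request header. A hedged sketch — the server name and URL are placeholders, and the `{env:...}` substitution assumes OpenCode's environment-variable syntax so the key itself never appears in the file:

```jsonc
"mcp": {
  "my-api-server": {
    "type": "remote",
    "url": "https://example.com/mcp",
    "headers": {
      // MY_API_KEY is read from your shell environment at startup
      "Authorization": "Bearer {env:MY_API_KEY}"
    }
  }
}
```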

OAuth servers require you to log in to the service through your browser. OpenCode handles the OAuth flow automatically — it detects when authentication is needed and prompts you to complete it.

Add a weather MCP server

Let’s install a local MCP server that fetches live weather data from Open-Meteo — a free, public weather API with no account or API key required.

Even though MCP servers are managed in the global OpenCode config, you don’t have to edit the file yourself — you can just ask OpenCode to do it. Try asking it to add the following to your ~/.config/opencode/opencode.jsonc:

"mcp": {
  "open-meteo": {
    "type": "local",
    "command": ["npx", "-y", "-p", "open-meteo-mcp-server", "open-meteo-mcp-server"]
  }
}

This tells OpenCode to start the open-meteo-mcp-server package via npx each time it launches. The server runs locally on your machine and connects to the Open-Meteo API when a weather tool is called.

Running servers via npx requires Node.js 22 or later. If you don’t have it installed, ask OpenCode to help you install it for your operating system.

Check with /mcp

After adding a server, type /mcp in OpenCode to see a list of your configured MCP servers and their connection status. This is a good habit — pop it open after installing a new server to confirm it’s connected and, if needed, authenticated.

Restart OpenCode

MCP servers are connected at startup. Quit and reopen OpenCode Desktop, then use the prompt below to continue.

Try it

Once you’re back, ask OpenCode something like:

What’s the weather forecast for Tokyo this week?

Watch the UI as OpenCode responds — you should see it make a tool call to the weather server, fetch the data, and use the result in its answer. That’s the AI deciding a tool is needed, calling it, and working with what comes back. This is what distinguishes an agent from a chatbot.
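Under the hood, that tool call is a JSON-RPC message defined by the MCP spec. A simplified sketch of the exchange — the tool name and arguments here are illustrative, not the server's actual schema:

```jsonc
// Request from OpenCode to the MCP server:
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_forecast",          // hypothetical tool name
    "arguments": { "city": "Tokyo" }
  }
}

// Response from the server — the text content is what the model
// reads and works into its answer:
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [
      { "type": "text", "text": "Tokyo: 18°C, light rain expected…" }
    ]
  }
}
```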

If you have several tools available, the model might not always pick the one you expect. You can nudge it by mentioning the tool by name — for example, “use the open-meteo MCP server to check the weather in Tokyo.” This makes the model more likely to reach for that specific tool instead of trying to answer from its training data or picking a different tool.

A note on context

Each MCP server you add registers its tools with the model at the start of every session. With many servers enabled, those tool descriptions can take up a meaningful portion of your context window. OpenCode’s docs recommend being selective — only enable the servers you actually use regularly.

Find more MCP servers

The OpenCode MCP docs include ready-to-use examples for services like Sentry, Context7, and Grep. Beyond that, the broader MCP ecosystem has hundreds of servers for GitHub, Notion, Slack, databases, and more — a growing library of ways to give your AI agent new capabilities.