OpenCode MCP Server

Python 3.10+ | MCP 1.2.0+ | License: MIT

An MCP (Model Context Protocol) server that provides seamless integration with OpenCode, the open-source AI coding agent for the terminal.

Features

  • Execute OpenCode Commands: Run any OpenCode CLI command programmatically

  • Session Management: Create, continue, and export coding sessions

  • Model Discovery: List available AI models from all configured providers

  • Async Execution: Non-blocking command execution with timeout handling

  • JSON Lines Parsing: Robust parsing of OpenCode's streaming output format
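
To give a feel for the last point: OpenCode's streaming output is JSON Lines, i.e. one JSON object per line. A minimal parsing sketch, assuming only that each non-empty line is an independent JSON object (the function name and the tolerance for stray non-JSON lines are illustrative, not the server's actual implementation):

```python
import json
from typing import Iterable, Iterator

def parse_json_lines(lines: Iterable[str]) -> Iterator[dict]:
    """Yield one parsed event per JSON line, skipping blanks and stray text."""
    for raw in lines:
        raw = raw.strip()
        if not raw:
            continue
        try:
            yield json.loads(raw)
        except json.JSONDecodeError:
            # Ignore interleaved non-JSON output (e.g. plain log lines)
            # instead of aborting the whole stream.
            continue

# Example: events = list(parse_json_lines(captured_stdout.splitlines()))
```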

Tools Available

| Tool | Description |
| --- | --- |
| execute_opencode_command | Execute any OpenCode CLI command with full flexibility |
| opencode_run | Run OpenCode with a simple prompt message |
| opencode_continue_session | Continue an existing OpenCode session |
| opencode_list_models | List available models, optionally filtered by provider |
| opencode_export_session | Export session data as JSON |
| opencode_get_status | Check OpenCode CLI availability and status |

Installation

Prerequisites

  • Python 3.10+

  • OpenCode CLI installed and configured

  • MCP-compatible client (Claude Desktop, etc.)

Install Dependencies

pip install -r requirements.txt

Configure MCP Client

Add to your MCP client configuration (e.g., ~/.claude.json or Claude Desktop settings):

{ "mcpServers": { "opencode": { "command": "python", "args": ["-m", "src.services.fast_mcp.opencode_server"], "cwd": "/path/to/opencode-mcp" } } }

Usage

Basic Usage

Once configured, the MCP tools are available through your MCP client:

# Run a coding task
opencode_run(message="Create a Python function that calculates fibonacci numbers")

# List available models
opencode_list_models(provider="anthropic")

# Continue a previous session
opencode_continue_session(session_id="abc123", message="Now add unit tests")

# Check status
opencode_get_status()

Tool Parameters

execute_opencode_command

{ "prompt": str, # Required: The prompt/task for OpenCode "model": str, # Optional: Model in provider/model format (e.g., "anthropic/claude-sonnet-4-20250514") "agent": str, # Optional: Agent to use (e.g., "build", "plan") "session": str, # Optional: Session ID to continue "continue_session": bool, # Optional: Whether to continue last session "timeout": int # Optional: Timeout in seconds (default: 300, max: 600) }

opencode_run

{ "message": str, # Required: Message/prompt to send "model": str, # Optional: Model to use "agent": str, # Optional: Agent to use "files": [str], # Optional: Files to attach "timeout": int # Optional: Timeout in seconds }

opencode_continue_session

{ "session_id": str, # Required: Session ID to continue "message": str, # Optional: Follow-up message "timeout": int # Optional: Timeout in seconds }

Configuration

Environment variables (prefix: OPENCODE_):

| Variable | Default | Description |
| --- | --- | --- |
| OPENCODE_COMMAND | opencode | Path to OpenCode CLI |
| OPENCODE_DEFAULT_MODEL | None | Default model to use |
| OPENCODE_DEFAULT_AGENT | None | Default agent to use |
| OPENCODE_DEFAULT_TIMEOUT | 300 | Default timeout (seconds) |
| OPENCODE_MAX_TIMEOUT | 600 | Maximum timeout (seconds) |
| OPENCODE_SERVER_LOG_LEVEL | INFO | Logging level |
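
These variables map onto the server's settings (settings.py). As a sketch of how such a mapping is commonly expressed with pydantic-settings (the actual settings.py may be structured differently, and the field names here are assumptions):

```python
from typing import Optional

from pydantic_settings import BaseSettings, SettingsConfigDict

class OpenCodeSettings(BaseSettings):
    """Reads OPENCODE_* environment variables; defaults match the table above."""

    model_config = SettingsConfigDict(env_prefix="OPENCODE_")

    command: str = "opencode"            # OPENCODE_COMMAND
    default_model: Optional[str] = None  # OPENCODE_DEFAULT_MODEL
    default_agent: Optional[str] = None  # OPENCODE_DEFAULT_AGENT
    default_timeout: int = 300           # OPENCODE_DEFAULT_TIMEOUT
    max_timeout: int = 600               # OPENCODE_MAX_TIMEOUT
    server_log_level: str = "INFO"       # OPENCODE_SERVER_LOG_LEVEL
```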

Architecture

src/services/fast_mcp/opencode_server/
├── __init__.py
├── __main__.py            # Entry point
├── server.py              # MCP server & tool definitions
├── opencode_executor.py   # CLI execution wrapper
├── models.py              # Pydantic models
├── settings.py            # Configuration
└── handlers/
    ├── __init__.py
    ├── execution.py       # Run/continue operations
    ├── session.py         # Session management
    └── discovery.py       # Model/status discovery
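
opencode_executor.py is where the CLI is actually invoked. A minimal sketch of the non-blocking execution with timeout handling mentioned under Features, using asyncio subprocesses (the function signature and hardcoded binary name are illustrative, not the module's real interface):

```python
import asyncio

async def run_cli(args: list[str], timeout: int = 300) -> tuple[int, str, str]:
    """Run the OpenCode CLI without blocking the event loop; kill it on timeout."""
    proc = await asyncio.create_subprocess_exec(
        "opencode", *args,  # presumably resolved from OPENCODE_COMMAND in practice
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    try:
        stdout, stderr = await asyncio.wait_for(proc.communicate(), timeout=timeout)
    except asyncio.TimeoutError:
        proc.kill()
        await proc.wait()
        raise
    return proc.returncode, stdout.decode(), stderr.decode()
```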

Development

Running Tests

pytest tests/ -v

Code Formatting

black src/
ruff check src/

Roadmap

Planned features for v2.0:

  • opencode_import_session - Import sessions from JSON/URL

  • opencode_list_sessions - List all sessions with filtering

  • opencode_get_stats - Usage statistics and cost tracking

  • opencode_list_agents - List available agents

  • opencode_github_run - GitHub Actions integration (async)

  • opencode_pr_checkout - PR workflow support

Contributing

Contributions are welcome! Please read the contributing guidelines before submitting PRs.

License

MIT License - see LICENSE for details.

Acknowledgments
