MCP Goose Subagents Server

by lordstyled55
opencode-config-full.json (847 B)
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "goose-subagents": {
      "type": "local",
      "command": ["node", "C:\\Users\\adam0\\projects\\New folder\\src\\index.js"],
      "enabled": true,
      "environment": { "ALPHA_FEATURES": "true" }
    },
    "memory": {
      "type": "local",
      "command": ["npx", "-y", "@modelcontextprotocol/server-memory"],
      "enabled": true,
      "environment": { "MEMORY_FILE_PATH": "memory.json" }
    },
    "sequential-thinking": {
      "type": "local",
      "command": ["npx", "-y", "@modelcontextprotocol/server-sequential-thinking"],
      "enabled": true
    },
    "filesystem": {
      "type": "local",
      "command": ["npx", "-y", "@modelcontextprotocol/server-filesystem", "C:\\Users\\adam0\\projects"],
      "enabled": true
    }
  }
}
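Each entry under `mcp` in the config above follows the same shape: a `type` (here always `local`), a `command` array giving the executable and its arguments, an `enabled` flag, and an optional `environment` map. A minimal sketch of a single entry, with a placeholder server name and command, might look like:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "my-server": {
      "type": "local",
      "command": ["npx", "-y", "some-mcp-server-package"],
      "enabled": true,
      "environment": { "SOME_VAR": "value" }
    }
  }
}
```

The server name (`my-server`), package name, and environment variable here are illustrative placeholders, not values from the original config.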

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/lordstyled55/goose-mcp'
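The curl call above can also be issued programmatically. The sketch below, in Python with only the standard library, builds the same directory API URL from an owner and server slug and fetches the record; the response schema is not documented on this page, so the returned dictionary's keys should be treated as an assumption.

```python
import json
import urllib.request

GLAMA_API_BASE = "https://glama.ai/api/mcp/v1/servers"

def server_url(owner: str, slug: str) -> str:
    # Build the directory API URL, matching the curl example above.
    return f"{GLAMA_API_BASE}/{owner}/{slug}"

def fetch_server(owner: str, slug: str) -> dict:
    # Fetch and decode the server record. The shape of the JSON body
    # is undocumented here, so treat its keys as an assumption.
    with urllib.request.urlopen(server_url(owner, slug)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(server_url("lordstyled55", "goose-mcp"))
```

Calling `fetch_server("lordstyled55", "goose-mcp")` performs the same request as the curl example.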

If you have feedback or need assistance with the MCP directory API, please join our Discord server.