Skip to main content
Glama

Katamari MCP Server

by ciphernaut
CONFIGURATION.mdโ€ข12.7 kB
# Katamari MCP Configuration Guide Complete configuration examples for different environments and use cases. ## Table of Contents - [OpenCode (Current Environment)](#opencode-current-environment) - [Claude Desktop](#claude-desktop) - [Web Applications](#web-applications) - [Docker Deployment](#docker-deployment) - [Production/Enterprise](#productionenterprise) - [Environment Variables](#environment-variables) --- ## OpenCode (Current Environment) ### Remote MCP Server Configuration For OpenCode, you configure **remote** MCP servers in your `opencode.jsonc` file. Katamari runs as a web service and OpenCode connects to it via HTTP/WebSocket endpoints. **Important**: Katamari only supports remote connections (stdio transport was removed). You must start the Katamari web service before using it with OpenCode. #### Prerequisites First, start the Katamari MCP server as a web service: ```bash # Start Katamari web service (run once in background) cd /projects/katamari-mcp python -m katamari_mcp.server sse,websocket ``` #### opencode.jsonc ```jsonc { "$schema": "https://opencode.ai/config.json", "mcp": { "katamari-sse": { "type": "remote", "url": "http://localhost:49152/mcp", "enabled": true, "timeout": 10000 }, "katamari-websocket": { "type": "remote", "url": "ws://localhost:49153", "enabled": false, "timeout": 10000 }, "katamari-full": { "type": "remote", "url": "http://localhost:49152/mcp", "enabled": false, "timeout": 15000, "headers": { "X-Transport-Mode": "sse,websocket" } } } } ``` #### Available Endpoints When Katamari server is running, these endpoints are available: - **SSE Transport**: `http://localhost:49152/mcp` - Status: `http://localhost:49152/status` - Stream: `http://localhost:49152/mcp` (GET for SSE, POST for commands) - **WebSocket Transport**: `ws://localhost:49153` - Direct WebSocket connection for real-time communication #### Usage Examples Once configured, Katamari tools are automatically available in OpenCode: ``` Search for information about quantum computing using katamari-sse tools Analyze data in real-time using katamari-websocket tools Research latest AI developments using katamari-full tools ``` #### Agent Configuration Configure Katamari tools per agent for specialized workflows: ```jsonc { "$schema": "https://opencode.ai/config.json", "mcp": { "katamari-sse": { "type": "remote", "url": "http://localhost:49152/mcp", "enabled": true } }, "agent": { "web-researcher": { "description": "Agent for web research and data collection", "tools": { "katamari*": true }, "rules": [ "When searching for web information, use katamari web_search tool", "When scraping websites, use katamari web_scrape tool", "Always cite sources when using web_search results" ] } } } ``` #### Tool Management Enable/disable tools globally or per agent: ```jsonc { "$schema": "https://opencode.ai/config.json", "mcp": { "katamari-sse": { "type": "remote", "url": "http://localhost:49152/mcp", "enabled": true } }, "tools": { "katamari-sse": true, "katamari-websocket": false } } ``` Or use glob patterns: ```jsonc { "tools": { "katamari*": true } } ``` --- ## Claude Desktop ### Configuration File Location - **macOS**: `~/Library/Application Support/Claude/claude_desktop_config.json` - **Windows**: `%APPDATA%\Claude\claude_desktop_config.json` - **Linux**: `~/.config/claude/claude_desktop_config.json` ### Basic Configuration ```json { "mcpServers": { "katamari": { "command": "python", "args": ["-m", "katamari_mcp.server", "stdio"], "cwd": "/path/to/katamari-mcp", "env": { "PYTHONPATH": "/path/to/katamari-mcp", "KATAMARI_LOG_LEVEL": "INFO" } } } } ``` ### Advanced Configuration with Multiple Transports ```json { "mcpServers": { "katamari-sse": { "command": "python", "args": ["-m", "katamari_mcp.server", "sse"], "cwd": "/path/to/katamari-mcp", "env": { "PYTHONPATH": "/path/to/katamari-mcp" } }, "katamari-websocket": { "command": "python", "args": ["-m", "katamari_mcp.server", "websocket"], "cwd": "/path/to/katamari-mcp", "env": { "PYTHONPATH": "/path/to/katamari-mcp" } } } } ``` --- --- ## Web Applications ### Remote Server Configuration For web applications, configure Katamari as a remote MCP server: ```javascript // Client-side configuration const katamariConfig = { name: "katamari-remote", type: "remote", url: "http://localhost:49152/mcp", headers: { "Content-Type": "application/json" }, timeout: 10000 }; // SSE Connection const eventSource = new EventSource('http://localhost:49152/mcp'); // WebSocket Connection const ws = new WebSocket('ws://localhost:49153'); ``` ### React Integration ```jsx import { MCPClient } from '@mcp/client'; function KatamariProvider({ children }) { const katamariClient = new MCPClient({ name: 'katamari', transport: 'sse', url: 'http://localhost:49152/mcp' }); return ( <MCPProvider client={katamariClient}> {children} </MCPProvider> ); } ``` --- ## Docker Deployment ### Dockerfile ```dockerfile FROM python:3.9-slim WORKDIR /app # Install system dependencies RUN apt-get update && apt-get install -y \ git \ curl \ && rm -rf /var/lib/apt/lists/* # Copy project COPY . . # Install Python dependencies RUN pip install --no-cache-dir -e . # Create data directories RUN mkdir -p .katamari/acp/{feedback,metrics,adjustments,performance} # Expose ports EXPOSE 49152 49153 # Health check HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \ CMD curl -f http://localhost:49152/status || exit 1 # Start server CMD ["python", "-m", "katamari_mcp.server", "sse,websocket"] ``` ### Docker Compose ```yaml version: '3.8' services: katamari: build: . ports: - "49152:49152" # SSE - "49153:49153" # WebSocket environment: - PYTHONPATH=/app - KATAMARI_LOG_LEVEL=INFO - KATAMARI_SSE_PORT=49152 - KATAMARI_WS_PORT=49153 volumes: - ./data:/app/.katamari - ./logs:/app/logs restart: unless-stopped healthcheck: test: ["CMD", "curl", "-f", "http://localhost:49152/status"] interval: 30s timeout: 10s retries: 3 start_period: 40s # Optional: Redis for session persistence redis: image: redis:7-alpine ports: - "6379:6379" volumes: - redis_data:/data restart: unless-stopped volumes: redis_data: ``` --- ## Production/Enterprise ### Nginx Reverse Proxy ```nginx upstream katamari_sse { server localhost:49152; } upstream katamari_ws { server localhost:49153; } server { listen 80; server_name katamari.example.com; # SSE endpoints location /mcp { proxy_pass http://katamari_sse; proxy_set_header Host $host; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header X-Forwarded-Proto $scheme; # SSE specific headers proxy_cache off; proxy_buffering off; proxy_set_header Connection ''; proxy_http_version 1.1; chunked_transfer_encoding off; } # WebSocket endpoints location /ws { proxy_pass http://katamari_ws; proxy_http_version 1.1; proxy_set_header Upgrade $http_upgrade; proxy_set_header Connection "upgrade"; proxy_set_header Host $host; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header X-Forwarded-Proto $scheme; } # Health check location /health { proxy_pass http://katamari_sse/status; access_log off; } } ``` ### Systemd Service ```ini [Unit] Description=Katamari MCP Server After=network.target [Service] Type=simple User=katamari Group=katamari WorkingDirectory=/opt/katamari-mcp Environment=PYTHONPATH=/opt/katamari-mcp Environment=KATAMARI_LOG_LEVEL=INFO ExecStart=/opt/katamari-mcp/.venv/bin/python -m katamari_mcp.server sse,websocket Restart=always RestartSec=10 # Security NoNewPrivileges=true PrivateTmp=true ProtectSystem=strict ProtectHome=true ReadWritePaths=/opt/katamari-mcp/.katamari [Install] WantedBy=multi-user.target ``` --- ## Environment Variables ### Core Configuration ```bash # Server Configuration export KATAMARI_LOG_LEVEL=INFO # DEBUG, INFO, WARNING, ERROR export KATAMARI_WORKSPACE_ROOT=/projects/katamari-mcp export PYTHONPATH=/projects/katamari-mcp # Transport Configuration export KATAMARI_TRANSPORTS=sse,websocket export KATAMARI_SSE_PORT=49152 export KATAMARI_WS_PORT=49153 export KATAMARI_SSE_HOST=localhost export KATAMARI_WS_HOST=localhost # ACP Configuration export KATAMARI_ACP_ENABLED=true export KATAMARI_ACP_HEURISTICS_FILE=.katamari/heuristics.json export KATAMARI_ACP_METRICS_RETENTION_DAYS=30 # Performance Configuration export KATAMARI_MAX_CONCURRENT_REQUESTS=100 export KATAMARI_REQUEST_TIMEOUT=30000 export KATAMARI_SESSION_TIMEOUT=3600 # Security Configuration export KATAMARI_CORS_ORIGINS="*" export KATAMARI_RATE_LIMIT_REQUESTS=1000 export KATAMARI_RATE_LIMIT_WINDOW=3600 ``` ### Development Environment ```bash # .env file for development KATAMARI_LOG_LEVEL=DEBUG KATAMARI_TRANSPORTS=sse,websocket KATAMARI_ACP_ENABLED=true PYTHONPATH=./ ``` ### Production Environment ```bash # .env.production file KATAMARI_LOG_LEVEL=INFO KATAMARI_TRANSPORTS=sse,websocket KATAMARI_SSE_HOST=0.0.0.0 KATAMARI_WS_HOST=0.0.0.0 KATAMARI_CORS_ORIGINS="https://app.example.com" KATAMARI_RATE_LIMIT_REQUESTS=100 KATAMARI_ACP_ENABLED=true ``` --- ## Quick Start Commands ### OpenCode (Current Environment) ```bash # Start Katamari web service first python -m katamari_mcp.server sse,websocket & # Test configuration in OpenCode opencode # Then run: "use katamari-sse tool to search for test" # Verify MCP server is working curl http://localhost:49152/status ``` ### Claude Desktop **Note**: Claude Desktop requires stdio transport, which Katamari no longer supports. Use OpenCode for local development or configure Claude Desktop to connect to Katamari's remote endpoints if supported. ### Alternative: Remote Configuration (if supported) ```json { "mcpServers": { "katamari-remote": { "command": "curl", "args": ["-X", "POST", "http://localhost:49152/mcp", "-H", "Content-Type: application/json"], "cwd": "/path/to/katamari-mcp" } } } ``` **Recommendation**: Use OpenCode for the best Katamari MCP experience. ### Docker ```bash # Build and run docker build -t katamari-mcp . docker run -p 49152:49152 -p 49153:49153 katamari-mcp # With Docker Compose docker-compose up -d ``` --- ## Troubleshooting ### Common Issues 1. **Port Already in Use** ```bash # Check what's using the port lsof -i :49152 # Change port in environment variables export KATAMARI_SSE_PORT=49154 ``` 2. **Import Errors** ```bash # Ensure virtual environment is activated source .venv/bin/activate # Install missing dependencies pip install -r requirements.txt ``` 3. **Permission Issues** ```bash # Fix file permissions chmod +x start_server.sh chown -R $USER:$USER .katamari ``` 4. **Transport Connection Issues** ```bash # Test endpoints directly curl http://localhost:49152/status # Check logs tail -f server.log ``` ### Debug Mode ```bash # Enable debug logging export KATAMARI_LOG_LEVEL=DEBUG ./start_server.sh sse,websocket # Monitor with logs ./start_server.sh sse,websocket 2>&1 | tee debug.log ``` --- ## Performance Tuning ### High-Performance Configuration ```json { "mcp": { "katamari": { "type": "remote", "url": "http://localhost:49152/mcp", "enabled": true, "timeout": 15000, "headers": { "X-Performance-Mode": "high" } } } } ``` ### Resource Monitoring ```bash # Monitor resource usage htop -p $(pgrep -f "katamari_mcp.server") # Monitor network connections netstat -an | grep -E ":(49152|49153)" # Monitor logs tail -f .katamari/logs/server.log | grep ERROR ``` This configuration guide covers all major deployment scenarios for Katamari MCP. Choose the configuration that best fits your environment and use case.

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ciphernaut/katamari-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server