
Serena

by oraios
compose.yaml (1.21 kB)
services:
  serena:
    image: serena:latest
    # To work with projects, you must mount them as volumes:
    # volumes:
    #   - ./my-project:/workspace/my-project
    #   - /path/to/another/project:/workspace/another-project
    build:
      context: ./
      dockerfile: Dockerfile
      target: production
    ports:
      - "${SERENA_PORT:-9121}:9121"  # MCP server port
      - "${SERENA_DASHBOARD_PORT:-24282}:24282"  # Dashboard port (default 0x5EDA = 24282)
    environment:
      - SERENA_DOCKER=1
    command:
      - "uv run --directory . serena-mcp-server --transport sse --port 9121 --host 0.0.0.0"
      # Add the context for the IDE assistant
      # - "uv run --directory . serena-mcp-server --transport sse --port 9121 --host 0.0.0.0 --context ide-assistant"

  serena-dev:
    image: serena:dev
    build:
      context: ./
      dockerfile: Dockerfile
      target: development
    tty: true
    stdin_open: true
    environment:
      - SERENA_DOCKER=1
    volumes:
      - .:/workspaces/serena
    ports:
      - "${SERENA_PORT:-9121}:9121"  # MCP server port
      - "${SERENA_DASHBOARD_PORT:-24282}:24282"  # Dashboard port
    command:
      - "uv run --directory . serena-mcp-server"
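A minimal sketch of how this file might be used, assuming it is saved as compose.yaml in the repository root and the default ports above are kept. The /sse endpoint path is an assumption based on common SSE-transport MCP setups, not something stated in this file.

# Build and start the production MCP server in the background
docker compose up --build -d serena

# Publish different host ports by overriding the variables the file reads
SERENA_PORT=9200 SERENA_DASHBOARD_PORT=24300 docker compose up -d serena

# With SSE transport, clients typically connect to an /sse endpoint on the
# published MCP port (endpoint path assumed, not confirmed by this file)
curl -N http://localhost:9121/sse

# Follow logs, then tear everything down when finished
docker compose logs -f serena
docker compose down

If the dashboard is enabled, it would be reachable on the published dashboard port (24282 by default).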

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/oraios/serena'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.