smithery.yaml (1.88 kB)
# Use container runtime with custom Dockerfile
runtime: "container"

# Build configuration
build:
  dockerfile: "Dockerfile.smithery"
  dockerBuildPath: "."

# Start command
startCommand:
  type: "http"
  port: 3000

# Server metadata
name: "egw-research-server"
description: "MCP server providing fast access to local EGW research database"
version: "1.0.0"
author: "surgbc"
license: "MIT"

# Capabilities
capabilities:
  tools:
    - name: "search_local"
      description: "Search the local EGW writings database"
      inputSchema:
        type: "object"
        properties:
          query:
            type: "string"
            description: "Search query"
          limit:
            type: "number"
            description: "Maximum results"
            default: 20
        required: ["query"]
    - name: "get_local_book"
      description: "Get information about a specific book"
      inputSchema:
        type: "object"
        properties:
          bookId:
            type: "number"
            description: "Book ID"
        required: ["bookId"]
    - name: "get_local_content"
      description: "Get content from a specific book"
      inputSchema:
        type: "object"
        properties:
          bookId:
            type: "number"
            description: "Book ID"
          limit:
            type: "number"
            default: 50
          offset:
            type: "number"
            default: 0
        required: ["bookId"]
    - name: "list_local_books"
      description: "List all available books"
      inputSchema:
        type: "object"
        properties:
          language:
            type: "string"
            default: "en"
          limit:
            type: "number"
            default: 50
    - name: "get_database_stats"
      description: "Get database statistics"
      inputSchema:
        type: "object"
        properties: {}
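Each tool above describes its arguments with a JSON Schema (`type`, `properties`, `required`). As a minimal sketch of how a client might check arguments against the `search_local` schema before calling the tool, here is a plain-Python validator (the `validate` helper and `TYPE_MAP` are illustrative, not part of the server or the Smithery spec; a real client would typically use a full JSON Schema library):

```python
# Illustrative validator for the search_local tool's inputSchema.
# The schema dict below mirrors the YAML definition above.

SEARCH_LOCAL_SCHEMA = {
    "type": "object",
    "properties": {
        "query": {"type": "string", "description": "Search query"},
        "limit": {"type": "number", "description": "Maximum results", "default": 20},
    },
    "required": ["query"],
}

# Map JSON Schema primitive type names to Python types.
TYPE_MAP = {"string": str, "number": (int, float), "object": dict}

def validate(args: dict, schema: dict) -> list:
    """Return a list of validation errors; an empty list means args are valid."""
    errors = []
    # Every required property must be present.
    for key in schema.get("required", []):
        if key not in args:
            errors.append("missing required property: " + key)
    # Every supplied property must be declared and have the declared type.
    for key, value in args.items():
        prop = schema.get("properties", {}).get(key)
        if prop is None:
            errors.append("unknown property: " + key)
        elif not isinstance(value, TYPE_MAP[prop["type"]]):
            errors.append(key + ": expected " + prop["type"])
    return errors

if __name__ == "__main__":
    print(validate({"query": "sanctuary", "limit": 20}, SEARCH_LOCAL_SCHEMA))  # []
    print(validate({"limit": 5}, SEARCH_LOCAL_SCHEMA))  # ['missing required property: query']
```

This only covers the flat object schemas used by this server; it does not handle nested objects, arrays, or `default` injection.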

MCP directory API

We provide all the information about MCP servers via our MCP API. For example, to fetch this server's entry:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/pythondev-pro/egw_writings_mcp_server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.