
Parquet MCP Server

by DeepSpringAI
smithery.yaml (1.78 kB)
# Smithery configuration file: https://smithery.ai/docs/config#smitheryyaml
build:
  dockerfile: Dockerfile
  dockerBuildPath: .
startCommand:
  type: stdio
  configSchema:
    # JSON Schema defining the configuration options for the MCP.
    type: object
    required:
      - ollama_url
      - embedding_url
      - embedding_model
    properties:
      ollama_url:
        type: string
        description: URL for the Ollama server
      postgres_db:
        type: string
        description: PostgreSQL database name
      embedding_url:
        type: string
        description: URL for the embedding service
      postgres_host:
        type: string
        description: PostgreSQL host
      postgres_port:
        type: string
        description: PostgreSQL port
      postgres_user:
        type: string
        description: PostgreSQL username
      embedding_model:
        type: string
        description: Model to use for generating embeddings
      postgres_password:
        type: string
        description: PostgreSQL password
  commandFunction:
    # A JS function that produces the CLI command based on the given config to start the MCP on stdio.
    |-
    (config) => ({
      "command": "uv",
      "args": [
        "--directory",
        "./src/parquet_mcp_server",
        "run",
        "main.py"
      ],
      "env": {
        "OLLAMA_URL": config.ollama_url,
        "EMBEDDING_URL": config.embedding_url,
        "EMBEDDING_MODEL": config.embedding_model,
        "POSTGRES_DB": config.postgres_db || "",
        "POSTGRES_USER": config.postgres_user || "",
        "POSTGRES_PASSWORD": config.postgres_password || "",
        "POSTGRES_HOST": config.postgres_host || "",
        "POSTGRES_PORT": config.postgres_port || "5432"
      }
    })
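
To illustrate how the commandFunction above behaves, the following sketch replicates it in TypeScript and prints the command, arguments, and environment it would produce for a hypothetical configuration. The URLs and model name in the sample config are placeholders, not values taken from this repository.

// Minimal sketch: replicate the commandFunction from smithery.yaml and print
// the result for a hypothetical config. Sample values are placeholders.
type Config = {
  ollama_url: string;
  embedding_url: string;
  embedding_model: string;
  postgres_db?: string;
  postgres_user?: string;
  postgres_password?: string;
  postgres_host?: string;
  postgres_port?: string;
};

const commandFunction = (config: Config) => ({
  command: "uv",
  args: ["--directory", "./src/parquet_mcp_server", "run", "main.py"],
  env: {
    OLLAMA_URL: config.ollama_url,
    EMBEDDING_URL: config.embedding_url,
    EMBEDDING_MODEL: config.embedding_model,
    POSTGRES_DB: config.postgres_db || "",
    POSTGRES_USER: config.postgres_user || "",
    POSTGRES_PASSWORD: config.postgres_password || "",
    POSTGRES_HOST: config.postgres_host || "",
    POSTGRES_PORT: config.postgres_port || "5432",
  },
});

// Example invocation with placeholder values; the optional PostgreSQL fields
// fall back to "" and the port defaults to "5432", as in the YAML above.
console.log(
  commandFunction({
    ollama_url: "http://localhost:11434",
    embedding_url: "http://localhost:11434/api/embed",
    embedding_model: "nomic-embed-text",
  })
);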

MCP directory API

We provide all of the information about MCP servers via our MCP directory API. For example:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/DeepSpringAI/search_mcp_server'
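
To call the same endpoint programmatically, the sketch below is a minimal TypeScript example, assuming a Node 18+ runtime with the built-in fetch API; since the response schema is not documented here, the body is printed as raw JSON rather than destructured.

// Minimal sketch: query the MCP directory API for this server and print the
// raw JSON response. Assumes Node 18+ (global fetch); response schema is not
// documented here, so no fields are assumed.
const url =
  "https://glama.ai/api/mcp/v1/servers/DeepSpringAI/search_mcp_server";

async function main(): Promise<void> {
  const res = await fetch(url);
  if (!res.ok) {
    throw new Error(`Request failed: ${res.status} ${res.statusText}`);
  }
  console.log(JSON.stringify(await res.json(), null, 2));
}

main().catch(console.error);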

If you have feedback or need assistance with the MCP directory API, please join our Discord server.