Glama

Superglue MCP

Official
by superglue-ai
turbo.json (1.98 kB)
{
  "$schema": "https://turbo.build/schema.json",
  "tasks": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["dist/**", ".next/**"],
      "inputs": ["src/**/*.tsx", "src/**/*.ts", "*.ts", "*.tsx", "package.json"]
    },
    "dev": {
      "dependsOn": ["build"],
      "cache": false,
      "persistent": true
    },
    "start": {
      "cache": false,
      "persistent": true
    },
    "test": {
      "dependsOn": ["@superglue/core#build"],
      "outputs": []
    },
    "test:coverage": {
      "dependsOn": ["@superglue/core#build"],
      "outputs": ["coverage/**"]
    }
  },
  "globalEnv": [
    "GRAPHQL_ENDPOINT", "GRAPHQL_PORT", "WEB_PORT", "DATASTORE_TYPE",
    "REDIS_HOST", "REDIS_PORT", "REDIS_USERNAME", "REDIS_PASSWORD",
    "NODE_ENV", "AUTH_TOKEN",
    "OPENAI_API_KEY", "OPENAI_MODEL", "OPENAI_BASE_URL", "OPENAI_API_VERSION",
    "NEXT_PUBLIC_SUPERGLUE_ENDPOINT", "NEXT_PUBLIC_SUPERGLUE_API_KEY",
    "POSTGRES_HOST", "POSTGRES_PORT", "POSTGRES_USERNAME", "POSTGRES_PASSWORD",
    "POSTGRES_DB", "POSTGRES_SSL",
    "PRIV_SUPABASE_SERVICE_ROLE_KEY", "NEXT_PUBLIC_SUPABASE_URL", "NEXT_PUBLIC_SUPABASE_ANON_KEY",
    "STORAGE_DIR", "LLM_PROVIDER", "AI_GATEWAY_MODEL", "AI_GATEWAY_API_KEY",
    "FRONTEND_LLM_PROVIDER", "FRONTEND_LLM_MODEL",
    "GEMINI_API_KEY", "GEMINI_MODEL", "OPENAI_API_BASE_URL",
    "NEXT_PUBLIC_DISABLE_WELCOME_SCREEN", "API_PORT", "API_ENDPOINT", "START_SCHEDULER_SERVER",
    "ANTHROPIC_API_KEY", "ANTHROPIC_MODEL",
    "AZURE_OPENAI_ENDPOINT", "MASTER_ENCRYPTION_KEY", "DISABLE_TELEMETRY",
    "AZURE_API_KEY", "AZURE_MODEL", "AZURE_RESOURCE_NAME", "AZURE_BASE_URL",
    "AZURE_API_VERSION", "AZURE_USE_DEPLOYMENT_BASED_URLS"
  ]
}
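The "tasks" section above defines a dependency graph for Turborepo: for example, "dev" depends on "build", and "test" depends on the built "@superglue/core" package. As a minimal sketch of inspecting that graph programmatically, the snippet below parses an abridged, embedded copy of the config (in a real repo you would read the turbo.json file from disk instead):

```python
import json

# Abridged copy of the "tasks" section from the turbo.json shown above.
TURBO_JSON = """
{
  "$schema": "https://turbo.build/schema.json",
  "tasks": {
    "build": {"dependsOn": ["^build"], "outputs": ["dist/**", ".next/**"]},
    "dev": {"dependsOn": ["build"], "cache": false, "persistent": true},
    "start": {"cache": false, "persistent": true},
    "test": {"dependsOn": ["@superglue/core#build"], "outputs": []}
  }
}
"""

config = json.loads(TURBO_JSON)

# Print each task and what it depends on ("^build" means the build
# task of upstream workspace dependencies).
for name, task in config["tasks"].items():
    deps = ", ".join(task.get("dependsOn", [])) or "(none)"
    print(f"{name}: depends on {deps}")
```

This only reads the config; actually executing a task is done through the Turborepo CLI (e.g. `turbo run build`).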

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/superglue-ai/superglue'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.