We provide all the information about MCP servers via our MCP API. For example:
curl -X GET 'https://glama.ai/api/mcp/v1/servers/lhmpaiPublic/McpLLMServer'
If you have feedback or need assistance with the MCP directory API, please join our Discord server.
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "FastAPI debug (Docker)",
      "type": "debugpy",
      "request": "attach",
      "connect": {
        "host": "localhost",
        "port": 5678
      },
      "pathMappings": [
        {
          "localRoot": "${workspaceFolder}/app",
          "remoteRoot": "/app"
        }
      ]
    }
  ]
}
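For this attach configuration (a VS Code `launch.json`) to work, the container must run the app under debugpy, listening on 0.0.0.0:5678, and publish that port to the host. A minimal sketch, assuming a docker-compose service named `api` and an app entrypoint of `uvicorn main:app` — both hypothetical names not given in the original:

```yaml
# docker-compose.override.yml (sketch; service name and app module are assumptions)
services:
  api:
    # Start debugpy first, then hand off to uvicorn. --wait-for-client pauses
    # startup until the editor attaches on port 5678.
    command: >
      python -m debugpy --listen 0.0.0.0:5678 --wait-for-client
      -m uvicorn main:app --host 0.0.0.0 --port 8000
    ports:
      - "8000:8000"
      - "5678:5678"   # must match "connect.port" in the launch configuration
```

The `pathMappings` entry above then maps `${workspaceFolder}/app` on the host to `/app` inside the container, so breakpoints set locally resolve against the container's file paths.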