
Cursor Conversations MCP Server

.taskmasterconfig

{
  "models": {
    "main": {
      "provider": "anthropic",
      "modelId": "claude-sonnet-4-20250514",
      "maxTokens": 120000,
      "temperature": 0.2
    },
    "research": {
      "provider": "perplexity",
      "modelId": "sonar-pro",
      "maxTokens": 8700,
      "temperature": 0.1
    },
    "fallback": {
      "provider": "anthropic",
      "modelId": "claude-3-5-sonnet-20240620",
      "maxTokens": 8192,
      "temperature": 0.1
    }
  },
  "global": {
    "logLevel": "info",
    "debug": false,
    "defaultSubtasks": 5,
    "defaultPriority": "medium",
    "projectName": "Taskmaster",
    "ollamaBaseUrl": "http://localhost:11434/api",
    "azureOpenaiBaseUrl": "https://your-endpoint.openai.azure.com/"
  }
}
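For illustration, the config above is plain JSON and can be inspected with standard tooling. The sketch below simply parses the file content shown above and reads the per-role model settings; the variable and field accesses follow that file exactly, and nothing about Taskmaster's own loading logic is assumed.

```python
import json

# Content of .taskmasterconfig as shown above (illustrative copy).
CONFIG_TEXT = """
{ "models": { "main": { "provider": "anthropic", "modelId": "claude-sonnet-4-20250514",
"maxTokens": 120000, "temperature": 0.2 }, "research": { "provider": "perplexity",
"modelId": "sonar-pro", "maxTokens": 8700, "temperature": 0.1 }, "fallback": {
"provider": "anthropic", "modelId": "claude-3-5-sonnet-20240620", "maxTokens": 8192,
"temperature": 0.1 } }, "global": { "logLevel": "info", "debug": false,
"defaultSubtasks": 5, "defaultPriority": "medium", "projectName": "Taskmaster",
"ollamaBaseUrl": "http://localhost:11434/api",
"azureOpenaiBaseUrl": "https://your-endpoint.openai.azure.com/" } }
"""

# Parse the config and pull out the model used for each role.
config = json.loads(CONFIG_TEXT)
for role, settings in config["models"].items():
    print(f"{role}: {settings['provider']}/{settings['modelId']}")
```

Each role (`main`, `research`, `fallback`) carries its own provider, model ID, token budget, and temperature, so a different model can back each kind of task.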

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/vltansky/cursor-conversations-mcp'
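The same request can be made from code. In the sketch below, only the endpoint URL comes from the curl command above; the helper function names and the assumption that the response body is JSON are illustrative.

```python
import json
from urllib.request import urlopen

# Base path of the Glama MCP directory API (from the curl example above).
BASE = "https://glama.ai/api/mcp/v1/servers"

def server_info_url(owner: str, repo: str) -> str:
    """Build the directory API URL for a given server slug."""
    return f"{BASE}/{owner}/{repo}"

def fetch_server_info(owner: str, repo: str) -> dict:
    """GET the server entry and decode the JSON body (schema not documented here)."""
    with urlopen(server_info_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Equivalent of the curl command above.
    print(server_info_url("vltansky", "cursor-conversations-mcp"))
```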

If you have feedback or need assistance with the MCP directory API, please join our Discord server.