# @llamaindex/mcp-server-llamacloud
## 0.1.3
### Patch Changes
- 6ba4c5c: Update packages and set the "Default" project as the default
- 1f8fb84: Add a topK parameter per index to define how many results to use
## 0.1.2
### Patch Changes
- 99f09b7: fix: send startup logs to stderr to keep stdout JSON-only
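  The reason for this change: over the MCP stdio transport, the client reads stdout as a stream of JSON-RPC messages, so any human-readable logging has to go to stderr. A minimal sketch (not the package's actual code) of the pattern:

  ```ts
  // Minimal sketch: on an MCP stdio transport, stdout is reserved for
  // JSON-RPC messages, so diagnostics are written to stderr instead.
  function logStartup(message: string): void {
    // console.error writes to stderr, leaving stdout untouched for protocol traffic.
    console.error(`[mcp-server-llamacloud] ${message}`);
  }

  // Using console.log here would interleave plain text with the JSON-RPC
  // stream on stdout and could break the client's message parser.
  logStartup("server started");
  ```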
## 0.1.1
### Patch Changes
- 7f1b5d0: Support multiple indexes