We provide all the information about MCP servers via our MCP API.
curl -X GET 'https://glama.ai/api/mcp/v1/servers/willianpinho/large-file-mcp'
If you have feedback or need assistance with the MCP directory API, please join our Discord server.
{
  "mcpServers": {
    "large-file": {
      "command": "npx",
      "args": ["-y", "@willianpinho/large-file-mcp"],
      "env": {
        "CHUNK_SIZE": "500",
        "OVERLAP_LINES": "10",
        "CACHE_SIZE": "104857600",
        "CACHE_TTL": "300000",
        "CACHE_ENABLED": "true"
      }
    }
  }
}
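Note that every value under "env" is a plain string. As a minimal sketch of how a client might consume such a config (the `launch_spec` helper is hypothetical, and the unit readings are assumptions: CHUNK_SIZE as lines per chunk, CACHE_SIZE in bytes, CACHE_TTL in milliseconds, which matches the figures shown), the entry can be parsed into a launch command and environment like so:

```python
import json

# The client config shown above, inlined for illustration.
CONFIG = """
{
  "mcpServers": {
    "large-file": {
      "command": "npx",
      "args": ["-y", "@willianpinho/large-file-mcp"],
      "env": {
        "CHUNK_SIZE": "500",
        "OVERLAP_LINES": "10",
        "CACHE_SIZE": "104857600",
        "CACHE_TTL": "300000",
        "CACHE_ENABLED": "true"
      }
    }
  }
}
"""

def launch_spec(config_text: str, name: str):
    """Return (argv, env) for the named entry under "mcpServers"."""
    entry = json.loads(config_text)["mcpServers"][name]
    argv = [entry["command"], *entry.get("args", [])]
    return argv, entry.get("env", {})

argv, env = launch_spec(CONFIG, "large-file")
print(argv)                            # ['npx', '-y', '@willianpinho/large-file-mcp']
print(int(env["CACHE_SIZE"]) / 2**20)  # cache budget in MiB: 100.0
```

Under this reading, the sample config gives a 100 MiB cache with a 5-minute TTL, chunks of 500 lines, and a 10-line overlap between adjacent chunks.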