LlamaCloud MCP Server

by run-llama

An MCP server that connects to multiple managed indexes on LlamaCloud.

This is a TypeScript-based MCP server that creates multiple tools, each connected to a specific managed index on LlamaCloud. Each tool is defined via command-line arguments.

Features

Tools

  • Creates a separate tool for each index you define
  • Each tool provides a query parameter to search its specific index
  • Tool names are auto-generated from the index name, e.g. get_information_index_name

Installation

To use this server with your MCP client (e.g. Claude Desktop, Windsurf, or Cursor), add the following to your MCP client configuration:

{ "mcpServers": { "llamacloud": { "command": "npx", "args": [ "-y", "@llamaindex/mcp-server-llamacloud", "--index", "10k-SEC-Tesla", "--description", "10k SEC documents from 2023 for Tesla", "--index", "10k-SEC-Apple", "--description", "10k SEC documents from 2023 for Apple" ], "env": { "LLAMA_CLOUD_PROJECT_NAME": "<YOUR_PROJECT_NAME>", "LLAMA_CLOUD_API_KEY": "<YOUR_API_KEY>" } } } }

For Claude, the MCP config file can be found at:

  • On MacOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • On Windows: %APPDATA%/Claude/claude_desktop_config.json

Tool Definition Format

In the args array of the MCP config, you can define multiple tools by providing pairs of --index and --description arguments. Each pair defines a new tool.

For example:

--index "10k-SEC-Tesla" --description "10k SEC documents from 2023 for Tesla"

adds a tool for the 10k-SEC-Tesla LlamaCloud index to the MCP server.
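
Following the naming pattern described under Tools, this index would be exposed as a tool named something like get_information_10k_sec_tesla (the exact transformation of the index name into the tool name is an assumption here, not confirmed by this README). A client would then invoke it with a query argument; a minimal sketch of such an MCP tools/call request:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_information_10k_sec_tesla",
    "arguments": {
      "query": "What was Tesla's total revenue in 2023?"
    }
  }
}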

Development

Install dependencies:

npm install

Build the server:

npm run build

For development with auto-rebuild:

npm run watch

To use the development build, replace npx @llamaindex/mcp-server-llamacloud with node ./build/index.js in your MCP config.
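
For example, a local development configuration might look like the sketch below. The path /path/to/mcp-server-llamacloud is a placeholder for wherever you cloned the repository; pointing at build/index.js with an absolute path avoids depending on whatever working directory the MCP client launches the server from.

{
  "mcpServers": {
    "llamacloud": {
      "command": "node",
      "args": [
        "/path/to/mcp-server-llamacloud/build/index.js",
        "--index",
        "10k-SEC-Tesla",
        "--description",
        "10k SEC documents from 2023 for Tesla"
      ],
      "env": {
        "LLAMA_CLOUD_PROJECT_NAME": "<YOUR_PROJECT_NAME>",
        "LLAMA_CLOUD_API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}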

Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:

npm run inspector

The Inspector will provide a URL to access the debugging tools in your browser.
