
MCP Server for Apache OpenDAL™

by Xuanwo
# LlamaIndex Agent Example with OpenDAL MCP

## Start the MCP Server

To run this example, you need an MCP server running. Make sure you are in the root directory of the project, and set the environment variables first:

- `OPENDAL_FS_TYPE=fs`
- `OPENDAL_FS_ROOT=./examples/`

Then, run the following commands:

```bash
uv sync  # Install the project; this only needs to be done once
uv run mcp-server-opendal --transport sse
```

## Run the Example

Set the environment variables below:

- `MCP_HOST`: the host of the MCP server
- `MCP_PORT`: the port of the MCP server
- `OPENAI_API_KEY`: the API key for the OpenAI API
- `OPENAI_MODEL`: the model name for the OpenAI API
- `OPENAI_ENDPOINT`: the endpoint of the OpenAI API

Then, run the example with the following command:

```bash
uv run examples/llamaindex-with-opendal-mcp.py
```
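As a minimal sketch of how the example script might locate the server, the SSE endpoint URL can be built from `MCP_HOST` and `MCP_PORT`. The helper name `mcp_sse_url`, the defaults, and the `/sse` path are assumptions here (the `/sse` path is the common default for SSE-transport MCP servers), not taken from the actual script:

```python
import os

def mcp_sse_url() -> str:
    """Build the SSE endpoint URL for the MCP server from environment
    variables. Defaults and the "/sse" path are assumptions based on
    common MCP SSE-transport conventions."""
    host = os.environ.get("MCP_HOST", "localhost")
    port = os.environ.get("MCP_PORT", "8000")
    return f"http://{host}:{port}/sse"

# Example: with MCP_HOST=127.0.0.1 and MCP_PORT=8001,
# this yields "http://127.0.0.1:8001/sse".
```

The example script would then pass this URL to its MCP client so the LlamaIndex agent can discover and call the OpenDAL tools exposed over SSE.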
