
Model Context Protocol Server for NebulaGraph

A Model Context Protocol (MCP) server implementation that provides access to NebulaGraph.


Features

  • Seamless access to NebulaGraph 3.x.

  • Ready for graph exploration: schema inspection, query execution, and a few shortcut graph algorithms.

  • Follows the Model Context Protocol, ready to integrate with LLM tooling systems.

  • Simple command-line interface with support for configuration via environment variables and .env files.

LlamaIndex with NebulaGraph MCP


Installation

pip install nebulagraph-mcp-server

Usage

nebulagraph-mcp-server loads its configuration from environment variables or a .env file, for example:

NEBULA_VERSION=v3 # only v3 is supported
NEBULA_HOST=<your-nebulagraph-server-host>
NEBULA_PORT=<your-nebulagraph-server-port>
NEBULA_USER=<your-nebulagraph-server-user>
NEBULA_PASSWORD=<your-nebulagraph-server-password>

NEBULA_VERSION must be set to v3 until support for v5 is ready.
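
For programmatic use outside a ready-made MCP client, the following is a minimal sketch that launches the server over stdio and lists its tools. It assumes the official MCP Python SDK (pip install mcp) is installed and that a .env file like the one above is present in the working directory.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# The server reads its NEBULA_* settings from the .env file described above.
server_params = StdioServerParameters(command="nebulagraph-mcp-server")

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())

The printed tool names should correspond to the schema, query, and shortcut-algorithm capabilities listed under Features.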

Development

npx @modelcontextprotocol/inspector \
  uv run nebulagraph-mcp-server

Credits

The layout and workflow of this repo are adapted from mcp-server-opendal.

