Integrations
- Supports configuration through `.env` files for credential management, enabling secure storage of Graphistry authentication details.
- Provides containerized deployment of the server with Docker, allowing isolated execution with proper credential configuration (see the sketch below).
- Hosts the repository for the MCP server at bmorphism/graphistry-mcp, allowing users to clone and install the server from GitHub.
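A containerized launch might look roughly like this; the image tag is arbitrary and the presence of a Dockerfile at the repository root is an assumption, so treat it as a sketch rather than the project's documented Docker workflow:

```bash
# Build an image from the cloned repository (assumes a Dockerfile at the repo root)
docker build -t graphistry-mcp .

# Run the server in an isolated container, passing Graphistry credentials from a local .env file
docker run --rm --env-file .env graphistry-mcp
```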
Graphistry MCP Integration
GPU-accelerated graph visualization and analytics for Large Language Models using Graphistry and MCP.
Overview
This project integrates Graphistry's powerful GPU-accelerated graph visualization platform with the Model Context Protocol (MCP), enabling advanced graph analytics capabilities for AI assistants and LLMs. It allows LLMs to visualize and analyze complex network data through a standardized, LLM-friendly interface.
Key features:
- GPU-accelerated graph visualization via Graphistry
- Advanced pattern discovery and relationship analysis
- Network analytics (community detection, centrality, path finding, anomaly detection)
- Support for various data formats (Pandas, NetworkX, edge lists)
- LLM-friendly API: a single `graph_data` dict for graph tools
🚨 Important: Graphistry Registration Required
This MCP server requires a free Graphistry account to use visualization features.
- Sign up for a free account at hub.graphistry.com
- Set your credentials as environment variables or in a `.env` file before starting the server (a sketch follows below). See `.env.example` for a template.
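A minimal `.env` sketch; the variable names are the ones referenced in the Caveats section below, and the values are placeholders:

```bash
# Graphistry Hub credentials (free account from hub.graphistry.com)
GRAPHISTRY_USERNAME=your-username
GRAPHISTRY_PASSWORD=your-password
```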
MCP Configuration (.mcp.json)
To use this project with Cursor or other MCP-compatible tools, you need a `.mcp.json` file in your project root. A template is provided as `.mcp.json.example`.
Setup: edit `.mcp.json` to:
- Set the correct paths for your environment (e.g., project root, Python executable, server script)
- Set your Graphistry credentials (or use environment variables/.env)
- Choose between HTTP and stdio modes:
  - `graphistry-http`: connects via HTTP (set the `url` to match your server's port)
  - `graphistry`: connects via stdio (set the `command`, `args`, and `env` as needed)
Note:
- `.mcp.json.example` contains both HTTP and stdio configurations (sketched below); enable or disable each one as needed by setting the `disabled` field.
- See `.env.example` for environment variable setup.
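A hedged sketch of what such a configuration might look like. The `mcpServers` wrapper key follows the common Cursor/MCP convention, and every path, port, and script name below is a placeholder rather than the template's actual values:

```json
{
  "mcpServers": {
    "graphistry-http": {
      "url": "http://localhost:8080",
      "disabled": false
    },
    "graphistry": {
      "command": "/path/to/graphistry-mcp/.venv/bin/python",
      "args": ["/path/to/graphistry-mcp/<server-script>.py"],
      "env": {
        "GRAPHISTRY_USERNAME": "your-username",
        "GRAPHISTRY_PASSWORD": "your-password"
      },
      "disabled": true
    }
  }
}
```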
Installation
Recommended Installation (Python venv + pip)
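The exact commands depend on how the package is laid out, but a typical venv + pip flow for this repository would look roughly like this (the editable-install step is an assumption):

```bash
# Clone the repository
git clone https://github.com/bmorphism/graphistry-mcp.git
cd graphistry-mcp

# Create and activate an isolated virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install the server and its dependencies into the venv
pip install -e .
```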
Or use the setup script:
Usage
Starting the Server
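The recommended way to launch is the start script described under Security & Credential Handling below, since it sources `.env` before starting the server:

```bash
# Launch the MCP server with credentials loaded from .env
./start-graphistry-mcp.sh
```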
Security & Credential Handling
- The server loads credentials from environment variables or `.env` using python-dotenv, so you can safely use a `.env` file for local development.
- The `start-graphistry-mcp.sh` script sources `.env` and is the most robust and secure way to launch the server.
Adding to Cursor (or other LLM tools)
- Add the MCP server to your `.cursor/mcp.json` or equivalent config (see the sketch after this list).
- Make sure the virtual environment is used (either by using the full path to the venv's python, or by activating it before launching).
- If you see errors about API version or missing credentials, double-check your environment variables and registration.
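A minimal stdio entry for `.cursor/mcp.json`, mirroring the stdio configuration sketched earlier; paths and the server script name are placeholders, and the important details are the full path to the venv's Python plus the credential environment variables:

```json
{
  "mcpServers": {
    "graphistry": {
      "command": "/path/to/graphistry-mcp/.venv/bin/python",
      "args": ["/path/to/graphistry-mcp/<server-script>.py"],
      "env": {
        "GRAPHISTRY_USERNAME": "your-username",
        "GRAPHISTRY_PASSWORD": "your-password"
      }
    }
  }
}
```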
Example: Visualizing a Graph (LLM-friendly API)
The main tool, `visualize_graph`, now accepts a single `graph_data` dictionary. Example:
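The original example isn't reproduced here; the sketch below shows the general shape of an edge-list payload. The key names (`edges`, `source`, `target`, `title`) are illustrative assumptions rather than the tool's documented schema, so check the server's tool description for the real field names:

```python
# Hypothetical graph_data payload: a small edge list plus a title.
# Key names are illustrative assumptions, not the documented schema.
graph_data = {
    "edges": [
        {"source": "alice", "target": "bob"},
        {"source": "bob", "target": "carol"},
        {"source": "carol", "target": "alice"},
    ],
    "title": "Minimal triangle network",
}

# An MCP client would then invoke the tool with this single dict, e.g.:
#   visualize_graph(graph_data=graph_data)
```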
What We Learned / Caveats & Gotchas
- Environment Variables & .env: Always ensure `GRAPHISTRY_USERNAME` and `GRAPHISTRY_PASSWORD` are set in the environment or `.env` file. The server loads `.env` automatically using `python-dotenv`.
- Virtual Environment: Use the venv's Python directly or activate the environment before running the server. This avoids dependency and path issues.
- Cursor Integration: When adding to Cursor, use the full path to the venv's Python and ensure all environment variables are set in the config.
Contributing
PRs and issues welcome! This project is evolving rapidly as we learn more about LLM-driven graph analytics and tool integration.
License
MIT