Glama

Graphiti Knowledge Graph MCP Server

by michabbb

Server Configuration

Describes the environment variables used to configure the server; only OPENAI_API_KEY is required.

Name | Required | Description | Default
NEO4J_URI | No | URI for the Neo4j database | bolt://localhost:7687
MODEL_NAME | No | OpenAI model name to use for LLM operations |
NEO4J_USER | No | Neo4j username | neo4j
NEO4J_PASSWORD | No | Neo4j password | demodemo
OPENAI_API_KEY | Yes | OpenAI API key (required for LLM operations) |
LLM_TEMPERATURE | No | Temperature for LLM responses (0.0-2.0) |
OPENAI_BASE_URL | No | Optional base URL for OpenAI API |
SEMAPHORE_LIMIT | No | Episode processing concurrency. See "Concurrency and LLM Provider 429 Rate Limit Errors" |
SMALL_MODEL_NAME | No | OpenAI model name to use for smaller LLM operations |
AZURE_OPENAI_ENDPOINT | No | Optional Azure OpenAI LLM endpoint URL |
AZURE_OPENAI_API_VERSION | No | Optional Azure OpenAI LLM API version |
GRAPHITI_TELEMETRY_ENABLED | No | Set to false to disable telemetry in the MCP server | false
AZURE_OPENAI_DEPLOYMENT_NAME | No | Optional Azure OpenAI LLM deployment name |
AZURE_OPENAI_EMBEDDING_API_KEY | No | Optional Azure OpenAI embedding deployment key (if different from OPENAI_API_KEY) |
AZURE_OPENAI_EMBEDDING_ENDPOINT | No | Optional Azure OpenAI embedding endpoint URL |
AZURE_OPENAI_USE_MANAGED_IDENTITY | No | Optionally use Azure Managed Identities for authentication |
AZURE_OPENAI_EMBEDDING_API_VERSION | No | Optional Azure OpenAI embedding API version |
AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME | No | Optional Azure OpenAI embedding deployment name |
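The variables above can be supplied through the shell environment before launching the server. A minimal sketch using the defaults documented in the table; the OPENAI_API_KEY value is a placeholder, and the exact launch command for the server is not shown here:

```shell
# Required: OpenAI API key for LLM operations (placeholder value, replace with your own)
export OPENAI_API_KEY="sk-..."

# Optional: Neo4j connection settings (these are the documented defaults)
export NEO4J_URI="bolt://localhost:7687"
export NEO4J_USER="neo4j"
export NEO4J_PASSWORD="demodemo"

# Optional: telemetry toggle
export GRAPHITI_TELEMETRY_ENABLED=false
```

Any variable left unset falls back to the default listed in the table, where one exists.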

Schema

Prompts

Interactive templates invoked by user choice

No prompts.

Resources

Contextual data attached and managed by the client

No resources.

Tools

Functions exposed to the LLM to take actions

No tools.

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/michabbb/graphiti-mcp-but-working'
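The endpoint follows an owner/repo pattern, so the URL can be built from a server slug. A small sketch; the owner and repo values are taken from the curl example above, and the path scheme is assumed to generalize to other servers in the directory:

```shell
# Construct the MCP directory API URL for a given server slug
owner="michabbb"
repo="graphiti-mcp-but-working"
url="https://glama.ai/api/mcp/v1/servers/${owner}/${repo}"
echo "$url"
```

The resulting URL can then be fetched with curl exactly as in the example above.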

If you have feedback or need assistance with the MCP directory API, please join our Discord server.