## Server Configuration

The following environment variables configure the server.
| Name | Required | Description | Default |
|---|---|---|---|
| NEO4J_URI | No | URI for the Neo4j database | bolt://localhost:7687 |
| MODEL_NAME | No | OpenAI model name to use for LLM operations | |
| NEO4J_USER | No | Neo4j username | neo4j |
| NEO4J_PASSWORD | No | Neo4j password | demodemo |
| OPENAI_API_KEY | Yes | OpenAI API key (required for LLM operations) | |
| LLM_TEMPERATURE | No | Temperature for LLM responses (0.0-2.0) | |
| OPENAI_BASE_URL | No | Base URL for the OpenAI API | |
| SEMAPHORE_LIMIT | No | Episode processing concurrency. See Concurrency and LLM Provider 429 Rate Limit Errors | |
| SMALL_MODEL_NAME | No | OpenAI model name to use for smaller LLM operations | |
| AZURE_OPENAI_ENDPOINT | No | Azure OpenAI LLM endpoint URL | |
| AZURE_OPENAI_API_VERSION | No | Azure OpenAI LLM API version | |
| GRAPHITI_TELEMETRY_ENABLED | No | Set to false to disable telemetry in the MCP server | false |
| AZURE_OPENAI_DEPLOYMENT_NAME | No | Azure OpenAI LLM deployment name | |
| AZURE_OPENAI_EMBEDDING_API_KEY | No | Azure OpenAI embedding API key (if different from OPENAI_API_KEY) | |
| AZURE_OPENAI_EMBEDDING_ENDPOINT | No | Azure OpenAI embedding endpoint URL | |
| AZURE_OPENAI_USE_MANAGED_IDENTITY | No | Use Azure managed identities for authentication | |
| AZURE_OPENAI_EMBEDDING_API_VERSION | No | Azure OpenAI embedding API version | |
| AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME | No | Azure OpenAI embedding deployment name | |
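As a rough sketch, the variables and defaults in the table above could be read at startup like this (the `load_config` helper and its return shape are illustrative, not the server's actual code):

```python
import os


def load_config() -> dict:
    """Collect server settings from the environment, applying the
    documented defaults where a variable is unset."""
    config = {
        # Neo4j connection (defaults from the table above)
        "neo4j_uri": os.environ.get("NEO4J_URI", "bolt://localhost:7687"),
        "neo4j_user": os.environ.get("NEO4J_USER", "neo4j"),
        "neo4j_password": os.environ.get("NEO4J_PASSWORD", "demodemo"),
        # OpenAI settings; OPENAI_API_KEY is the only required variable
        "openai_api_key": os.environ.get("OPENAI_API_KEY"),
        "model_name": os.environ.get("MODEL_NAME"),
        "openai_base_url": os.environ.get("OPENAI_BASE_URL"),
    }
    if not config["openai_api_key"]:
        raise RuntimeError("OPENAI_API_KEY is required for LLM operations")
    return config
```

Unset optional variables simply fall back to their documented defaults (or `None` where the table leaves the default blank), while a missing `OPENAI_API_KEY` fails fast rather than surfacing later as an authentication error.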
## Schema
### Prompts

Interactive templates invoked by user choice.

| Name | Description |
|---|---|
| No prompts | |
### Resources

Contextual data attached and managed by the client.

| Name | Description |
|---|---|
| No resources | |
### Tools

Functions exposed to the LLM to take actions.

| Name | Description |
|---|---|
| No tools | |