Server Configuration
The environment variables used to configure the server are listed below.
| Name | Required | Description | Default |
|---|---|---|---|
| NEO4J_URI | Yes | URI for the Neo4j database | bolt://localhost:7687 |
| MODEL_NAME | No | OpenAI model name to use for LLM operations | gpt-4.1-mini |
| NEO4J_USER | Yes | Neo4j username | neo4j |
| NEO4J_PASSWORD | Yes | Neo4j password | demodemo |
| OPENAI_API_KEY | No | OpenAI API key (required for LLM operations) | |
| LLM_TEMPERATURE | No | Temperature for LLM responses (0.0-2.0) | 0.0 |
| MCP_SERVER_HOST | No | Host to bind the server to | 127.0.0.1 |
| MCP_SERVER_PORT | No | Port to bind the server to | 8000 |
| OPENAI_BASE_URL | No | Optional base URL for OpenAI API | |
| SEMAPHORE_LIMIT | No | Maximum number of episodes processed concurrently | 10 |
| SMALL_MODEL_NAME | No | OpenAI model name to use for smaller LLM operations | gpt-4.1-nano |
| AZURE_OPENAI_ENDPOINT | No | Optional Azure OpenAI LLM endpoint URL | |
| AZURE_OPENAI_API_VERSION | No | Optional Azure OpenAI LLM API version | |
| AZURE_OPENAI_DEPLOYMENT_NAME | No | Optional Azure OpenAI LLM deployment name | |
| AZURE_OPENAI_EMBEDDING_API_KEY | No | Optional Azure OpenAI Embedding deployment key | |
| AZURE_OPENAI_EMBEDDING_ENDPOINT | No | Optional Azure OpenAI Embedding endpoint URL | |
| AZURE_OPENAI_USE_MANAGED_IDENTITY | No | Optional flag to use Azure Managed Identity for authentication | |
| AZURE_OPENAI_EMBEDDING_API_VERSION | No | Optional Azure OpenAI Embedding API version | |
| AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME | No | Optional Azure OpenAI Embedding deployment name | |
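
As an illustration, the sketch below shows how these variables might be read at startup with the defaults from the table above. The variable names and defaults come from the table; the loading code itself (plain `os.environ` lookups) is an assumption for illustration, not the server's actual implementation.

```python
# Sketch only: variable names and defaults are taken from the table above;
# the real server may load and validate them differently.
import os

# Neo4j connection (required)
neo4j_uri = os.environ.get("NEO4J_URI", "bolt://localhost:7687")
neo4j_user = os.environ.get("NEO4J_USER", "neo4j")
neo4j_password = os.environ.get("NEO4J_PASSWORD", "demodemo")

# OpenAI / LLM settings (optional; OPENAI_API_KEY is needed only for LLM operations)
model_name = os.environ.get("MODEL_NAME", "gpt-4.1-mini")
small_model_name = os.environ.get("SMALL_MODEL_NAME", "gpt-4.1-nano")
llm_temperature = float(os.environ.get("LLM_TEMPERATURE", "0.0"))
openai_api_key = os.environ.get("OPENAI_API_KEY")   # None if unset
openai_base_url = os.environ.get("OPENAI_BASE_URL") # None if unset

# Server binding and concurrency
mcp_host = os.environ.get("MCP_SERVER_HOST", "127.0.0.1")
mcp_port = int(os.environ.get("MCP_SERVER_PORT", "8000"))
semaphore_limit = int(os.environ.get("SEMAPHORE_LIMIT", "10"))
```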
Schema
Prompts
Interactive templates invoked by user choice
| Name | Description |
|---|---|
| No prompts | |
Resources
Contextual data attached and managed by the client
| Name | Description |
|---|---|
| No resources | |
Tools
Functions exposed to the LLM to take actions
| Name | Description |
|---|---|
| No tools | |