Project Synapse

.env.example • 1.07 kB
# Project Synapse MCP Server Environment Configuration

# Neo4j Database Configuration
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=synapse_password
NEO4J_DATABASE=synapse

# AI Model Configuration
ANTHROPIC_API_KEY=your_anthropic_key_here
OPENAI_API_KEY=your_openai_key_here
OLLAMA_BASE_URL=http://localhost:11434

# Embedding Model Settings
EMBEDDING_MODEL=sentence-transformers/all-mpnet-base-v2
EMBEDDING_DIMENSION=768

# Zettelkasten Configuration
ZETTEL_ID_FORMAT=timestamp
LINK_THRESHOLD=0.7
INSIGHT_CONFIDENCE_THRESHOLD=0.8

# Knowledge Processing Settings
MAX_CONCURRENT_PROCESSING=3
BATCH_SIZE=100
CACHE_TTL=3600

# Logging Configuration
LOG_LEVEL=INFO
LOG_TO_FILE=false
LOG_FILE_PATH=./data/synapse.log

# MCP Server Settings
MCP_SERVER_NAME=project-synapse
MCP_SERVER_VERSION=0.1.0
MCP_MAX_CONNECTIONS=10

# Data Persistence
KNOWLEDGE_GRAPH_PATH=./data/knowledge_graph
CACHE_PATH=./data/cache
BACKUP_ENABLED=true
BACKUP_INTERVAL_HOURS=24

# Performance Tuning
SEMANTIC_BATCH_SIZE=50
GRAPH_QUERY_TIMEOUT=30
PATTERN_DETECTION_INTERVAL=300
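
As a rough illustration of how a server might consume these settings, the sketch below loads the file with python-dotenv, connects to Neo4j with the values above, and casts one of the thresholds. The variable names match .env.example, but the loading code itself is an assumption for illustration, not the project's actual implementation.

import os

from dotenv import load_dotenv
from neo4j import GraphDatabase

load_dotenv()  # reads .env from the current directory

# Connect to Neo4j using the values defined in .env.example
driver = GraphDatabase.driver(
    os.getenv("NEO4J_URI", "bolt://localhost:7687"),
    auth=(os.getenv("NEO4J_USER", "neo4j"), os.getenv("NEO4J_PASSWORD", "")),
)

# Environment values are strings, so numeric settings need explicit casts
link_threshold = float(os.getenv("LINK_THRESHOLD", "0.7"))

with driver.session(database=os.getenv("NEO4J_DATABASE", "synapse")) as session:
    count = session.run("MATCH (n) RETURN count(n) AS c").single()["c"]
    print(f"Connected; graph currently holds {count} nodes")

driver.close()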

MCP directory API

We provide all the information about MCP servers through our MCP directory API:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/angrysky56/project-synapse-mcp'
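
The same request can be made programmatically; a minimal Python sketch using the requests package is shown below. The response schema is not documented on this page, so the JSON is simply printed as returned.

import requests

# Fetch the directory entry for this server and print the raw JSON response
resp = requests.get(
    "https://glama.ai/api/mcp/v1/servers/angrysky56/project-synapse-mcp",
    timeout=10,
)
resp.raise_for_status()
print(resp.json())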

If you have feedback or need assistance with the MCP directory API, please join our Discord server.