Server Configuration

Environment variables used to configure the server. All are optional and have defaults, though some become required when a particular backend or transport is selected.

| Name | Required | Description | Default |
| ---- | -------- | ----------- | ------- |
| `WAGGLE_MODEL` | No | Sentence-transformers model for local embeddings | `all-MiniLM-L6-v2` |
| `WAGGLE_BACKEND` | No | Backend database type: `sqlite` or `neo4j` | `sqlite` |
| `WAGGLE_DB_PATH` | No | Path to SQLite database file (used with `WAGGLE_BACKEND=sqlite`) | `memory.db` |
| `WAGGLE_HTTP_HOST` | No | Bind host for HTTP service (used with `WAGGLE_TRANSPORT=http`) | `0.0.0.0` |
| `WAGGLE_HTTP_PORT` | No | Bind port for HTTP service (used with `WAGGLE_TRANSPORT=http`) | `8080` |
| `WAGGLE_LOG_LEVEL` | No | Log level | `INFO` |
| `WAGGLE_NEO4J_URI` | No | Neo4j Bolt URI, e.g. `bolt://localhost:7687` (required with `WAGGLE_BACKEND=neo4j`) | |
| `WAGGLE_TRANSPORT` | No | Transport mode: `stdio` or `http` | `stdio` |
| `WAGGLE_EXPORT_DIR` | No | Optional export directory | |
| `WAGGLE_OLLAMA_URL` | No | Base URL for local Ollama (used with `WAGGLE_EXTRACT_BACKEND=llm`) | `http://localhost:11434` |
| `WAGGLE_EXTRACT_MODEL` | No | Ollama model name for extraction (used with `WAGGLE_EXTRACT_BACKEND=llm`) | `mistral` |
| `WAGGLE_NEO4J_DATABASE` | No | Neo4j database name (used with `WAGGLE_BACKEND=neo4j`) | |
| `WAGGLE_NEO4J_PASSWORD` | No | Neo4j password (required with `WAGGLE_BACKEND=neo4j`) | |
| `WAGGLE_NEO4J_USERNAME` | No | Neo4j username (required with `WAGGLE_BACKEND=neo4j`) | |
| `WAGGLE_RATE_LIMIT_RPM` | No | Global rate limit (requests per minute) | `120` |
| `WAGGLE_EXTRACT_BACKEND` | No | Extraction backend: `auto`, `llm`, or `regex` | `auto` |
| `WAGGLE_DEFAULT_TENANT_ID` | No | Default tenant ID | `local-default` |
| `WAGGLE_MAX_PAYLOAD_BYTES` | No | Max request size in bytes | `1048576` |
| `WAGGLE_WRITE_RATE_LIMIT_RPM` | No | Write-tool rate limit (requests per minute) | `60` |
| `WAGGLE_EXTRACT_MIN_CONFIDENCE` | No | Minimum confidence threshold for extraction (float 0–1) | `0.5` |
| `WAGGLE_OLLAMA_TIMEOUT_SECONDS` | No | Timeout in seconds for Ollama requests | `15` |
| `WAGGLE_MAX_CONCURRENT_REQUESTS` | No | Concurrency cap | `8` |
| `WAGGLE_REQUEST_TIMEOUT_SECONDS` | No | Per-request timeout in seconds | `30` |
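As a sketch, the variables above can be combined like this to run the server over HTTP against a Neo4j backend. The variable names and values come from the table; the actual launch command depends on how you installed the server and is not shown here.

```shell
# Select the HTTP transport and bind it to localhost only.
export WAGGLE_TRANSPORT=http
export WAGGLE_HTTP_HOST=127.0.0.1
export WAGGLE_HTTP_PORT=8080

# Use Neo4j instead of the default SQLite backend.
# URI, username, and password are all required once WAGGLE_BACKEND=neo4j.
export WAGGLE_BACKEND=neo4j
export WAGGLE_NEO4J_URI='bolt://localhost:7687'
export WAGGLE_NEO4J_USERNAME=neo4j
export WAGGLE_NEO4J_PASSWORD='change-me'   # placeholder value

# Optional: more verbose logging while testing.
export WAGGLE_LOG_LEVEL=DEBUG

# ...then start the server however your installation provides it.
```

With the defaults left untouched, the server instead runs over stdio against a local `memory.db` SQLite file, which needs no configuration at all.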

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions

No tools

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/Abhigyan-Shekhar/Waggle-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.