BookStack MCP Server

by ttpears
docker-compose.override.yml.example (627 B)

```yaml
# LibreChat Integration - Add this to your LibreChat docker-compose.override.yml
# Then restart with: docker compose down && docker compose -f docker-compose.yml -f docker-compose.override.yml up -d
#
# Requirements:
# 1. Copy Dockerfile.mcp-bookstack to your LibreChat root directory
# 2. Add BookStack environment variables to your .env file
# 3. Add this service configuration to docker-compose.override.yml

services:
  bookstack-mcp:
    build:
      context: .
      dockerfile: Dockerfile.mcp-bookstack
    env_file:
      - .env
    ports:
      - "8007:8007"
    networks:
      - librechat
    restart: unless-stopped
```
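Step 2 of the requirements above asks for BookStack environment variables in LibreChat's .env file. As a rough sketch only, those entries might look like the following; the exact variable names this server reads are not stated on this page, so treat them as placeholders and confirm them against the server's README. (BookStack's REST API authenticates with a token ID/secret pair created under Edit Profile > API Tokens.)

```
# Hypothetical .env additions for the bookstack-mcp service.
# Variable names are illustrative; check the server's README for the real ones.
BOOKSTACK_URL=https://wiki.example.com         # base URL of your BookStack instance
BOOKSTACK_API_TOKEN_ID=your-token-id           # BookStack API token ID
BOOKSTACK_API_TOKEN_SECRET=your-token-secret   # matching API token secret
```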

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/ttpears/bookstack-mcp'
```
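The same lookup can be scripted. A minimal Python sketch, assuming only the URL pattern shown in the curl example above (the response schema is whatever the API returns and is not documented on this page):

```python
import json
import urllib.request

# Base URL taken from the curl example above.
API_BASE = "https://glama.ai/api/mcp/v1/servers"


def server_url(owner: str, name: str) -> str:
    """Build the directory API URL for a given MCP server."""
    return f"{API_BASE}/{owner}/{name}"


def fetch_server_info(owner: str, name: str) -> dict:
    """Fetch a server's directory entry and decode it as JSON.

    Requires network access; the shape of the returned dict is
    whatever the API responds with.
    """
    with urllib.request.urlopen(server_url(owner, name)) as resp:
        return json.load(resp)


# Example: the BookStack MCP server described on this page.
# info = fetch_server_info("ttpears", "bookstack-mcp")
```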

If you have feedback or need assistance with the MCP directory API, please join our Discord server.