
Neo4j Agent Memory MCP Server

by knowall-ai
Dockerfile

# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile

# Use a Node.js image as the base for the build stage
FROM node:22.12-alpine AS builder

# Set the working directory inside the container
WORKDIR /app

# Copy package.json and package-lock.json to the container
COPY package.json package-lock.json ./

# Copy the source files needed for the build
COPY src ./src
COPY tsconfig.json ./

# Install dependencies (this will also run the prepare script)
RUN --mount=type=cache,target=/root/.npm npm install

# Build the application
RUN npm run build

# Use a lightweight Node.js image for the final stage
FROM node:22-alpine

# Set the working directory inside the container
WORKDIR /app

# Copy the build output and node_modules from the builder stage
COPY --from=builder /app/build ./build
COPY --from=builder /app/node_modules ./node_modules

# Don't set default environment variables, to ensure proper tool discovery;
# they should be provided at runtime instead

# Specify the command to run the application
ENTRYPOINT ["node", "build/index.js"]
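Since the image deliberately sets no default environment variables, connection details must be supplied when the container is started. A minimal sketch of building and running it might look like the following; the image tag and the Neo4j variable names (`NEO4J_URI`, `NEO4J_USERNAME`, `NEO4J_PASSWORD`) are assumptions for illustration, not confirmed by this page — check the server's README for the exact names it reads.

```shell
# Build the image from the Dockerfile above
# (image tag is arbitrary; chosen here for illustration)
docker build -t mcp-neo4j-agent-memory .

# Run the server, passing Neo4j connection settings at runtime.
# Variable names below are assumed, not taken from this page.
docker run --rm -i \
  -e NEO4J_URI='bolt://host.docker.internal:7687' \
  -e NEO4J_USERNAME='neo4j' \
  -e NEO4J_PASSWORD='your-password' \
  mcp-neo4j-agent-memory
```

The `-i` flag keeps stdin open, which matters because MCP servers launched this way typically communicate over stdio.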

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/knowall-ai/mcp-neo4j-agent-memory'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.