
Loc Knowledge Graph Memory Server

by Myangsun
Dockerfile
FROM node:20-alpine AS builder
WORKDIR /app

# Copy package files and source code first
COPY package*.json ./
COPY tsconfig.json ./
COPY index.ts ./

# Install all dependencies (including dev dependencies for build)
# The prepare script will run automatically and build the project
RUN npm ci

FROM node:20-alpine AS release
WORKDIR /app

# Copy package files
COPY package*.json ./

# Install only production dependencies (skip prepare script since we already built)
RUN npm ci --omit=dev --ignore-scripts

# Copy built files from builder
COPY --from=builder /app/dist ./dist

ENV NODE_ENV=production

# Smithery expects port 8081
EXPOSE 8081

CMD ["node", "dist/index.js"]
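
To try the image locally, something like the following should work; the image tag loc-memory-server is just a placeholder, and the port mapping follows the EXPOSE 8081 line above.

# Build the image from the directory containing the Dockerfile (tag is a placeholder)
docker build -t loc-memory-server .
# Run it, exposing the port Smithery expects
docker run -p 8081:8081 loc-memory-server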

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Myangsun/loc-memory-server'
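
Assuming the endpoint returns JSON (not confirmed here), you can pretty-print the response by piping it through jq:

curl -s 'https://glama.ai/api/mcp/v1/servers/Myangsun/loc-memory-server' | jq '.'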

If you have feedback or need assistance with the MCP directory API, please join our Discord server.