server.json
{ "$schema": "https://static.modelcontextprotocol.io/schemas/2025-10-17/server.schema.json", "name": "me.pi22by7/in-memoria", "description": "Persistent codebase intelligence that gives AI assistants memory across sessions", "version": "0.6.0", "homepage": "https://github.com/pi22by7/in-memoria#readme", "license": "MIT", "packages": [ { "registryType": "npm", "identifier": "in-memoria", "version": "0.6.0", "transport": { "type": "stdio" }, "environmentVariables": [ { "name": "OPENAI_API_KEY", "description": "Optional OpenAI API key for embeddings (falls back to local transformers.js)", "isRequired": false, "isSecret": true } ] } ] }

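Because the package is published on npm and uses stdio transport, any MCP client that can spawn a local process can run it. Below is a minimal sketch of a client configuration in Claude Desktop's claude_desktop_config.json format, used here for illustration; the "server" CLI argument and the placeholder key are assumptions (check the project README for the exact invocation), and OPENAI_API_KEY can be omitted entirely to fall back to local transformers.js embeddings.

{
  "mcpServers": {
    "in-memoria": {
      "command": "npx",
      "args": ["-y", "in-memoria", "server"],
      "env": {
        "OPENAI_API_KEY": "<optional-openai-key>"
      }
    }
  }
}
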
MCP directory API

We provide all the information about MCP servers via our MCP directory API. For example, you can fetch this server's listing:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/pi22by7/In-Memoria'
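
The response is JSON, so it can be piped through standard tooling. A minimal sketch, assuming jq is installed, that fetches the same endpoint and pretty-prints the listing without assuming anything about its field names:

# Fetch this server's directory entry and pretty-print the JSON response
curl -s 'https://glama.ai/api/mcp/v1/servers/pi22by7/In-Memoria' | jq .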

If you have feedback or need assistance with the MCP directory API, please join our Discord server.