# remember_consult
Store consult results from Ollama models in local memory for future reference and retrieval.
## Instructions
Store the result of a consult into a local memory store (or configured memory service). If `response` is omitted, the handler generates one by sending the `prompt` to the given Ollama `model`; the result is written as a timestamped JSON file under `MEMORY_DIR` (default `/tmp/mcp-consult-memory`).
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| key | No | Optional label for the stored memory; persisted as `null` when omitted. | |
| prompt | Yes | The consult prompt to remember (also used to generate a response when one is not supplied). | |
| model | No | Ollama model used to generate the response when `response` is omitted. | |
| response | No | The consult response to store; if omitted, it is generated by calling `model`. | |
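
For illustration, a call to this tool might pass arguments like the following (the key, prompt, and model values are hypothetical). Because `response` is omitted and a `model` is given, the handler generates the response via Ollama before saving:

```json
{
  "name": "remember_consult",
  "arguments": {
    "key": "rust-borrow-checker-notes",
    "prompt": "Explain the Rust borrow checker in two paragraphs.",
    "model": "mistral"
  }
}
```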
## Implementation Reference
- src/handlers.ts:271-344 (handler): The core handler for the `remember_consult` tool. If `response` is omitted, it generates one by calling the specified Ollama model; it then persists the optional `key`, the `prompt`, the optional `model`, and the `response` as a timestamped JSON file under `MEMORY_DIR` (`/tmp/mcp-consult-memory` by default) and returns a confirmation message with a `file://` URI resource. A sample of the persisted file is shown after this list.

  ```typescript
  case "remember_consult": {
    const key = args?.key as string | undefined;
    const prompt = args?.prompt as string | undefined;
    const model = args?.model as string | undefined;
    let responseText = args?.response as string | undefined;

    if (!prompt) {
      return {
        content: [
          {
            type: "text",
            text: "Missing required argument: prompt",
          },
        ],
        isError: true,
      };
    }

    if (!responseText) {
      // generate using Ollama if a model was provided
      if (!model) {
        return {
          content: [
            { type: "text", text: "Missing 'response' and no 'model' provided to generate it." },
          ],
          isError: true,
        };
      }
      try {
        const gen = await axios.post(`${OLLAMA_BASE_URL}/api/generate`, {
          model,
          prompt,
          stream: false,
        });
        responseText = gen.data.response;
      } catch (e) {
        const message = e instanceof Error ? e.message : String(e);
        return {
          content: [{ type: "text", text: `Failed to generate response: ${message}` }],
          isError: true,
        };
      }
    }

    // Persist to a simple local memory directory by default. Can be overridden with MEMORY_DIR.
    const memoryDir = process.env.MEMORY_DIR || path.join("/tmp", "mcp-consult-memory");
    try {
      await fs.mkdir(memoryDir, { recursive: true });
      const id = `${Date.now()}-${Math.random().toString(36).slice(2, 9)}`;
      const filePath = path.join(memoryDir, `observation-${id}.json`);
      const payload = {
        key: key || null,
        prompt,
        model: model || null,
        response: responseText,
        _meta: { createdAt: new Date().toISOString() },
      };
      await fs.writeFile(filePath, JSON.stringify(payload, null, 2), "utf-8");
      return {
        content: [
          {
            type: "text",
            text: `Saved consult to ${filePath}`,
          },
          {
            type: "resource",
            resource: {
              uri: `file://${filePath}`,
              text: responseText,
            },
          },
        ],
      };
    } catch (err) {
      const message = err instanceof Error ? err.message : String(err);
      return { content: [{ type: "text", text: `Failed to save memory: ${message}` }], isError: true };
    }
  }
  ```
- src/handlers.ts:126-139 (schema): Input schema definition for the `remember_consult` tool, declaring `key`, `prompt` (required), `model`, and `response` as string properties.

  ```typescript
  {
    name: "remember_consult",
    description: "Store the result of a consult into a local memory store (or configured memory service).",
    inputSchema: {
      type: "object",
      properties: {
        key: { type: "string" },
        prompt: { type: "string" },
        model: { type: "string" },
        response: { type: "string" },
      },
      required: ["prompt"],
    },
  },
  ```
- src/handlers.ts:90-142 (registration): The `listTools()` function registers the `remember_consult` tool (alongside the server's other tools) by returning its metadata and input schema for MCP tool discovery.

  ```typescript
  export function listTools() {
    return {
      tools: [
        {
          name: "consult_ollama",
          description: "Consult an Ollama model with a prompt and get its response for reasoning from another viewpoint. If the requested model is unavailable locally, automatically falls back to: cloud models (deepseek-v3.1:671b-cloud, kimi-k2-thinking:cloud) or local alternatives (mistral, llama2). Never fails on model availability.",
          inputSchema: {
            type: "object",
            properties: {
              model: { type: "string" },
              prompt: { type: "string" },
              system_prompt: { type: "string" },
            },
            required: ["model", "prompt"],
          },
        },
        {
          name: "list_ollama_models",
          description: "List all available Ollama models on the local instance.",
          inputSchema: { type: "object", properties: {} },
        },
        {
          name: "compare_ollama_models",
          description: "Run the same prompt against multiple Ollama models and return their outputs side-by-side for comparison. Requested models that are unavailable automatically fall back to cloud models or local alternatives. Handles unavailable models gracefully without breaking the comparison.",
          inputSchema: {
            type: "object",
            properties: {
              models: { type: "array", items: { type: "string" } },
              prompt: { type: "string" },
              system_prompt: { type: "string" },
            },
            required: ["prompt"],
          },
        },
        {
          name: "remember_consult",
          description: "Store the result of a consult into a local memory store (or configured memory service).",
          inputSchema: {
            type: "object",
            properties: {
              key: { type: "string" },
              prompt: { type: "string" },
              model: { type: "string" },
              response: { type: "string" },
            },
            required: ["prompt"],
          },
        },
      ],
    };
  }
  ```
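
The persisted file mirrors the `payload` object built in the handler. A saved `observation-<id>.json` might look like this (all values are hypothetical):

```json
{
  "key": "rust-borrow-checker-notes",
  "prompt": "Explain the Rust borrow checker in two paragraphs.",
  "model": "mistral",
  "response": "The borrow checker enforces ownership and lifetime rules at compile time...",
  "_meta": {
    "createdAt": "2024-05-01T12:34:56.789Z"
  }
}
```

The handler only writes memories; reading them back is left to the client. The sketch below scans the memory directory and parses each stored consult. It is not part of src/handlers.ts and only assumes the payload shape shown above:

```typescript
import { promises as fs } from "fs";
import * as path from "path";

// Shape of the payload written by the remember_consult handler.
interface ConsultMemory {
  key: string | null;
  prompt: string;
  model: string | null;
  response: string;
  _meta: { createdAt: string };
}

// Load every stored consult from the memory directory, newest first.
// Hypothetical helper; the MCP server itself exposes no retrieval tool.
async function loadConsultMemories(
  memoryDir = process.env.MEMORY_DIR || path.join("/tmp", "mcp-consult-memory")
): Promise<ConsultMemory[]> {
  const files = await fs.readdir(memoryDir);
  const memories: ConsultMemory[] = [];
  for (const file of files) {
    // The handler names files observation-<timestamp>-<random>.json.
    if (!file.startsWith("observation-") || !file.endsWith(".json")) continue;
    const raw = await fs.readFile(path.join(memoryDir, file), "utf-8");
    memories.push(JSON.parse(raw) as ConsultMemory);
  }
  // ISO-8601 timestamps sort lexicographically, so this orders newest first.
  return memories.sort((a, b) => b._meta.createdAt.localeCompare(a._meta.createdAt));
}
```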