Provides reasoning content to MCP-enabled AI clients by interfacing with DeepSeek's API or a local Ollama server, enabling focused reasoning and visualization of the model's thought process.
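
A minimal sketch of the local-model path described above, assuming Ollama's standard `/api/generate` endpoint and a DeepSeek reasoning model that wraps its chain of thought in `<think>` tags; the model name and tag convention are illustrative assumptions, not this server's actual interface.

```typescript
// Sketch: fetch reasoning content from a local Ollama server.
// Assumes Ollama's default port and a DeepSeek reasoning model ("deepseek-r1").
interface OllamaGenerateResponse {
  response: string;
}

async function getReasoning(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "deepseek-r1", prompt, stream: false }),
  });
  const data = (await res.json()) as OllamaGenerateResponse;
  // DeepSeek reasoning models typically emit their thought process inside <think>...</think>.
  const match = data.response.match(/<think>([\s\S]*?)<\/think>/);
  return match ? match[1].trim() : data.response;
}
```
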
Facilitates knowledge graph representation with semantic search using Qdrant, with OpenAI embeddings for semantic similarity, HTTPS support, and file-based graph persistence.
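
A rough sketch of the indexing and search pattern described above, assuming the official `@qdrant/js-client-rest` and `openai` packages and an existing collection; the collection name, embedding model, and payload fields are placeholders rather than this server's actual schema.

```typescript
import { QdrantClient } from "@qdrant/js-client-rest";
import OpenAI from "openai";

// Assumes a "knowledge_graph" collection already exists with 1536-dimensional vectors.
const qdrant = new QdrantClient({ url: "https://localhost:6333" });
const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function embed(text: string): Promise<number[]> {
  const res = await openai.embeddings.create({
    model: "text-embedding-3-small",
    input: text,
  });
  return res.data[0].embedding;
}

// Index a graph node so it can later be found by semantic similarity.
async function indexNode(id: number, name: string, description: string) {
  await qdrant.upsert("knowledge_graph", {
    points: [
      {
        id,
        vector: await embed(`${name}: ${description}`),
        payload: { name, description },
      },
    ],
  });
}

// Return the graph nodes whose meaning is closest to a free-text query.
async function searchNodes(query: string, limit = 5) {
  return qdrant.search("knowledge_graph", { vector: await embed(query), limit });
}
```
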
Enables neural memory sequence learning with a memory-augmented model for improved code understanding and generation, featuring state management, novelty detection, and model persistence.
A lightweight, stateless MCP server that uses Puppeteer to perform web searches and return structured JSON results, making it easy to integrate with other MCP-enabled systems.
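
As an illustration of the kind of structured results such a server can return, the sketch below drives Puppeteer directly against a search results page; the engine URL and CSS selectors are assumptions, not necessarily what this server uses.

```typescript
import puppeteer from "puppeteer";

interface SearchResult {
  title: string;
  url: string;
  snippet: string;
}

async function webSearch(query: string): Promise<SearchResult[]> {
  // Stateless: a fresh browser per request, closed before returning.
  const browser = await puppeteer.launch({ headless: true });
  try {
    const page = await browser.newPage();
    await page.goto(`https://duckduckgo.com/html/?q=${encodeURIComponent(query)}`, {
      waitUntil: "domcontentloaded",
    });
    // Scrape each result block into a plain, JSON-serializable object.
    return await page.$$eval(".result", (nodes) =>
      nodes.map((n) => ({
        title: n.querySelector(".result__title")?.textContent?.trim() ?? "",
        url: n.querySelector<HTMLAnchorElement>(".result__a")?.href ?? "",
        snippet: n.querySelector(".result__snippet")?.textContent?.trim() ?? "",
      }))
    );
  } finally {
    await browser.close();
  }
}
```
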
A Python-based server that implements the Model Context Protocol and interfaces with Claude Desktop as an MCP client, handling interactions with efficient memory management.
An improved implementation of persistent memory using a local knowledge graph with a customizable --memory-path. This lets Claude remember information about the user across chats.
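
As a hedged example, a client such as Claude Desktop could register the server in its claude_desktop_config.json along these lines; the package name and memory file path are placeholders, not confirmed values from this project.

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "mcp-knowledge-graph", "--memory-path", "/path/to/memory.jsonl"]
    }
  }
}
```
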