Facilitates knowledge graph representation with semantic search backed by Qdrant, using OpenAI embeddings for semantic similarity, and offers HTTPS integration and file-based graph persistence.
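A minimal sketch of the embed-then-search pattern such a server might implement, using the official `openai` and `qdrant-client` Python packages; the endpoint URL, collection name (`entities`), and payload fields below are illustrative assumptions, not the server's actual schema.

```python
# Sketch: embed a query with OpenAI, then run semantic search in Qdrant.
# The collection name and payload layout are assumptions for illustration.
from openai import OpenAI
from qdrant_client import QdrantClient

openai_client = OpenAI()                                   # reads OPENAI_API_KEY
qdrant = QdrantClient(url="https://qdrant.example.com:6333")  # placeholder HTTPS endpoint

def semantic_search(query: str, limit: int = 5):
    # Turn the query into an embedding vector.
    vector = openai_client.embeddings.create(
        model="text-embedding-3-small",
        input=query,
    ).data[0].embedding
    # Return the closest graph entities by vector similarity.
    return qdrant.search(
        collection_name="entities",
        query_vector=vector,
        limit=limit,
    )

for hit in semantic_search("authentication service"):
    print(hit.score, hit.payload)
```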
Enables neural memory sequence learning with a memory-augmented model for improved code understanding and generation, featuring state management, novelty detection, and model persistence.
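The entry does not describe the model's architecture, so the following is only a heavily simplified PyTorch sketch of what "memory-augmented with novelty detection and persistence" can look like: every class, dimension, and the novelty heuristic (distance from the nearest stored memory) are placeholder assumptions.

```python
# Toy memory-augmented sequence model: a GRU encoder plus an external memory bank.
# Novelty is approximated as 1 - max cosine similarity to stored memories;
# state_dict persistence covers both the weights and the memory buffer.
import torch
import torch.nn as nn

class MemoryAugmentedModel(nn.Module):
    def __init__(self, dim: int = 128, slots: int = 256):
        super().__init__()
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.memory = nn.Parameter(torch.randn(slots, dim), requires_grad=False)

    def forward(self, x):
        _, h = self.encoder(x)        # h: (1, batch, dim)
        h = h.squeeze(0)
        sims = torch.nn.functional.cosine_similarity(
            h.unsqueeze(1), self.memory.unsqueeze(0), dim=-1)
        novelty = 1.0 - sims.max(dim=1).values   # high when far from all memories
        return h, novelty

model = MemoryAugmentedModel()
state, novelty = model(torch.randn(2, 10, 128))
torch.save(model.state_dict(), "memory_model.pt")   # model persistence
```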
A lightweight, stateless MCP server that uses Puppeteer to run web searches and return structured JSON results, making it easy to integrate with other MCP-enabled systems.
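The server itself is built on Puppeteer (Node.js); purely to illustrate the stateless search-and-return-JSON pattern, here is a Python sketch with Playwright standing in for Puppeteer. The DuckDuckGo HTML endpoint and its result selector are assumptions and may differ from what the real server scrapes.

```python
# Illustrative stateless search: launch a browser, scrape results, return JSON.
# Playwright stands in for Puppeteer; the "a.result__a" selector is an assumption.
import json
from urllib.parse import quote_plus
from playwright.sync_api import sync_playwright

def web_search(query: str, limit: int = 5) -> str:
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(f"https://duckduckgo.com/html/?q={quote_plus(query)}")
        links = page.query_selector_all("a.result__a")[:limit]
        results = [{"title": a.inner_text(), "url": a.get_attribute("href")}
                   for a in links]
        browser.close()
    # No state is kept between calls; each invocation returns plain JSON.
    return json.dumps(results)

print(web_search("model context protocol"))
```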
Provides memory/knowledge graph storage backed by Supabase, enabling multiple Claude instances to safely share and maintain a knowledge graph, with entity storage, concurrent-access safety, and full-text search.
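A sketch of how entity storage and full-text search could look through the `supabase-py` client, assuming a hypothetical `entities` table and its columns, and assuming the query builder's `text_search` helper; the real server's schema and concurrency handling are not specified here.

```python
# Sketch: store and look up knowledge-graph entities via Supabase.
# Table name ("entities") and columns are illustrative assumptions.
import os
from supabase import create_client

supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])

# Insert an entity; Postgres arbitrates concurrent writers from multiple clients.
supabase.table("entities").insert(
    {"name": "auth-service", "entity_type": "service",
     "observations": ["uses JWT", "deployed behind a load balancer"]}  # assumes an array column
).execute()

# Full-text search over the name column (Postgres text search under the hood).
rows = supabase.table("entities").select("*").text_search("name", "auth").execute()
print(rows.data)
```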
Facilitates the invocation of AI models from providers such as Anthropic, OpenAI, and Groq, enabling users to manage and configure large language model interactions.
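To illustrate what a multi-provider invocation layer has to normalize, here is a minimal sketch using the official `anthropic`, `openai`, and `groq` Python clients; the model names are examples only, and the server's own routing and configuration logic is not shown.

```python
# Sketch: one chat() entry point routed to three provider SDKs.
# API keys are read from the environment; model names are examples only.
from anthropic import Anthropic
from openai import OpenAI
from groq import Groq

def chat(provider: str, prompt: str) -> str:
    if provider == "anthropic":
        msg = Anthropic().messages.create(
            model="claude-3-5-sonnet-20241022", max_tokens=512,
            messages=[{"role": "user", "content": prompt}])
        return msg.content[0].text
    if provider == "openai":
        resp = OpenAI().chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}])
        return resp.choices[0].message.content
    if provider == "groq":
        resp = Groq().chat.completions.create(
            model="llama-3.1-8b-instant",
            messages=[{"role": "user", "content": prompt}])
        return resp.choices[0].message.content
    raise ValueError(f"unknown provider: {provider}")

print(chat("openai", "Summarize the Model Context Protocol in one sentence."))
```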
A Python-based server that implements the Model Context Protocol to interface with Claude Desktop as an MCP client, handling interactions with efficient memory management.
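As a sketch of the pattern such a server follows, here is a minimal MCP server using the official Python SDK's FastMCP helper, exposing two illustrative memory tools over stdio so Claude Desktop can connect as a client; the tool names and the in-process dict are assumptions, not the server's actual storage.

```python
# Minimal MCP server sketch using the official Python SDK (FastMCP).
# The remember/recall tools and the in-memory dict are illustrative only.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("memory-demo")
_store: dict[str, str] = {}

@mcp.tool()
def remember(key: str, value: str) -> str:
    """Store a value under a key."""
    _store[key] = value
    return f"stored {key}"

@mcp.tool()
def recall(key: str) -> str:
    """Retrieve a previously stored value."""
    return _store.get(key, "not found")

if __name__ == "__main__":
    mcp.run()  # stdio transport by default, which is what Claude Desktop expects
```

Claude Desktop is then pointed at the script through the `mcpServers` section of its `claude_desktop_config.json`.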