MCP-ORTools integrates Google's OR-Tools constraint programming solver with Large Language Models through the Model Context Protocol (MCP), enabling AI models to:
Submit and validate constraint models
Set model parameters
Solve constraint satisfaction and optimization problems
Retrieve and analyze solutions
A persistent memory and context management system for AI CLI tools that utilizes a three-layer architecture and semantic search to prevent context loss between sessions. It provides time-aware orientation and smart memory routing to help AI agents maintain project knowledge and architectural decisions.
An open-source memory layer that provides persistent project context and architectural history for AI development tools across multiple platforms and sessions. It enables AI assistants to maintain a shared understanding of codebases while integrating directly with services like Notion for documentation management.
Claude Context is an MCP plugin that adds semantic code search to Claude Code and other AI coding agents, giving them deep context from your entire codebase.
A local Model Context Protocol (MCP) server that exposes custom tools to Claude Desktop, enabling direct interaction with your local environment. It provides a framework for building and integrating custom TypeScript tools into the Claude interface.
An MCP server that provides persistent, cross-session memory and team knowledge sharing for AI development workflows. It enables project DNA scanning, semantic search, context budgeting, and git-aware indexing to prevent AI context loss between sessions.
An MCP server that preserves LLM context by intercepting large data outputs and returning only concise summaries or relevant sections. It enables efficient sandboxed code execution, file processing, and documentation indexing across multiple programming languages and authenticated CLIs.
An MCP (Model Context Protocol) server implementation using HTTP SSE (Server-Sent Events) connections with built-in utility tools including echo, time, calculator, and weather query functionality.
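Independent of the SSE transport, built-in tools like these reduce to a name-to-handler registry. The sketch below is conceptual, not the server's actual code; the safe-arithmetic helper is an assumption about how a calculator tool might avoid `eval`.

```python
# Conceptual sketch of an echo/time/calculator tool registry.
# The real server dispatches these over MCP via HTTP SSE (omitted here).
import ast
import datetime
import operator as op

_OPS = {ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul, ast.Div: op.truediv}

def _calc(expr: str):
    """Safely evaluate a basic arithmetic expression via the AST."""
    def ev(node):
        if isinstance(node, ast.Expression):
            return ev(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](ev(node.left), ev(node.right))
        raise ValueError("unsupported expression")
    return ev(ast.parse(expr, mode="eval"))

TOOLS = {
    "echo": lambda text: text,
    "time": lambda: datetime.datetime.now().isoformat(),
    "calculator": _calc,
}

def call_tool(name, *args):
    return TOOLS[name](*args)

print(call_tool("calculator", "2 * (3 + 4)"))  # 14
```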
A local MCP server for RAG memory, semantic search, and context optimization using Ollama and SQLite. It serves as a central hub that manages document embeddings, text compression, and proxies calls to other sub-MCP servers.
Provides tools for AI-powered graph analysis, including relationship extraction, adjacency matrix creation, and network centrality calculations. It enables users to perform complex structural analysis and generate interactive D3.js visualizations from structured data.
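One of the centrality calculations mentioned above can be sketched in a few lines; the edge list and normalization here are illustrative, not the server's implementation.

```python
# Sketch of degree centrality computed from an undirected edge list,
# one of the network centrality measures the entry describes.
from collections import defaultdict

def degree_centrality(edges):
    """Degree of each node, normalized by the maximum possible degree."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    n = len(adj)
    return {node: len(nbrs) / (n - 1) for node, nbrs in adj.items()}

edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]
print(degree_centrality(edges))  # C touches every other node: centrality 1.0
```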
A modular multi-server architecture providing development automation, JIRA management, and performance reporting for Claude Code. It features specialized tools for PR health analysis, code reviews, and generating comprehensive team and individual quarterly reports.
Provides AI assistants with real-time visibility into your codebase's internal libraries, team patterns, naming conventions, and usage frequencies to generate code that matches your team's actual practices.
A high-performance MCP server providing up-to-date documentation for Go, npm, Python, Rust, Docker, Kubernetes, Terraform, and more — fetched from official sources, not training data.
Provides persistent context management for AI agents by storing and querying semantic information using Upstash Vector DB and Google AI embeddings. It enables semantic search, batch operations, and metadata filtering to help agents retrieve relevant stored knowledge.
Long AI conversations fail in predictable ways. Context-First fixes all four:

| Failure Mode | What Goes Wrong | Context-First Solution |
| --- | --- | --- |
| Context Drift | AI forgets earlier decisions and intent as the conversation grows | context_loop + detect_drift continuously re-anchor every turn |
| Silent Contradiction | New inputs silently overrule established facts; the AI doesn't notice | detect_conflicts compares every input |
Enables LLM assistants to store, retrieve, and update user-specific context memory including travel preferences and general information through a chat interface. Provides analytics on tool usage patterns and token costs for continuous improvement.
A server that provides advanced mathematical and financial calculation capabilities for AI code assistants, enabling them to perform complex calculations like symbolic calculus, numerical methods, and financial analysis without implementing algorithms directly.
An MCP server that enables processing of massive datasets up to 10M+ tokens using a recursive language model pattern for strategic chunking and analysis. It automates sub-queries and result aggregation using free local inference via Ollama or the Claude API to handle context beyond standard prompt limits.
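The recursive pattern the entry describes (strategic chunking, sub-queries, aggregation) can be sketched as below; `ask_llm` is a hypothetical stand-in for a call to Ollama or the Claude API, and the chunk size is illustrative.

```python
# Sketch of the recursive chunk / sub-query / aggregate loop.
def ask_llm(prompt: str, text: str) -> str:
    # Placeholder: a real implementation would call a local or remote model.
    return text[:80]

def recursive_query(prompt, text, chunk_size=1000):
    """Split oversized input, query each chunk, then synthesize the answers."""
    if len(text) <= chunk_size:
        return ask_llm(prompt, text)
    chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    partials = [recursive_query(prompt, c, chunk_size) for c in chunks]
    # Aggregate: recurse over the concatenated partial answers.
    return recursive_query(prompt, "\n".join(partials), chunk_size)
```

Because the aggregation step feeds partial answers back through the same loop, inputs far beyond any single context window reduce to one final synthesis call.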
Enables high-precision detection, anonymization, encryption and decryption of personally identifiable information (PII) in text using GPT-4o-based detection and advanced cryptographic methods. Supports both deterministic encryption for searchable data and format-preserving encryption for structured identifiers.
A comprehensive toolkit for Ethereum blockchain analysis within Claude AI, enabling contract auditing, wallet analysis, profitability tracking, and on-chain data retrieval.
A CloudFlare Workers-based MCP server that provides semantic memory and journal capabilities with vector search. Enables users to store, search, and retrieve memories and journal entries using AI-powered semantic similarity without any local setup required.
Provides semantic code search capabilities that run 100% locally using EmbeddingGemma embeddings. Enables finding code by meaning across 15 file extensions and 9+ programming languages without API costs or sending code to the cloud.
This server provides an API to query Large Language Models using context from local files, supporting various models and file types for context-aware responses.
Enables semantic search of project documentation using hybrid vector and full-text search with fast and deep query modes for immediate results or complex multi-round synthesis.
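Hybrid search of this kind typically blends a vector-similarity score with a keyword score; a minimal sketch, where the embeddings and the 0.7/0.3 weighting are assumptions for illustration:

```python
# Sketch of hybrid scoring: weighted blend of cosine similarity
# (vector search) and keyword overlap (full-text search).
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def keyword_score(query, doc):
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q)

def hybrid_score(query, doc, q_vec, d_vec, alpha=0.7):
    return alpha * cosine(q_vec, d_vec) + (1 - alpha) * keyword_score(query, doc)

score = hybrid_score("vector search", "hybrid vector and full-text search",
                     [1.0, 0.0], [0.8, 0.6])
```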
Provides persistent tool context that survives across Claude Desktop chat sessions, automatically injecting tool-specific rules, syntax preferences, and best practices. Eliminates the need to re-establish context in each new conversation.
An MCP server that enables users to read, search, and analyze PDF documents. It provides tools for extracting text, viewing metadata, searching content with context, and generating word statistics.
A Model Context Protocol server that provides web content fetching capabilities with robots.txt checking removed, allowing LLMs to retrieve and convert web content to markdown.
A static MCP server that helps AI models maintain tool context across chat sessions, preventing loss of important information and keeping conversations smooth and uninterrupted.
Enables AI to automatically search, retrieve, and organize your Cursor chat history across sessions. Supports tagging, nicknames, project-scoped search, and full-text search to maintain context between conversations.
Provides a suite of regex and text processing tools for AI agents, including pattern testing, extraction, and replacement with capture group support. It also enables various text transformations like case conversion, line sorting, and deduplication through the Model Context Protocol.
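The extraction and capture-group replacement tools described above map directly onto Python's stdlib `re` module; the patterns below are illustrative examples, not the server's own.

```python
# Sketch of regex extraction and capture-group replacement.
import re

text = "Alice <alice@example.com>, Bob <bob@example.org>"

# Extraction: pull out every email address between angle brackets.
emails = re.findall(r"<([^>]+)>", text)

# Replacement with capture groups: swap user and domain.
masked = re.sub(r"(\w+)@([\w.]+)", r"\2/\1", "alice@example.com")
print(emails, masked)
```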
A local Model Context Protocol server designed to share contextual information between an AI and a user. It primarily provides a tool to retrieve the current date and time in ISO 8601 format based on the server's local timezone.
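What such a tool returns can be reproduced in one stdlib call; a minimal sketch, assuming the server simply formats local time with its UTC offset:

```python
# Current local time in ISO 8601 format, including the UTC offset.
from datetime import datetime

def current_time_iso() -> str:
    return datetime.now().astimezone().isoformat()

print(current_time_iso())  # e.g. 2025-01-01T12:00:00.000000+01:00
```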
Image Tools MCP is a Model Context Protocol (MCP) service that retrieves image dimensions and compresses images from URLs and local files using the TinyPNG API. It supports converting images to formats like webp, jpeg/jpg, and png, providing detailed information on width, height, type, and compression.
Enables AI-driven semantic code search using Windsurf's reverse-engineered SWE-grep protocol to query local codebases with natural language. It executes local search tools like ripgrep and tree-node-cli to return relevant file paths and line ranges to MCP-compatible clients.
Enables cross-window context sharing in VS Code by persisting Copilot conversations to SQLite, allowing code discussions from one repository to be accessible in other VS Code windows through smart entity matching and search.
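Persisting turns to SQLite and searching them across windows can be sketched as below; the schema, example rows, and LIKE-based search are illustrative assumptions, not the extension's actual design.

```python
# Sketch of persisting conversation turns to SQLite and searching them.
import sqlite3

conn = sqlite3.connect(":memory:")  # a real server would use a file on disk
conn.execute("CREATE TABLE turns (repo TEXT, role TEXT, content TEXT)")
turns = [
    ("repo-a", "user", "How does the auth middleware work?"),
    ("repo-a", "assistant", "It validates the JWT before each request."),
]
conn.executemany("INSERT INTO turns VALUES (?, ?, ?)", turns)

def search(term):
    """Find turns whose content mentions `term`, from any repository."""
    cur = conn.execute(
        "SELECT repo, content FROM turns WHERE content LIKE ?", (f"%{term}%",))
    return cur.fetchall()

print(search("JWT"))
```

A file-backed database makes the same rows visible to every VS Code window that opens it, which is the crux of the cross-window sharing the entry describes.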
A local-first, agent-agnostic MCP server that provides semantic search, persistent memory, and automated code review capabilities for development workflows. It leverages the Auggie SDK to offer advanced tools for codebase indexing, implementation planning, and deterministic static analysis.
Enables managing personal information with dynamic topic-based organization (tasks, meetings, contacts, etc.), supporting optional OTP authentication and AES-256 encryption for sensitive data with automatic backups.
Enables access to Brazil's Central Bank (BCB) Open Data API for payment methods, allowing queries about PIX, card transactions, ATM terminals, interchange rates, and other payment statistics through natural language.
An MCP server implementing Recursive Language Models (RLM) to process arbitrarily large contexts through a programmatic probe, recurse, and synthesize loop. It enables LLMs to perform multi-step investigations and evidence-backed extraction across massive file sets without being limited by standard context windows.