Smart-Thinking is a sophisticated MCP server designed for advanced, graph-based reasoning and verification. It offers:

- Multi-dimensional thought graphs for nuanced reasoning and analysis
- A robust verification system with 8 distinct statuses for verifying facts and detecting contradictions
- Interactive visualizations in multiple layouts (standard, chronological, thematic, hierarchical, force, radial)
- Persistent data storage across sessions via session IDs
- Cross-platform compatibility on Windows, macOS, and Linux
- Integration with various MCP clients, and use as a Node.js module
- User personalization that adapts reasoning to preferences and history
- Collaborative multi-agent systems for team-based reasoning
- Multiple thought types (regular, revision, meta, hypothesis, conclusion)
- Auto-learning mechanisms that improve reasoning over time
- An API for Node.js applications to use Smart-Thinking's reasoning capabilities programmatically, with functions for thought processing and verification
- Distribution as an npm package (`smart-thinking-mcp`) for easy installation into JavaScript/TypeScript projects
- A TypeScript 5.1.6 codebase, providing type-safe integration for consuming applications
# Smart-Thinking
Smart-Thinking is a Model Context Protocol (MCP) server that delivers graph-based, multi-step reasoning without relying on external AI APIs. Everything happens locally: similarity search, heuristic-based scoring, verification tracking, memory, and visualization all run in a deterministic pipeline designed for transparency and reproducibility.
## Core Capabilities
- Graph-first reasoning that connects thoughts with rich relationships (supports, contradicts, refines, contextual links, and more).
- Local TF-IDF + cosine similarity engine powering memory lookups and graph expansion without third-party embedding services.
- Heuristic quality evaluation that scores confidence, relevance, and quality using transparent rules instead of LLM calls.
- Verification workflow with detailed statuses and calculation tracing to surface facts, guardrails, and uncertainties.
- Persistent sessions that can be resumed across runs, keeping both the reasoning graph and verification ledger in sync.
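The similarity engine mentioned above can be approximated in a few lines. The sketch below shows the general technique (plain TF-IDF weighting over whitespace tokens, then cosine similarity) and is not Smart-Thinking's actual implementation; all function names here are illustrative.

```typescript
// Sketch of local TF-IDF + cosine similarity, with no external embedding service.

function termFreqs(doc: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const term of doc.toLowerCase().split(/\W+/).filter(Boolean)) {
    counts.set(term, (counts.get(term) ?? 0) + 1);
  }
  return counts;
}

function tfidfVectors(docs: string[]): Map<string, number>[] {
  const tfs = docs.map(termFreqs);
  const df = new Map<string, number>(); // document frequency per term
  for (const tf of tfs) {
    for (const term of tf.keys()) df.set(term, (df.get(term) ?? 0) + 1);
  }
  return tfs.map(tf => {
    const vec = new Map<string, number>();
    for (const [term, f] of tf) {
      vec.set(term, f * Math.log(docs.length / (df.get(term) ?? 1)));
    }
    return vec;
  });
}

function cosine(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0, na = 0, nb = 0;
  for (const [term, w] of a) { dot += w * (b.get(term) ?? 0); na += w * w; }
  for (const w of b.values()) nb += w * w;
  return na && nb ? dot / (Math.sqrt(na) * Math.sqrt(nb)) : 0;
}

const docs = [
  "graph based reasoning over connected thoughts",
  "reasoning graph connecting related thoughts",
  "persistent session storage on disk",
];
const [v0, v1, v2] = tfidfVectors(docs);
const simRelated = cosine(v0, v1);
const simUnrelated = cosine(v0, v2);
console.log(simRelated > simUnrelated); // related thoughts score higher
```

Because everything is deterministic term arithmetic, the same inputs always produce the same rankings, which is what makes the memory lookups reproducible.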
## Reasoning Flow
1. **Session bootstrap** – `ReasoningOrchestrator` initializes a session, restores any saved graph state, and prepares feature flags.
2. **Pre-verification** – deterministic guards inspect the incoming thought, perform lightweight calculation checks, and annotate the payload.
3. **Graph integration** – the thought is inserted into `ThoughtGraph`, linking to context, prior thoughts, and relevant memories.
4. **Heuristic evaluation** – `QualityEvaluator` and `MetricsCalculator` compute weighted scores and traces that explain the decision path.
5. **Verification feedback** – statuses from `VerificationService` and heuristic traces are attached to the node and propagated across connections.
6. **Persistence & response** – updates are written to `MemoryManager`/`VerificationMemory`, and a structured MCP response is returned with a timeline of reasoning steps.
Each step is logged with structured metadata so you can visualize the reasoning fabric, audit decisions, and replay sessions deterministically.
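The stages above can be compressed into a toy pipeline to show the shape of the flow. The class below is a deliberately simplified stand-in: the real orchestration lives in `ReasoningOrchestrator` and its collaborators, and every concrete rule shown here (the arithmetic guard, the word-count score) is an invented placeholder, not the project's heuristics.

```typescript
// Toy pipeline mirroring the reasoning stages; types and rules are illustrative only.

interface ThoughtNode {
  id: number;
  text: string;
  status: string;
  score: number;
  links: number[]; // ids of prior thoughts this node connects to
}

class MiniOrchestrator {
  private graph: ThoughtNode[] = [];
  readonly timeline: string[] = [];

  process(text: string): ThoughtNode {
    this.timeline.push("pre-verification");          // deterministic guards
    const hasCalculation = /\d+\s*[+*\/-]\s*\d+/.test(text);

    this.timeline.push("graph-integration");         // link to prior thoughts
    const node: ThoughtNode = {
      id: this.graph.length, text, status: "unverified", score: 0, links: [],
    };
    if (this.graph.length > 0) node.links.push(this.graph.length - 1);
    this.graph.push(node);

    this.timeline.push("heuristic-evaluation");      // transparent scoring rule
    node.score = Math.min(1, text.split(/\s+/).length / 20);

    this.timeline.push("verification-feedback");     // attach a status
    node.status = hasCalculation ? "calculation-checked" : "unverified";

    this.timeline.push("persistence");               // a snapshot would be written here
    return node;
  }
}

const orch = new MiniOrchestrator();
const node = orch.process("2 + 2 equals 4, supporting the earlier hypothesis");
console.log(node.status, orch.timeline);
```

The point of the sketch is the ordering: each call appends to a timeline, so the same sequence of thoughts always replays to the same graph state.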
## Installation
Smart-Thinking ships as an npm package compatible with Windows, macOS, and Linux.
### Global install (recommended)

### Run with npx

### Install via Smithery

### From source
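The exact commands for each option are sketched below, based on the package name used elsewhere in this README. The Smithery command varies by client and is omitted, and the `build` script name is an assumption; only `npm run start` is confirmed by the Quick Tour section.

```shell
# Global install, then start the server:
npm install -g smart-thinking-mcp
smart-thinking-mcp

# Or run once without installing:
npx -y smart-thinking-mcp

# From a source checkout (build script name assumed):
npm install
npm run build
npm run start
```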
Need platform-specific configuration details? See `GUIDE_INSTALLATION.md` for step-by-step instructions covering Windows, macOS, Linux, and Claude Desktop integration.
## Quick Tour
- `smart-thinking-mcp` — start the MCP server (globally installed package).
- `npx -y smart-thinking-mcp` — launch without a global install.
- `npm run start` — execute the built server from source.
- `npm run demo:session` — run the built-in CLI walkthrough that feeds sample thoughts through the reasoning pipeline and prints the resulting timeline.
The demo script showcases how the orchestrator adds nodes, evaluates heuristics, and records verification feedback step by step.
## MCP Client Compatibility
Smart-Thinking is validated across the most popular MCP clients and operating systems. Use the connector mode (`--mode=connector` or `SMART_THINKING_MODE=connector`) when a client only accepts the `search` and `fetch` tools required by ChatGPT connectors.
| Client | Transport | Notes |
| --- | --- | --- |
| ChatGPT Connectors & Deep Research | HTTP + SSE | Deploy with `--transport=http --mode=connector`. Point ChatGPT at the deployed server URL and keep only `search`/`fetch` enabled, aligning with OpenAI's remote MCP guidance. |
| OpenAI Codex CLI & Agents SDK | Streamable HTTP / SSE | Configure the Codex agent with the HTTP or SSE transport and set `SMART_THINKING_MODE=connector` when only knowledge retrieval is needed. |
| Claude Desktop / Claude Code | stdio | Add `smart-thinking-mcp` (or an `npx` command) to `claude_desktop_config.json`. Full toolset is available. |
| Cursor IDE | stdio / SSE / Streamable HTTP | Add the server to `~/.cursor/mcp.json` or the project `.cursor/mcp.json`. Cursor supports prompts, roots, elicitation, and streaming. |
| Cline (VS Code) | stdio | Place the command in `cline_mcp_settings.json` or use the in-app marketplace to register the toolset. |
| Kilo Code | stdio | Register via the MCP marketplace and run the server locally; Smart-Thinking exposes deterministic tooling for autonomous edits. |
Need a minimal deployment footprint? Combine `--transport=http --mode=connector` with a reverse proxy (ngrok, fly.io, render, etc.) so remote clients can consume the server without exposing the full toolset.
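For the stdio clients in the table, registration typically looks like the following `claude_desktop_config.json` fragment; the `"smart-thinking"` key is an arbitrary label, and the same shape applies to Cursor's and Cline's MCP settings files.

```json
{
  "mcpServers": {
    "smart-thinking": {
      "command": "npx",
      "args": ["-y", "smart-thinking-mcp"]
    }
  }
}
```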
## Configuration & Feature Flags
- `feature-flags.ts` toggles advanced behaviours such as external integrations (disabled by default) and verbose tracing.
- `config.ts` aligns platform-specific paths and verification thresholds.
- `memory-manager.ts` and `verification-memory.ts` store session graphs, metrics, and calculation results using deterministic JSON snapshots.
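As an illustration of the flag pattern described above (the actual names and structure in `feature-flags.ts` may differ), a flags module of this kind usually exposes a typed, read-only map plus a lookup helper:

```typescript
// Hypothetical shape of a feature-flag module; real flag names may differ.
export const featureFlags = {
  externalIntegrations: false, // external integrations are off by default per this README
  verboseTracing: false,       // opt-in detailed tracing
} as const;

export type FeatureFlag = keyof typeof featureFlags;

export function isEnabled(flag: FeatureFlag): boolean {
  return featureFlags[flag];
}

console.log(isEnabled("externalIntegrations")); // prints false
```

Keeping the flags `as const` lets the compiler reject typos in flag names at build time instead of at runtime.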
## Development Workflow
See TRANSFORMATION_PLAN.md for the full transformation history and the checklist that drives ongoing hardening.
## Quality & Support

- Deterministic heuristics and verification eliminate dependency on remote LLMs.
- Coverage targets: ≥80% on persistence modules, ≥60% branch coverage across orchestrator logic.
- CI recommendations: run `npm run lint` and `npm run test:coverage` before each release candidate.
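The CI recommendation above could be wired into a minimal GitHub Actions workflow. This is a sketch under assumptions: the Node version, triggers, and use of `npm ci` are not specified by this project.

```yaml
name: quality
on: [push, pull_request]
jobs:
  checks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm run lint
      - run: npm run test:coverage
```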
## Contributing
Contributions are welcome. Please open an issue or pull request describing the change, and run the quality checks above before submitting.
## License
---

1. OpenAI, "Building MCP servers for ChatGPT and API integrations," highlights that connectors require `search` and `fetch` tools for remote use. (https://platform.openai.com/docs/mcp)
2. OpenAI Agents SDK documentation on MCP transports (stdio, SSE, streamable HTTP). (https://openai.github.io/openai-agents-python/mcp/)
3. Model Context Protocol client catalogue listing Claude, Cline, Kilo Code, and other MCP-compatible applications. (https://modelcontextprotocol.io/clients)
4. Cursor documentation for configuring MCP servers via stdio/SSE/HTTP transports. (https://cursor.com/docs/context/mcp)