Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Chain of Thought MCP Server Plan how to analyze quarterly sales data and identify top-performing products".
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Chain of Thought MCP Server
Anthropic's recent article "The 'think' tool: Enabling Claude to stop and think in complex tool use situations" shows that using an external think tool notably increases performance on SWE-bench.
This MCP server uses Groq's API to call Qwen's QwQ model, which exposes its raw chain-of-thought tokens.
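For illustration only, here is a minimal sketch of what such a call looks like with the Groq Python SDK. This is not the server's actual code; the model identifier and prompt are assumptions.

```python
# Sketch: calling Qwen's QwQ model through Groq's OpenAI-compatible chat API.
# The model name below is an assumption; check Groq's model list for the
# current identifier.
import os

from groq import Groq

client = Groq(api_key=os.environ["GROQ_API_KEY"])

response = client.chat.completions.create(
    model="qwen-qwq-32b",  # assumed model ID for Qwen's QwQ on Groq
    messages=[
        {"role": "user", "content": "Plan how to analyze quarterly sales data."}
    ],
)

# QwQ emits its raw chain of thought inline (typically wrapped in <think> tags),
# followed by the final answer.
print(response.choices[0].message.content)
```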
Installation
Clone this repository to your local machine.
Run uv sync to install dependencies.
Get a Groq API key from here.
Update your mcp configuration with:
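As a rough illustration, an entry in the standard mcpServers JSON format might look like the sketch below. The server name, the main.py entry point, and the GROQ_API_KEY variable name are assumptions; use the exact configuration from this repository.

```json
{
  "mcpServers": {
    "chain-of-thought": {
      "command": "uv",
      "args": ["--directory", "/path/to/this/repository", "run", "main.py"],
      "env": {
        "GROQ_API_KEY": "your-groq-api-key"
      }
    }
  }
}
```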
The path should be the local path to this repository. You can get this easily by running pwd in the terminal from the root of the repository.
Related MCP server: Branch Thinking
Instructing The AI To Use This MCP Server
I personally prefer that the agent call this tool on every request to increase performance. I add this to my rules for the agent:
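For example, a rule along these lines (the tool name chain_of_thought is an assumption; use whatever name the server actually registers):

```
Before responding to any request, call the chain_of_thought tool to reason
through the problem step by step, then base your answer on that reasoning.
```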