
Gemini Thinking Server

Model Context Protocol - Gemini Thinking Server

An implementation of the Model Context Protocol (MCP) that integrates with Google's Gemini API to provide analytical thinking capabilities without code generation.

Overview

The Gemini Thinking Server is a dedicated MCP server that leverages Google's Gemini models to provide sequential thinking and problem-solving capabilities. It is designed for:

  • Breaking down complex problems into steps
  • Planning and design with room for revision
  • Analysis that might need course correction
  • Problems where the full scope might not be clear at the outset

Features

  • Gemini-Powered Thinking: Leverages Gemini's analytical capabilities to generate thoughtful responses
  • Meta-Commentary: Provides insights into the reasoning process
  • Confidence Levels: Indicates how confident Gemini is in its analysis
  • Alternative Paths: Suggests different approaches to the problem
  • Branching Thoughts: Allows exploring different lines of thinking
  • Revision Capability: Supports revising previous thoughts
  • Session Persistence: Save and resume analysis sessions

Installation

# Clone the repository
git clone <repository-url>

# Install dependencies
npm install

# Build the project
npm run build

Usage

Environment Setup

Before running the server, you need to set your Gemini API key:

export GEMINI_API_KEY=your_api_key_here

Running the Server

node dist/gemini-index.js
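
Like most MCP servers started with node, this one is normally launched and driven by an MCP client rather than used directly from a terminal. The following is a minimal sketch of such a client using the MCP TypeScript SDK; the file name, client name, query, stdio transport, and environment forwarding are assumptions for illustration, not details confirmed by this README.

// minimal-client.ts (hypothetical file, not part of this repository)
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the built server over stdio. The API key is forwarded explicitly,
  // since the stdio transport may not inherit the full parent environment.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/gemini-index.js"],
    env: { ...process.env, GEMINI_API_KEY: process.env.GEMINI_API_KEY ?? "" } as Record<string, string>,
  });

  const client = new Client(
    { name: "minimal-client", version: "0.1.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // A single call that lets Gemini generate the first thought.
  const result = await client.callTool({
    name: "geminithinking",
    arguments: {
      query: "How might we reduce energy use in a small office?",
      thoughtNumber: 1,
      totalThoughts: 3,
      nextThoughtNeeded: true,
    },
  });

  console.log(JSON.stringify(result, null, 2));
  await client.close();
}

main().catch(console.error);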

Tool Parameters

The geminithinking tool accepts the following parameters; a TypeScript sketch of the corresponding argument shape follows the list.

  • query (required): The question or problem to analyze
  • context (optional): Additional context information
  • approach (optional): Suggested approach to the problem
  • previousThoughts (optional): Array of previous thoughts for context
  • thought (optional): Your current thinking step (if empty, Gemini will generate one)
  • nextThoughtNeeded (required): Whether another thought step is needed
  • thoughtNumber (required): Current thought number
  • totalThoughts (required): Estimated total thoughts needed
  • isRevision (optional): Whether this revises previous thinking
  • revisesThought (optional): Which thought is being reconsidered
  • branchFromThought (optional): Branching point thought number
  • branchId (optional): Branch identifier
  • needsMoreThoughts (optional): Whether more thoughts are needed
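
For reference, the argument shape implied by the list above can be summarized as a TypeScript interface. This is a sketch inferred from the parameter descriptions, not a type exported by the server; in particular, the element type of previousThoughts is an assumption.

// Illustrative only; not exported by the server.
interface GeminiThinkingArgs {
  query: string;                   // the question or problem to analyze
  context?: string;                // additional context information
  approach?: string;               // suggested approach to the problem
  previousThoughts?: string[];     // previous thoughts for context
  thought?: string;                // current step; generated by Gemini if empty
  nextThoughtNeeded: boolean;      // whether another thought step is needed
  thoughtNumber: number;           // current thought number
  totalThoughts: number;           // estimated total thoughts needed
  isRevision?: boolean;            // whether this revises previous thinking
  revisesThought?: number;         // which thought is being reconsidered
  branchFromThought?: number;      // branching point thought number
  branchId?: string;               // branch identifier
  needsMoreThoughts?: boolean;     // whether more thoughts are needed
  sessionCommand?: "save" | "load" | "getState"; // see Session Management below
  sessionPath?: string;            // required for 'save' and 'load'
}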

Session Management

The tool also supports session management commands:

  • sessionCommand: Command to manage sessions ('save', 'load', 'getState')
  • sessionPath: Path for saving or loading the session file (required for the 'save' and 'load' commands)
Example: saving a session

{
  "sessionCommand": "save",
  "sessionPath": "/path/to/save/session.json",
  "query": "dummy",
  "thoughtNumber": 1,
  "totalThoughts": 1,
  "nextThoughtNeeded": false
}

Example: loading a session

{
  "sessionCommand": "load",
  "sessionPath": "/path/to/load/session.json",
  "query": "dummy",
  "thoughtNumber": 1,
  "totalThoughts": 1,
  "nextThoughtNeeded": false
}

Example: getting the session state

{
  "sessionCommand": "getState",
  "query": "dummy",
  "thoughtNumber": 1,
  "totalThoughts": 1,
  "nextThoughtNeeded": false
}

Example

Here is an example of how to use the tool:

{ "query": "How might we design a sustainable urban transportation system?", "context": "The city has 500,000 residents and currently relies heavily on personal vehicles.", "approach": "Consider environmental, economic, and social factors.", "thoughtNumber": 1, "totalThoughts": 5, "nextThoughtNeeded": true }

Response Format

The server responds with:

{ "thought": "The generated thought from Gemini", "thoughtNumber": 1, "totalThoughts": 5, "nextThoughtNeeded": true, "branches": [], "thoughtHistoryLength": 1, "metaComments": "Meta-commentary about the reasoning", "confidenceLevel": 0.85, "alternativePaths": ["Alternative approach 1", "Alternative approach 2"] }

Sample Clients

Several sample clients are provided to demonstrate different use cases:

  • sample-client.js: Basic client example
  • example-usage.js: Concrete usage example
  • codebase-analysis-example.js: Example of analyzing a codebase
  • session-example.js: Example demonstrating session persistence
  • advanced-filtering-example.js: Example demonstrating advanced semantic filtering

To run the session example:

node dist/session-example.js

To run the advanced filtering example:

node dist/advanced-filtering-example.js

License

MIT
