
Omi Memories MCP Server

This is a Model Context Protocol (MCP) server that provides access to Omi memories for a specific user through a tool interface.

Features

  • A tool to fetch all memories for a specified user ID from the OMI App

Setup

  1. Install dependencies:
npm install
  2. Configure your user ID (see the sketch after this list):
    • Open src/server.ts
    • Update the SPECIFIC_USER_ID constant with the user ID from the Account section of the Omira app
  3. Build the TypeScript code:
npm run build
  4. Start the server:
npm start
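
For reference, a minimal sketch of the constant to look for in src/server.ts, assuming the placeholder value shown here (the surrounding server code is omitted):

// src/server.ts
// Replace the placeholder below with the user ID from the Account
// section of the Omira app.
const SPECIFIC_USER_ID = "your-user-id-here";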

Available Tools

fetch-memories

Fetches all memories for the configured user ID.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/server.js"]
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: { tools: {} } }
);

await client.connect(transport);

// Fetch memories using the tool
const result = await client.callTool({
  name: "fetch-memories",
  arguments: {}
});

console.log(result.content[0].text);
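
Since the transport launches dist/server.js with node, run npm run build first and start the client from the server's project root (or change the args entry to an absolute path).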

Configuration

The server expects:

  1. The Express API to be running at http://localhost:3000 (an optional stub sketch follows this list)
  2. The user ID to be configured: update the SPECIFIC_USER_ID constant in src/server.ts with your user ID, which you can find in the Account section of the Omira app.
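
If you need a stand-in for that API during local testing, a hypothetical stub along these lines can be adapted; the /memories/:uid route below is an assumption for illustration, not the real Omi API, so match it to whatever src/server.ts actually requests.

// stub-api.ts -- hypothetical local stand-in for the Express API on port 3000.
import express from "express";

const app = express();

// Assumed route shape: return a canned list of memories for any user ID.
app.get("/memories/:uid", (req, res) => {
  res.json({
    userId: req.params.uid,
    memories: [{ id: "1", text: "Example memory" }]
  });
});

app.listen(3000, () => {
  console.log("Stub memories API listening on http://localhost:3000");
});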

Claude Desktop Integration

To integrate with Claude Desktop, update your Claude Desktop configuration (claude_desktop_config.json) to include the following:

{ "mcpServers": { "omi-mcp": { "command": "node", "args": [ "/path/to/your/mcp-server/dist/server.js" ], "env": { "NODE_ENV": "development" } } } }

Cursor IDE Integration

To integrate with Cursor IDE:

  1. Open the Cursor IDE settings
  2. Navigate to the "AI & Copilot" settings
  3. Under "Model Context Protocol", add a new MCP server with the following settings:
{ "name": "Omi Memories", "command": "node", "args": [ "/path/to/your/mcp-server/dist/server.js" ], "cwd": "/path/to/your/mcp-server", "env": { "NODE_ENV": "development" } }

Replace /path/to/your/mcp-server with the actual path to your MCP server installation directory.
