
Pensieve MCP Server

by EmjayAhn

list_conversations

Retrieve the list of stored conversations from the Pensieve MCP Server, with limit/offset pagination, to browse cross-platform AI chat history.

Instructions

Retrieves the list of stored conversations.

Input Schema

JSON Schema

Name    Required  Description                                      Default
limit   No        Number of conversations to return (default: 50)  50
offset  No        Starting position (default: 0)                   0
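The limit/offset pair implies standard slice-based pagination: the server sorts conversations newest-first and returns the slice `files[offset:offset + limit]`. A minimal sketch of that semantics, using a hypothetical list of file names:

```python
# Hypothetical file list, newest first, standing in for the server's storage.
files = [f"conv_{i}.json" for i in range(120)]

def page(files, limit=50, offset=0):
    # Same slicing the handler applies to its sorted file list.
    return files[offset:offset + limit]

print(len(page(files)))                      # default page size: 50
print(page(files, limit=10, offset=100)[0])  # conv_100.json
```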

Implementation Reference

  • Handler function for listing conversations, which reads conversation files from the storage directory.
    import json
    from typing import Any, Dict, List
    # STORAGE_DIR is a module-level pathlib.Path pointing at the storage directory.

    def list_conversations(limit: int = 50, offset: int = 0) -> List[Dict[str, Any]]:
        """Return a list of all stored conversations."""
        conversations = []

        # Read all JSON files, newest first by modification time
        json_files = sorted(STORAGE_DIR.glob("*.json"), key=lambda x: x.stat().st_mtime, reverse=True)

        for file_path in json_files[offset:offset + limit]:
            try:
                with open(file_path, 'r', encoding='utf-8') as f:
                    data = json.load(f)
                    # Summary containing only the metadata
                    conversations.append({
                        "id": data["id"],
                        "metadata": data.get("metadata", {}),
                        "created_at": data.get("created_at"),
                        "updated_at": data.get("updated_at"),
                        "message_count": len(data.get("messages", []))
                    })
            except Exception as e:
                print(f"Error loading {file_path}: {e}")

        return conversations
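The handler can be exercised end to end against a throwaway directory. A self-contained sketch, where `STORAGE_DIR`, the file names, and the conversation payloads are stand-ins for the server's real storage:

```python
import json
import os
import tempfile
import time
from pathlib import Path

# Stand-in for the server's storage directory (assumption: a pathlib.Path).
STORAGE_DIR = Path(tempfile.mkdtemp())

now = time.time()
for i in range(3):
    p = STORAGE_DIR / f"conv_{i}.json"
    p.write_text(json.dumps({
        "id": str(i),
        "metadata": {"title": f"chat {i}"},
        "messages": [{"role": "user", "content": "hi"}],
    }), encoding="utf-8")
    # Force distinct, increasing mtimes so conv_2 is unambiguously the newest.
    os.utime(p, (now + i, now + i))

def list_conversations(limit: int = 50, offset: int = 0):
    # Same newest-first sort and slice as the handler above, trimmed to
    # the fields needed for this demonstration.
    files = sorted(STORAGE_DIR.glob("*.json"),
                   key=lambda x: x.stat().st_mtime, reverse=True)
    summaries = []
    for path in files[offset:offset + limit]:
        data = json.loads(path.read_text(encoding="utf-8"))
        summaries.append({"id": data["id"],
                          "message_count": len(data.get("messages", []))})
    return summaries

print(list_conversations(limit=2))  # the two most recently modified conversations
```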
  • Tool registration in the list_tools() function.
    Tool(
        name="list_conversations",
        description="Retrieves the list of stored conversations",
        inputSchema={
            "type": "object",
            "properties": {
                "limit": {
                    "type": "integer",
                    "description": "Number of conversations to return (default: 50)",
                    "default": 50
                },
                "offset": {
                    "type": "integer",
                    "description": "Starting position (default: 0)",
                    "default": 0
                }
            }
        }
    ),
  • Tool dispatch logic for "list_conversations" in call_tool() function.
    elif name == "list_conversations":
        limit = arguments.get("limit", 50)
        offset = arguments.get("offset", 0)
        
        conversations = list_conversations(limit, offset)
        return [TextContent(
            type="text",
            text=json.dumps(conversations, ensure_ascii=False, indent=2)
        )]
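The dispatch serializes the result with `ensure_ascii=False`, which keeps Korean metadata readable in the response text instead of `\uXXXX`-escaping it. A minimal sketch with a hypothetical conversation summary:

```python
import json

# Hypothetical summary row matching the shape returned by list_conversations.
conversations = [{"id": "1", "metadata": {"title": "대화"}, "message_count": 2}]

# Same serialization as the dispatch above: pretty-printed UTF-8 JSON.
text = json.dumps(conversations, ensure_ascii=False, indent=2)
print("대화" in text)   # non-ASCII preserved verbatim
print("\\u" in text)    # no escape sequences emitted
```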
  • The tool execution handler in the server's main call loop that processes requests for list_conversations.
    elif name == "list_conversations":
        limit = arguments.get("limit", 50)
        offset = arguments.get("offset", 0)
        
        conversations = list_conversations(limit, offset)
        return [TextContent(
            type="text",
            text=json.dumps(conversations, ensure_ascii=False, indent=2)
        )]

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/EmjayAhn/pensieve-mcp'
