search_conversations

Search conversation history by keyword or role to find specific messages and recent user questions across AI chat platforms.

Instructions

    Full-text search across all conversation messages.

    Args:
        term: Search term (keyword). If empty with role="user", finds recent user questions.
        limit: Max results (default 15)
        role: Filter by role — "user" for your words, "assistant" for AI responses.
              With role="user" and empty term, returns recent questions asked.
    

Input Schema

| Name | Required | Description | Default |
| ---- | -------- | ----------- | ------- |
| term | No | Search term (keyword); empty with role="user" returns recent user questions | "" |
| limit | No | Maximum number of results | 15 |
| role | No | Filter by role: "user" or "assistant"; omit to search both | None |
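Since every field is optional, the tool supports a few distinct calling patterns. Some illustrative input payloads (the search terms and limits below are made up for the example):

```python
# Illustrative inputs for search_conversations; every field is optional,
# and the specific terms and limits here are invented for the example.

# Plain keyword search across all messages:
keyword_query = {"term": "deployment", "limit": 10}

# Recent-questions mode: an empty term combined with role="user":
recent_questions = {"term": "", "role": "user"}

# Restrict matches to assistant replies:
assistant_only = {"term": "docker", "role": "assistant", "limit": 5}
```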

Implementation Reference

  • The implementation of the search_conversations tool function.
    @mcp.tool()
    def search_conversations(term: str = "", limit: int = 15, role: str | None = None) -> str:
        """
        Full-text search across all conversation messages.
    
        Args:
            term: Search term (keyword). If empty with role="user", finds recent user questions.
            limit: Max results (default 15)
            role: Filter by role — "user" for your words, "assistant" for AI responses.
                  With role="user" and empty term, returns recent questions asked.
        """
        con = get_conversations()
    
        # Special mode: find recent user questions when no term and role=user
        if (not term or not term.strip()) and role == "user":
            results = con.execute("""
                SELECT substr(content, 1, 200) as question,
                       conversation_title, source, created
                FROM conversations
                WHERE has_question = 1 AND role = 'user'
                ORDER BY created DESC
                LIMIT ?
            """, [limit]).fetchall()
    
            output = ["## Recent Questions Asked\n"]
            for question, title, source, created in results:
                output.append(f"**[{created}]** {question}")
                output.append(f"  _From: {title or 'Untitled'} ({source})_\n")
            return "\n".join(output)
    
        pattern = f"%{term}%"
    
        if role:
            results = con.execute("""
                SELECT source, model, conversation_title, role,
                       substr(content, 1, 200) as preview,
                       created, conversation_id
                FROM conversations
                WHERE content ILIKE ? AND role = ?
                ORDER BY created DESC
                LIMIT ?
            """, [pattern, role, limit]).fetchall()
        else:
            results = con.execute("""
                SELECT source, model, conversation_title, role,
                       substr(content, 1, 200) as preview,
                       created, conversation_id
                FROM conversations
                WHERE content ILIKE ?
                ORDER BY created DESC
                LIMIT ?
            """, [pattern, limit]).fetchall()
    
        if not results:
            return f"No conversations found containing '{term}'"
    
        output = [f"## Conversations containing '{term}' ({len(results)} found)\n"]
        for source, model, title, msg_role, preview, created, conv_id in results:
            output.append(f"**[{created}]** {title or 'Untitled'}")
            output.append(f"  {msg_role}: {preview}...")
            output.append(f"  _ID: {conv_id[:20]}... | {source}/{model}_\n")
        return "\n".join(output)
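The `get_conversations()` helper is not shown above. Its `ILIKE` queries suggest a DuckDB-backed store; as a self-contained sketch, here is a hypothetical stand-in built on stdlib `sqlite3` (whose `LIKE` is already case-insensitive for ASCII, so the `ILIKE` clauses would become `LIKE`), with a column layout inferred from the queries:

```python
import sqlite3

# Hypothetical stand-in for the unshown get_conversations() helper.
# The real server's ILIKE queries suggest DuckDB; sqlite3 is used here
# only so the sketch is self-contained. The column layout is inferred
# from the SELECT statements in search_conversations.
SCHEMA = """
CREATE TABLE IF NOT EXISTS conversations (
    conversation_id TEXT,
    conversation_title TEXT,
    source TEXT,
    model TEXT,
    role TEXT,
    content TEXT,
    has_question INTEGER,
    created TEXT
)
"""

def get_conversations(path: str = ":memory:") -> sqlite3.Connection:
    """Open the conversation store and ensure the table exists."""
    con = sqlite3.connect(path)
    con.execute(SCHEMA)
    return con
```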
  • The register function where tools in this file are registered with the MCP server.
    def register(mcp):
        """Register conversation tools with the MCP server."""

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/mordechaipotash/brain-mcp'
