semantic_search

Search conversations by conceptual meaning using vector embeddings to find related ideas beyond keyword matching.

Instructions

Search conversations using semantic similarity (vector embeddings). Finds messages that are conceptually similar to your query, even if they don't contain the exact words.

More powerful than keyword search for finding related ideas.
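As a rough mental model: embedding models map text to vectors, and "conceptually similar" means the vectors point in a similar direction, typically measured with cosine similarity. The toy vectors and labels below are made up purely for illustration; real embedding models produce vectors with hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" (real models use hundreds of dimensions).
query_vec     = [0.9, 0.1, 0.0]  # e.g. "car maintenance"
related_vec   = [0.8, 0.2, 0.1]  # e.g. "fixing my vehicle" - no shared words
unrelated_vec = [0.0, 0.1, 0.9]  # e.g. "sourdough recipes"

print(cosine_similarity(query_vec, related_vec))    # high (close to 1.0)
print(cosine_similarity(query_vec, unrelated_vec))  # low (close to 0.0)
```

This is why a query can match a message that shares no keywords with it: the two texts land near each other in embedding space.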

Input Schema

| Name  | Required | Description                                  | Default |
|-------|----------|----------------------------------------------|---------|
| query | Yes      | Natural-language query to search for.        | —       |
| limit | No       | Maximum number of results to return.         | 10      |
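A hypothetical tool call supplying both parameters might look like this (the argument values are illustrative):

```json
{
  "query": "ideas about caching strategies",
  "limit": 5
}
```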

Implementation Reference

The `semantic_search` tool performs vector-based semantic search over conversations using LanceDB:
    def semantic_search(query: str, limit: int = 10) -> str:
        """
        Search conversations using semantic similarity (vector embeddings).
        Finds messages that are conceptually similar to your query,
        even if they don't contain the exact words.

        More powerful than keyword search for finding related ideas.
        """
        cfg = get_config()

        # The LanceDB database is created by the embed pipeline; bail out
        # early if it has never been run.
        if not cfg.lance_path.exists():
            return "Vector database not found. Run the embed pipeline first."

        embedding = get_embedding(query)
        if not embedding:
            # Distinguish "fastembed is installed but embedding failed" from
            # "fastembed is not installed at all".
            try:
                import fastembed  # noqa: F401
                return "Could not generate embedding for query."
            except ImportError:
                return ("Semantic search requires the embedding model.\n"
                        "Install with: pip install brain-mcp[embed]\n"
                        "Then run: brain-mcp embed\n\n"
                        "Meanwhile, use search_conversations() for keyword search.")

        db = get_lance_db()
        if not db:
            return "Could not connect to vector database."

        try:
            # Nearest-neighbor search over the message table, ranked by
            # vector distance to the query embedding.
            tbl = db.open_table("message")
            results = tbl.search(embedding).limit(limit).to_pandas()
        except Exception as e:
            return f"Search error: {e}"

        if results.empty:
            return "No results found."

        # Render the results as Markdown, one section per matching message.
        output = [f"## Semantic Search: '{query}'\n"]
        output.append(f"_Found {len(results)} semantically similar messages_\n")

        for i, row in results.iterrows():
            title = row.get("conversation_title") or "Untitled"
            content = str(row.get("content", ""))
            year = row.get("year", 0)
            month = row.get("month", 0)
            distance = row.get("_distance", 0)  # smaller distance = more similar

            # Truncate long messages to a 400-character preview.
            preview = content[:400] + "..." if len(content) > 400 else content

            output.append(f"### {i+1}. [{year}-{month:02d}] {title}")
            output.append(f"**Similarity**: {distance:.4f}")
            output.append(f"> {preview}\n")

        return "\n".join(output)
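The core of the tool is the `tbl.search(embedding).limit(limit)` call, which LanceDB answers with an index. As a mental model only (not the actual LanceDB implementation, which uses approximate nearest-neighbor indexing), the ranking it performs is equivalent to this brute-force sketch over in-memory rows:

```python
import math

def nearest_messages(query_vec, rows, limit=10):
    """Brute-force stand-in for tbl.search(embedding).limit(limit):
    rank stored rows by Euclidean distance to the query vector."""
    def distance(vec):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(query_vec, vec)))

    ranked = sorted(rows, key=lambda row: distance(row["vector"]))
    # Attach the distance to each hit, mirroring LanceDB's `_distance` column.
    return [
        {**row, "_distance": distance(row["vector"])}
        for row in ranked[:limit]
    ]

# Toy 2-dimensional rows standing in for the "message" table.
rows = [
    {"content": "msg A", "vector": [0.0, 1.0]},
    {"content": "msg B", "vector": [1.0, 0.0]},
    {"content": "msg C", "vector": [0.9, 0.1]},
]
hits = nearest_messages([1.0, 0.0], rows, limit=2)
print([h["content"] for h in hits])  # nearest first: ['msg B', 'msg C']
```

Note that `_distance` is a distance, not a similarity: smaller values mean closer matches, which is why the results come back nearest-first.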
