
search_memories

Search stored memories using natural language queries and filters to find relevant information based on user, agent, or time criteria.

Instructions

Run a semantic search over existing memories.

Use filters to narrow results. Common filter patterns:

- Single user: `{"AND": [{"user_id": "john"}]}`
- Agent memories: `{"AND": [{"agent_id": "agent_name"}]}`
- Recent memories: `{"AND": [{"user_id": "john"}, {"created_at": {"gte": "2024-01-01"}}]}`
- Multiple users: `{"AND": [{"user_id": {"in": ["john", "jane"]}}]}`
- Cross-entity: `{"OR": [{"user_id": "john"}, {"agent_id": "agent_name"}]}`

`user_id` is automatically added to filters if not provided.
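As a rough illustration, the automatic `user_id` injection could work along these lines. This is a minimal sketch with a hypothetical `with_default_user` helper; the server's actual `_with_default_filters` implementation is not documented here and may behave differently:

```python
# Hypothetical sketch of default user_id injection into filter clauses.
# The real server helper (_with_default_filters) may differ.
from typing import Any, Dict, Optional

def with_default_user(default_user: str, filters: Optional[Dict[str, Any]]) -> Dict[str, Any]:
    """Return filters guaranteed to scope results to some user."""
    if not filters:
        # No filters at all: scope to the default user.
        return {"AND": [{"user_id": default_user}]}
    if "user_id" in str(filters):
        # Caller already scoped by user_id somewhere; leave filters alone.
        return filters
    # Otherwise prepend the default user clause to the existing AND clauses.
    clauses = filters.get("AND", [filters])
    return {"AND": [{"user_id": default_user}, *clauses]}

print(with_default_user("john", None))
# {'AND': [{'user_id': 'john'}]}
```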

Input Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| query | Yes | Natural language description of what to find. | |
| filters | No | Additional filter clauses (user_id injected automatically). | None |
| limit | No | Maximum number of results to return. | None |
| enable_graph | No | Set true only when the user explicitly wants graph-derived memories. | None |
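For example, a request that searches one user's recent memories could pass arguments like these. The keys match the input schema above; the values are made up for illustration:

```python
# Illustrative search_memories arguments; values are examples only.
example_args = {
    "query": "dietary preferences",
    "filters": {
        "AND": [
            {"user_id": "john"},
            {"created_at": {"gte": "2024-01-01"}},
        ]
    },
    "limit": 5,
}
print(sorted(example_args))  # ['filters', 'limit', 'query']
```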

Implementation Reference

  • Implementation of the search_memories tool handler. It resolves configuration, validates arguments using SearchMemoriesArgs, injects the default user_id into filters, prepares the payload, and calls the Mem0 client's search method.

```python
def search_memories(
    query: Annotated[str, Field(description="Natural language description of what to find.")],
    filters: Annotated[
        Optional[Dict[str, Any]],
        Field(default=None, description="Additional filter clauses (user_id injected automatically)."),
    ] = None,
    limit: Annotated[
        Optional[int],
        Field(default=None, description="Maximum number of results to return."),
    ] = None,
    enable_graph: Annotated[
        Optional[bool],
        Field(
            default=None,
            description="Set true only when the user explicitly wants graph-derived memories.",
        ),
    ] = None,
    ctx: Context | None = None,
) -> str:
    """Semantic search against existing memories."""
    api_key, default_user, graph_default = _resolve_settings(ctx)
    args = SearchMemoriesArgs(
        query=query,
        filters=filters,
        limit=limit,
        enable_graph=_default_enable_graph(enable_graph, graph_default),
    )
    payload = args.model_dump(exclude_none=True)
    payload["filters"] = _with_default_filters(default_user, payload.get("filters"))
    payload.setdefault("enable_graph", graph_default)
    client = _mem0_client(api_key)
    return _mem0_call(client.search, **payload)
```
  • Pydantic BaseModel schema defining the input parameters for the search_memories tool: query (required), filters, limit, enable_graph.
```python
class SearchMemoriesArgs(BaseModel):
    query: str = Field(..., description="Describe what you want to find.")
    filters: Optional[Dict[str, Any]] = Field(
        None, description="Additional filter clauses; user_id is injected automatically."
    )
    limit: Optional[int] = Field(None, description="Optional maximum number of matches.")
    enable_graph: Optional[bool] = Field(
        None, description="Set True only when the user asks for graph knowledge."
    )
```
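Because the handler serializes this model with `model_dump(exclude_none=True)`, optional fields the caller omits never reach the Mem0 API. A standalone check of that behavior, assuming pydantic v2 (descriptions trimmed for brevity):

```python
# Standalone sketch of the schema to demonstrate exclude_none serialization.
from typing import Any, Dict, Optional
from pydantic import BaseModel, Field

class SearchMemoriesArgs(BaseModel):
    query: str = Field(..., description="Describe what you want to find.")
    filters: Optional[Dict[str, Any]] = None
    limit: Optional[int] = None
    enable_graph: Optional[bool] = None

args = SearchMemoriesArgs(query="dietary preferences", limit=5)
# Fields left as None are dropped from the outgoing payload.
print(args.model_dump(exclude_none=True))  # {'query': 'dietary preferences', 'limit': 5}
```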
  • FastMCP @server.tool decorator registration for the search_memories tool, providing a detailed description of usage and filter examples.
```python
@server.tool(
    description="""Run a semantic search over existing memories.

Use filters to narrow results. Common filter patterns:
- Single user: {"AND": [{"user_id": "john"}]}
- Agent memories: {"AND": [{"agent_id": "agent_name"}]}
- Recent memories: {"AND": [{"user_id": "john"}, {"created_at": {"gte": "2024-01-01"}}]}
- Multiple users: {"AND": [{"user_id": {"in": ["john", "jane"]}}]}
- Cross-entity: {"OR": [{"user_id": "john"}, {"agent_id": "agent_name"}]}

user_id is automatically added to filters if not provided.
"""
)
```
