mock_search

Retrieve relevant financial information from the internet using search keywords to support research and analysis workflows.

Instructions

Use search keywords to retrieve relevant information from the internet. If you have multiple keywords, please call this tool separately for each one.
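
As a hedged illustration of the one-call-per-keyword convention, the sketch below builds a separate MCP tools/call payload for each keyword instead of combining them into one query. The keywords and request ids are invented, and the payload shape is the generic MCP tools/call format rather than anything documented on this page.

    # Illustration only: one mock_search call per keyword, never a combined query.
    keywords = ["NVIDIA Q2 revenue", "Federal Reserve rate decision"]  # hypothetical keywords

    requests = [
        {
            "jsonrpc": "2.0",
            "id": i,
            "method": "tools/call",
            "params": {"name": "mock_search", "arguments": {"query": keyword}},
        }
        for i, keyword in enumerate(keywords, start=1)
    ]
    # Each payload is sent to the server as its own request, and the results
    # are collected separately.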

Input Schema

Name    Required    Description       Default
query   Yes         search keyword    (none)

Input Schema (JSON Schema)

{ "properties": { "query": { "description": "search keyword", "type": "string" } }, "required": [ "query" ], "type": "object" }

Implementation Reference

  • The async_execute method implements the core logic of the 'mock_search' tool: it prompts an LLM with the search query to generate mock JSON-formatted search results (a hedged sketch of the extract_content helper it relies on appears after this list).
    async def async_execute(self):
        """Generate mock search results using an LLM.

        The method builds a small conversation where the system message
        instructs the model to return JSON-formatted search results, and the
        user message contains the formatted query. The JSON structure is then
        extracted and pretty-printed.
        """
        query: str = self.input_dict["query"]
        if not query:
            answer = "query is empty, no results found."
            logger.warning(answer)
            self.set_output(answer)
            return

        messages = [
            Message(
                role=Role.SYSTEM,
                content="You are a helpful assistant that generates realistic search results in JSON format.",
            ),
            Message(
                role=Role.USER,
                content=self.prompt_format(
                    "mock_search_op_prompt",
                    query=query,
                    num_results=random.randint(0, 5),
                ),
            ),
        ]
        logger.info(f"messages={messages}")

        def callback_fn(message: Message):
            return extract_content(message.content, "json")

        search_results: str = await self.llm.achat(messages=messages, callback_fn=callback_fn)
        self.set_output(json.dumps(search_results, ensure_ascii=False, indent=2))
  • The build_tool_call method defines the tool description and input schema for the 'mock_search' tool, specifying the required 'query' string parameter.
    def build_tool_call(self) -> ToolCall:
        """Build the tool call schema describing the mock search tool.

        Returns:
            ToolCall: Definition containing description and input schema for
            the ``query`` parameter.
        """
        return ToolCall(
            **{
                "description": self.get_prompt("tool_description"),
                "input_schema": {
                    "query": {
                        "type": "string",
                        "description": "search keyword",
                        "required": True,
                    },
                },
            },
        )
  • The @C.register_op() decorator registers the MockSearchOp class, which provides the 'mock_search' tool implementation (a rough invocation sketch for the registered op follows this list).
    @C.register_op()
    class MockSearchOp(BaseAsyncToolOp):
        """Asynchronous mock search tool that generates LLM-based results."""

        file_path: str = __file__
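
The callback in async_execute delegates to extract_content(message.content, "json") to pull the JSON block out of the model's reply. That helper's actual implementation is not shown on this page; the following is only a minimal sketch of what such a function might do, assuming the model wraps its answer in a fenced json code block:

    import json
    import re

    def extract_content(text: str, language: str = "json"):
        """Hypothetical stand-in for the real extract_content helper.

        Looks for a fenced code block tagged with ``language`` and parses it
        as JSON, falling back to parsing the whole string. The real FlowLLM
        helper may behave differently.
        """
        match = re.search(rf"```{language}\s*(.*?)```", text, re.DOTALL)
        candidate = match.group(1) if match else text
        return json.loads(candidate)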
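
Finally, a rough invocation sketch for the registered op, assuming it can be constructed directly and that input_dict, async_execute, and set_output behave as in the code above; the construction details and LLM wiring belong to the FlowLLM framework and are guesses here:

    import asyncio

    async def main():
        op = MockSearchOp()  # hypothetical: real construction likely goes through the framework/registry
        op.input_dict = {"query": "Tesla quarterly deliveries"}  # invented keyword
        await op.async_execute()
        # The pretty-printed JSON search results are whatever set_output stored on the op.

    asyncio.run(main())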

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/FlowLLM-AI/finance-mcp'
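
The same endpoint can be called from Python; below is a minimal sketch using the third-party requests package (the shape of the returned metadata is not documented here, so it is simply printed):

    import requests

    resp = requests.get("https://glama.ai/api/mcp/v1/servers/FlowLLM-AI/finance-mcp")
    resp.raise_for_status()
    print(resp.json())  # server metadata as returned by the MCP directory API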

If you have feedback or need assistance with the MCP directory API, please join our Discord server.