# searxng_search
Perform web searches through a SearxNG instance to retrieve search results with titles, URLs, and content snippets for any query.
## Instructions
Searches the web using a SearxNG instance and returns a list of results.
Args:
- `query`: The search query.
- `max_results`: The maximum number of results to return. Defaults to 30.

Returns:
A list of dictionaries, where each dictionary represents a search result and contains the title, URL, and content snippet. On failure, the handler raises an `McpError` (the docstring mentions returning an error dictionary, but the implementation below has no such path).
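A successful call returns plain dictionaries. The snippet below is an illustrative sketch of the return shape; the values are hypothetical, not real search output:

```python
# Hypothetical example of a successful return value; the keys match the
# handler's output ('title', 'url', 'content'), the values are made up.
results = [
    {
        "title": "Example result",
        "url": "https://example.org/page",
        "content": "A short snippet of the page text...",
    },
]

# Every result exposes exactly these three string fields.
assert all(set(r) == {"title", "url", "content"} for r in results)
```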
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| query | Yes | The search query. | |
| max_results | No | The maximum number of results to return. | 30 |
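The handler rejects non-positive `max_results` values before issuing any request. A minimal standalone sketch of that guard (the `validate_max_results` name is hypothetical; the real code raises `McpError` with `INVALID_PARAMS` rather than `ValueError`):

```python
def validate_max_results(max_results: int) -> None:
    # Mirrors the handler's guard; the real implementation raises
    # McpError(ErrorData(INVALID_PARAMS, ...)) instead of ValueError.
    if max_results <= 0:
        raise ValueError("max_results must be greater than 0.")
```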
## Implementation Reference
- `src/mcp_searxng_search/server.py:18-69` (handler) — The handler function for the `searxng_search` tool. It is decorated with `@mcp.tool()` for registration and implements the search logic by posting to SearxNG, parsing the HTML response with BeautifulSoup, and extracting the title, URL, and content for up to `max_results` results.

```python
@mcp.tool()
def searxng_search(query: str, max_results: int = 30) -> List[Dict[str, str]]:
    """
    Searches the web using a SearxNG instance and returns a list of results.

    Args:
        query: The search query.
        max_results: The maximum number of results to return. Defaults to 30.

    Returns:
        A list of dictionaries, where each dictionary represents a search result
        and contains the title, URL, and content snippet. Returns an error
        message in a dictionary if the search fails.
    """
    if max_results <= 0:
        raise McpError(ErrorData(INVALID_PARAMS, "max_results must be greater than 0."))

    search_url = f"{SEARXNG_BASE_URL}/search"
    headers = {
        'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7',
        'Accept-Language': 'en-US,en;q=0.9',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive',
        'Content-Type': 'application/x-www-form-urlencoded',
        'Pragma': 'no-cache',
        'Upgrade-Insecure-Requests': '1',
        'User-Agent': USER_AGENT
    }
    data = f"q={query}&categories=general&language=auto&time_range=&safesearch=0&theme=simple"

    try:
        response = requests.post(search_url, headers=headers, data=data, verify=False, timeout=30)
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)

        html_content = response.text
        soup = BeautifulSoup(html_content, 'html.parser')

        results = []
        for article in soup.find_all('article', class_='result')[:max_results]:
            url_header = article.find('a', class_='url_header')
            if url_header:
                url = url_header['href']
                title = article.find('h3').text.strip() if article.find('h3') else "No Title"
                description = article.find('p', class_='content').text.strip() if article.find('p', class_='content') else "No Description"
                results.append({
                    'title': title,
                    'url': url,
                    'content': description
                })
        return results
    except requests.exceptions.RequestException as e:
        raise McpError(ErrorData(INTERNAL_ERROR, f"Error during search: {str(e)}"))
    except Exception as e:
        raise McpError(ErrorData(INTERNAL_ERROR, f"Unexpected error: {str(e)}"))
```
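Note that the handler interpolates `query` into the form body with an f-string, so a query containing `&`, `=`, or spaces is sent unescaped. For comparison, a sketch of the same form body built with the standard library's `urllib.parse.urlencode`, which escapes such characters (the query value here is hypothetical):

```python
from urllib.parse import urlencode

# Same fields as the handler's hand-built form body, but URL-encoded.
params = {
    "q": "rust async & await",  # hypothetical query with reserved characters
    "categories": "general",
    "language": "auto",
    "time_range": "",
    "safesearch": "0",
    "theme": "simple",
}
body = urlencode(params)
# The '&' inside the query is escaped as %26 instead of splitting the field.
```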
- `src/mcp_searxng_search/server.py:9` (registration) — Creates the FastMCP server instance named `searxng`; tools such as `searxng_search` are registered on this instance via decorators.

```python
mcp = FastMCP("searxng")
```
- `src/mcp_searxng_search/__init__.py:10` (entry point) — Runs the MCP server, making the registered tools available.

```python
mcp.run()
```
- The function signature with type hints and the docstring define the input schema (`query: str`, `max_results: int = 30`) and the output type (`List[Dict[str, str]]` with `title`/`url`/`content` keys).

```python
def searxng_search(query: str, max_results: int = 30) -> List[Dict[str, str]]:
    """
    Searches the web using a SearxNG instance and returns a list of results.

    Args:
        query: The search query.
        max_results: The maximum number of results to return. Defaults to 30.

    Returns:
        A list of dictionaries, where each dictionary represents a search result
        and contains the title, URL, and content snippet. Returns an error
        message in a dictionary if the search fails.
    """
```