extract-mcp-servers-from-url

by chatmcp

Extract MCP servers from a specified URL to collect and organize server information from web sources.

Instructions

Extract MCP Servers from a URL

Input Schema

Name | Required | Description | Default
url  | Yes      | –           | –
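
A minimal client-side sketch of calling this tool over stdio with the mcp Python SDK (mirroring the fetch helper shown under Implementation Reference). The launch command for the collector server is an assumption; adjust it to match your installation.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Assumed launch command for mcp-server-collector; adjust as needed.
    collector_params = StdioServerParameters(
        command="uv",
        args=["run", "mcp-server-collector"],
    )

    async def main() -> None:
        async with stdio_client(collector_params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # Arguments must satisfy the input schema: a single required "url".
                result = await session.call_tool(
                    "extract-mcp-servers-from-url",
                    arguments={"url": "https://example.com/awesome-mcp-servers"},
                )
                print(result.content[0].text)

    asyncio.run(main())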

Implementation Reference

  • Main tool handler (@server.call_tool()). For 'extract-mcp-servers-from-url', it fetches the page content from the given URL via a helper, extracts MCP servers with the shared extraction logic, and returns the result as text.
    @server.call_tool()
    async def handle_call_tool(
        name: str, arguments: dict | None
    ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]:
        if not arguments:
            raise ValueError("Missing arguments")

        content = None
        match name:
            case "extract-mcp-servers-from-url":
                # Fetch the raw page content via the external fetch server.
                url = arguments.get("url")
                if not url:
                    raise ValueError("Missing url")
                content = await call_fetch_tool(url)
            case "extract-mcp-servers-from-content":
                content = arguments.get("content")
            case "submit-mcp-server":
                # Submission returns immediately with the API result.
                url = arguments.get("url")
                avatar_url = arguments.get("avatar_url") or ""
                result = await submit_mcp_server(url, avatar_url)
                content = json.dumps(result)
                return [
                    types.TextContent(
                        type="text",
                        text=content,
                    )
                ]
            case _:
                raise ValueError(f"Unknown tool: {name}")

        if not content:
            raise ValueError("Missing content")
        logger.info(f"Fetched content from {url}: {content}")

        # Shared extraction logic (also used by extract-mcp-servers-from-content).
        mcp_servers = await extract_mcp_servers_from_content(content)
        if not mcp_servers:
            raise ValueError("Extracted no MCP Servers")
        logger.info(f"Extracted MCP Servers from {url}: {mcp_servers}")

        return [
            types.TextContent(
                type="text",
                text=mcp_servers,
            )
        ]
  • Input schema definition for the 'extract-mcp-servers-from-url' tool: a single required 'url' string property.
    inputSchema={
        "type": "object",
        "properties": {
            "url": {"type": "string"},
        },
        "required": ["url"],
    },
  • Registration of the tool in @server.list_tools(): defines name, description, and input schema.
    return [
        types.Tool(
            name="extract-mcp-servers-from-url",
            description="Extract MCP Servers from a URL",
            inputSchema={
                "type": "object",
                "properties": {
                    "url": {"type": "string"},
                },
                "required": ["url"],
            },
        ),
  • Helper function that extracts MCP servers from content using an OpenAI LLM with a custom prompt and a JSON-object response format (a hypothetical sketch of the prompt template follows this list).
    async def extract_mcp_servers_from_content(content: str) -> str | None:
        client = OpenAI(
            api_key=os.getenv("OPENAI_API_KEY"),
            base_url=os.getenv("OPENAI_BASE_URL"),
        )

        user_content = extract_mcp_servers_prompt.format(content=content)
        logger.info(f"Extract prompt: {user_content}")

        chat_completion = client.chat.completions.create(
            messages=[
                {
                    "role": "user",
                    "content": user_content,
                }
            ],
            model=os.getenv("OPENAI_MODEL"),
            response_format={"type": "json_object"},
        )

        return chat_completion.choices[0].message.content
  • Helper function that fetches raw content from a URL by invoking the external 'mcp-server-fetch' server through an MCP client session (an assumed server_params definition follows this list).
    async def call_fetch_tool(url: str):
        async with stdio_client(server_params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                result = await session.call_tool(
                    "fetch",
                    arguments={
                        "url": url,
                        "max_length": 100000,
                        "raw": True,
                    },
                )
                return result.content[0].text
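
The extraction helper formats extract_mcp_servers_prompt with the fetched page content and relies on the JSON-object response format. The actual template is not included in this excerpt; the following is only a hypothetical sketch of what such a prompt might look like, not the repository's prompt.

    # Hypothetical prompt template (assumption); doubled braces keep the JSON
    # example literal when .format(content=...) is applied.
    extract_mcp_servers_prompt = """\
    You are given the raw content of a web page. Extract every MCP server
    mentioned in it and respond with a JSON object of the form:
    {{"mcp_servers": [{{"name": "...", "url": "...", "description": "..."}}]}}

    Content:
    {content}
    """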
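
call_fetch_tool depends on a module-level server_params that describes how to launch the external fetch server; it is not part of the excerpt above. A plausible definition, assuming the reference mcp-server-fetch is run via uvx (the exact command in the repository may differ):

    from mcp import StdioServerParameters

    # Assumed configuration for the external fetch server used by call_fetch_tool.
    server_params = StdioServerParameters(
        command="uvx",
        args=["mcp-server-fetch"],
    )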

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/chatmcp/mcp-server-collector'
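
The same lookup from Python, using only the standard library:

    import json
    import urllib.request

    # Fetch this server's directory entry from the Glama MCP API.
    api_url = "https://glama.ai/api/mcp/v1/servers/chatmcp/mcp-server-collector"
    with urllib.request.urlopen(api_url) as response:
        server_info = json.load(response)

    print(server_info)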

If you have feedback or need assistance with the MCP directory API, please join our Discord server.