---
name: technical
description: MCP/FastMCP best practices, progressive disclosure, middleware, context management. Use when working on MCP architecture, tool design, or Logfire observability.
---

# Technical MCP Expert

Research-First Implementation skill for MCP/FastMCP best practices.

## When to Use

Invoke this skill when working on:

- MCP server architecture decisions
- Tool design and progressive disclosure
- Context management optimization
- FastMCP middleware and composition
- Logfire observability patterns

## Key Resources

### FastMCP

- [FastMCP Docs](https://gofastmcp.com/) - Official documentation
- [FastMCP GitHub](https://github.com/jlowin/fastmcp) - Source and examples
- [Tool Transformation](https://www.jlowin.dev/blog/fastmcp-2-8-tool-transformation) - Making tools LLM-friendly
- [Middleware](https://www.jlowin.dev/blog/fastmcp-2-9-middleware) - Cross-cutting concerns

### MCP Protocol

- [MCP Best Practices](https://modelcontextprotocol.info/docs/best-practices/) - Architectural guidelines
- [MCP Spec](https://modelcontextprotocol.io/specification/2025-11-25) - Latest protocol specification

### Progressive Disclosure

- [Lazy MCP Proxy](https://github.com/voicetreelab/lazy-mcp) - 85-93% context reduction
- Pattern: Discover → Load → Execute (don't expose all tools upfront)
- Use a gateway tool for 5+ related capabilities (see the sketch at the end of this file)

## FastMCP Patterns

### Server Composition

```python
# Dynamic composition (live link to the mounted server)
mcp.mount("customers", customer_server)

# Static composition (one-time snapshot of the imported server)
mcp.import_server(shared_utils)
```

### Dependency Injection

```python
@asynccontextmanager
async def get_client() -> AsyncIterator[Client]:
    client = Client()
    try:
        yield client
    finally:
        await client.close()

@mcp.tool
async def my_tool(client: Client = Depends(get_client)) -> str:
    return await client.fetch()
```

### Context Usage

```python
@mcp.tool
async def long_task(query: str, ctx: Context) -> str:
    await ctx.info(f"Processing: {query}")
    await ctx.report_progress(0.5)
    return "result"
```

## Logfire Integration

```python
import logfire

logfire.configure()

# Instrumentation available for:
# - HTTP requests (logfire.instrument_httpx())
# - Async operations
# - Custom spans with @logfire.instrument
```

## Prompt

When invoked, research the specific technical question using the resources above, then provide:

1. Current best practice recommendation
2. Code example if applicable
3. Links to relevant documentation
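
## Gateway Pattern Sketch

The Progressive Disclosure section above describes the Discover → Load → Execute gateway pattern only in prose. The sketch below shows one way to express it as three FastMCP tools; the `CAPABILITIES` registry and its handlers are hypothetical placeholders, not part of FastMCP or any real server.

```python
# Sketch only: Discover -> Load -> Execute gateway with FastMCP.
# CAPABILITIES and its handlers are hypothetical placeholders.
from fastmcp import FastMCP

mcp = FastMCP("capability-gateway")

# name -> (description, handler); in a real server these might be
# generated from downstream MCP servers or an internal API catalog.
CAPABILITIES = {
    "create_invoice": (
        "Create a draft invoice for a customer",
        lambda args: {"status": "draft", **args},
    ),
    "list_customers": (
        "List customers matching a free-text query",
        lambda args: {"customers": [], "query": args.get("query", "")},
    ),
}

@mcp.tool
def list_capabilities() -> list[str]:
    """Discover: expose only capability names, keeping the tool list small."""
    return sorted(CAPABILITIES)

@mcp.tool
def describe_capability(name: str) -> str:
    """Load: fetch the full description for one capability on demand."""
    description, _ = CAPABILITIES[name]
    return description

@mcp.tool
def execute_capability(name: str, arguments: dict) -> dict:
    """Execute: dispatch to the handler registered for the capability."""
    _, handler = CAPABILITIES[name]
    return handler(arguments)

if __name__ == "__main__":
    mcp.run()
```

Three generic tools stand in for N capability-specific ones, so the model only spends context on descriptions it actually requests.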
