# check_llm_status

Checks whether your local llama.cpp server is running and reachable, so that offline diagram generation from natural language can proceed.
## Instructions
Check whether the local llama.cpp server is running and reachable.
## Input Schema

This tool takes no arguments.
## Output Schema

| Name | Required | Description | Default |
|---|---|---|---|
| result | Yes | Human-readable status message string. | |
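For reference, the `result` string takes one of two forms, copied verbatim from the handler shown in the Implementation Reference:

```text
# when the server is reachable:
llama.cpp server is running at localhost:8080

# when it is not:
llama.cpp server is NOT running.
Start it with:
  ./build/bin/llama-server -m models/your-model.gguf --port 8080 -c 8192
```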
## Implementation Reference
- **src/excalidraw_mcp/server.py:74-82 (handler):** The tool handler function for `check_llm_status`. It checks whether the llama.cpp server is running by delegating to `check_llm_health()` and returns a human-readable status message.

  ```python
  async def check_llm_status() -> str:
      """Check whether the local llama.cpp server is running and reachable."""
      if await check_llm_health():
          return "llama.cpp server is running at localhost:8080"
      return (
          "llama.cpp server is NOT running.\n"
          "Start it with:\n"
          "  ./build/bin/llama-server -m models/your-model.gguf --port 8080 -c 8192"
      )
  ```

- **src/excalidraw_mcp/server.py:73-74 (registration):** Registered as an MCP tool via the `@mcp.tool()` decorator on the `check_llm_status` function:

  ```python
  @mcp.tool()
  async def check_llm_status() -> str:
  ```

- **(helper):** The `check_llm_health()` helper performs an HTTP GET against `localhost:8080/health` and returns `True` if the status is 200:

  ```python
  async def check_llm_health() -> bool:
      """Return True if the llama.cpp server is reachable."""
      try:
          async with httpx.AsyncClient(timeout=5.0) as client:
              r = await client.get(f"{LLAMA_BASE_URL}/health")
              return r.status_code == 200
      except Exception:
          return False
  ```

- **src/excalidraw_mcp/server.py:74-82 (schema):** The schema is defined implicitly by the function signature: no input parameters, a string return value. The docstring serves as the tool description. (These are the same lines as the handler snippet above.)
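To verify the server out-of-band (e.g. before starting the MCP server), the same health probe can be sketched with only the standard library. This is a minimal sketch, not part of the project's code: the `is_llama_up` name is hypothetical, and the `localhost:8080` default matches the address shown in the handler above.

```python
import urllib.error
import urllib.request


def is_llama_up(base_url: str = "http://localhost:8080", timeout: float = 5.0) -> bool:
    """Return True if the llama.cpp /health endpoint answers with HTTP 200.

    Mirrors the httpx-based check_llm_health() helper, but uses urllib so the
    script runs with no third-party dependencies. (is_llama_up is a
    hypothetical name for this sketch.)
    """
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, DNS failure, timeout, etc. all mean "not up".
        return False


if __name__ == "__main__":
    print("up" if is_llama_up() else "down")
```

Like the helper it mirrors, this treats any connection error as "server not running" rather than raising, which is what makes it safe to call unconditionally at startup.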