We provide all the information about MCP servers via our MCP API.
curl -X GET 'https://glama.ai/api/mcp/v1/servers/dstreefkerk/ms-sentinel-mcp-server'
# LLM Instructions Get Tool
**Tool Name:** `llm_instructions_get`
## Overview
Retrieves the LLM usage instructions for the Sentinel MCP Server. This tool should be called before all other tools to understand LLM-specific guidelines and requirements.
## Parameters
- None
## Output
- `content` (str): Raw markdown content of the LLM instructions file (typically `docs/llm_instructions.md`).
- On error, returns a dict containing an `error` (str) message instead.
## Example Requests
### Get LLM usage instructions
```json
{}
```
## Example Output
```json
{
  "content": "# LLM Usage Instructions\n\n- Use fictional placeholders for all workspace details...\n..."
}
```
## Error Handling
- Returns `error` if the instructions file cannot be read.
## MCP Compliance
- Inherits from `MCPToolBase`.
- Implements `async def run(self, ctx, **kwargs)`.
- Registered in `register_tools()`.
- Uses robust error handling.
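The compliance notes above can be sketched in code. This is a minimal, hypothetical illustration of how a tool with this contract might look — the class name, file path, and base-class interface are assumptions based on the description, not the server's actual implementation:

```python
# Illustrative sketch of an llm_instructions_get-style tool.
# Assumptions: the instructions live at docs/llm_instructions.md, and the
# real MCPToolBase exposes an `async def run(self, ctx, **kwargs)` hook.
from pathlib import Path

INSTRUCTIONS_PATH = Path("docs/llm_instructions.md")  # assumed location


class LlmInstructionsGetTool:
    """Returns the raw markdown instructions, or an error dict on failure."""

    async def run(self, ctx, **kwargs):
        try:
            # Success case: return the raw markdown under the `content` key.
            return {"content": INSTRUCTIONS_PATH.read_text(encoding="utf-8")}
        except OSError as exc:
            # Error case: the documented dict with an `error` string.
            return {"error": f"Could not read instructions file: {exc}"}
```

Note that both outcomes are returned as plain dicts rather than raised, matching the documented output shapes (`content` on success, `error` on failure).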