sampleLLM
Generates text from an LLM using the Model Context Protocol (MCP) sampling feature on the EpicMe MCP server: submit a prompt and, optionally, a token limit.
Instructions
Samples from an LLM using MCP's sampling feature
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| maxTokens | No | Maximum number of tokens to generate | |
| prompt | Yes | The prompt to send to the LLM | |
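As a sketch of how a client might invoke this tool, the snippet below builds the JSON-RPC `tools/call` request that matches the schema above. The prompt text and `maxTokens` value are illustrative placeholders, not values from the EpicMe server.

```typescript
// Hypothetical MCP "tools/call" request for sampleLLM.
// Field names (prompt, maxTokens) follow the input schema above;
// the argument values themselves are made up for illustration.
const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "sampleLLM",
    arguments: {
      prompt: "Summarize my latest journal entry in one sentence.", // required
      maxTokens: 100, // optional token limit
    },
  },
};

// A client would send this over its MCP transport (e.g. stdio or HTTP).
console.log(JSON.stringify(request, null, 2));
```

On the server side, the sampling feature forwards this prompt back to the connected client's LLM rather than calling a model directly, which is what distinguishes MCP sampling from an ordinary completion API.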