sampleLLM
Generate text samples from a large language model by providing a prompt and optionally specifying a maximum token limit. Part of the MCP Elicitations Demo Server for dynamic user input collection.
Instructions
Samples from an LLM using MCP's sampling feature
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| maxTokens | No | Maximum number of tokens to generate | |
| prompt | Yes | The prompt to send to the LLM | |
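As a sketch of how a client might invoke this tool, the snippet below builds a JSON-RPC `tools/call` request carrying the two schema fields above. The request shape follows the MCP tools-call convention; the helper name and argument values are illustrative assumptions, and the transport (stdio, HTTP, etc.) is left out.

```typescript
// Hypothetical helper: build a JSON-RPC 2.0 "tools/call" request for the
// sampleLLM tool. Only `prompt` is required; `maxTokens` is optional,
// matching the input schema above.
interface SampleLLMArguments {
  prompt: string;      // the prompt to send to the LLM
  maxTokens?: number;  // optional cap on generated tokens
}

function buildSampleLLMRequest(id: number, args: SampleLLMArguments) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: {
      name: "sampleLLM",
      arguments: args,
    },
  };
}

// Example payload for a single sampling call.
const request = buildSampleLLMRequest(1, {
  prompt: "Write a haiku about protocols.",
  maxTokens: 100,
});
console.log(JSON.stringify(request, null, 2));
```

The server forwards the prompt to the client's sampling capability and returns the generated text in the tool result.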