sampleLLM
Generates a text response from a language model via MCP's sampling feature on the Elicitations Demo Server; the caller supplies a prompt and an optional token limit.
Instructions
Samples from an LLM using MCP's sampling feature
Input Schema
Name | Required | Description | Default |
---|---|---|---|
maxTokens | No | Maximum number of tokens to generate | |
prompt | Yes | The prompt to send to the LLM | |
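The schema above can be sketched as a small validation helper. This is an illustrative assumption, not part of the server's code: `build_sample_llm_args` is a hypothetical function that assembles the arguments for a `sampleLLM` tool call, enforcing that `prompt` is required and `maxTokens` is optional.

```python
def build_sample_llm_args(prompt, max_tokens=None):
    # Hypothetical helper mirroring the input schema:
    # "prompt" is required; "maxTokens" is optional.
    if not isinstance(prompt, str) or not prompt:
        raise ValueError("prompt is required and must be a non-empty string")
    args = {"prompt": prompt}
    if max_tokens is not None:
        if not isinstance(max_tokens, int) or max_tokens <= 0:
            raise ValueError("maxTokens must be a positive integer")
        args["maxTokens"] = max_tokens
    return args
```

The resulting dictionary would be passed as the `arguments` field of a `tools/call` request targeting `sampleLLM`.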