sampleLLM
Generate text samples from a large language model by providing a prompt and specifying the maximum token limit. Part of the MCP Elicitations Demo Server for dynamic user input collection.
Instructions
Samples from an LLM using MCP's sampling feature
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| maxTokens | No | Maximum number of tokens to generate | 100 |
| prompt | Yes | The prompt to send to the LLM | |
Input Schema (JSON Schema)
```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "additionalProperties": false,
  "required": ["prompt"],
  "properties": {
    "maxTokens": {
      "type": "number",
      "default": 100,
      "description": "Maximum number of tokens to generate"
    },
    "prompt": {
      "type": "string",
      "description": "The prompt to send to the LLM"
    }
  }
}
```
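As a sketch of how a client might call this tool, the snippet below builds a `tools/call` JSON-RPC request for `sampleLLM`, applying the schema's `maxTokens` default of 100 when the caller omits it. The helper name `build_arguments` is hypothetical, not part of the server's API; the request envelope follows MCP's standard JSON-RPC framing.

```python
import json

# Input schema for the sampleLLM tool, as documented above.
SCHEMA = {
    "type": "object",
    "additionalProperties": False,
    "required": ["prompt"],
    "properties": {
        "maxTokens": {"type": "number", "default": 100},
        "prompt": {"type": "string"},
    },
}

def build_arguments(prompt, max_tokens=None):
    """Build a sampleLLM argument object, filling in the schema
    default for maxTokens when the caller omits it.
    (Hypothetical helper, for illustration only.)"""
    args = {"prompt": prompt}
    args["maxTokens"] = (
        max_tokens
        if max_tokens is not None
        else SCHEMA["properties"]["maxTokens"]["default"]
    )
    return args

# A tools/call request envelope per MCP's JSON-RPC framing.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "sampleLLM",
        "arguments": build_arguments("Write a haiku about autumn"),
    },
}

print(json.dumps(request, indent=2))
```

The server forwards the prompt to the connected client via MCP's sampling feature and returns the generated text in the tool result.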