# sampleLLM
Generate text responses from prompts using the Model Context Protocol's sampling feature to interact with language models.
## Instructions
Samples from an LLM using MCP's sampling feature
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| prompt | Yes | The prompt to send to the LLM | |
| maxTokens | No | Maximum number of tokens to generate | |
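The schema above maps onto MCP's standard `tools/call` request. A minimal sketch in TypeScript of the JSON-RPC payload a client would send to invoke this tool (the `id` and prompt text are illustrative; in practice an MCP client library constructs and transports this message for you):

```typescript
// Illustrative tools/call request invoking sampleLLM.
// "prompt" is required; "maxTokens" is optional per the schema above.
const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "sampleLLM",
    arguments: {
      prompt: "Write a haiku about the sea.", // required
      maxTokens: 100,                         // optional
    },
  },
};

console.log(JSON.stringify(request.params));
```

On receipt, the server forwards the prompt to the client via a `sampling/createMessage` request, and the sampled completion comes back in the tool result's content.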