
MCP Elicitations Demo Server

by soriat

sampleLLM

Generate text samples from a large language model by providing a prompt and specifying the maximum token limit. Part of the MCP Elicitations Demo Server for dynamic user input collection.

Instructions

Samples from an LLM using MCP's sampling feature

Input Schema

Name       Required  Description                           Default
maxTokens  No        Maximum number of tokens to generate  100
prompt     Yes       The prompt to send to the LLM         —

Input Schema (JSON Schema)

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "additionalProperties": false,
  "properties": {
    "maxTokens": {
      "default": 100,
      "description": "Maximum number of tokens to generate",
      "type": "number"
    },
    "prompt": {
      "description": "The prompt to send to the LLM",
      "type": "string"
    }
  },
  "required": [ "prompt" ],
  "type": "object"
}
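As a minimal sketch, a call to this tool travels as an MCP `tools/call` JSON-RPC request whose `arguments` must satisfy the schema above. The helper function name and the request id below are illustrative, not part of the server's API; only the tool name (`sampleLLM`), the argument names, and the `maxTokens` default of 100 come from the schema.

```python
import json

def build_sample_llm_request(prompt: str, max_tokens: int = 100, request_id: int = 1) -> dict:
    """Build an MCP tools/call request for the sampleLLM tool.

    "prompt" is required by the schema; "maxTokens" falls back to the
    schema default of 100 when not supplied.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "sampleLLM",
            "arguments": {"prompt": prompt, "maxTokens": max_tokens},
        },
    }

req = build_sample_llm_request("Write a haiku about protocols")
print(json.dumps(req, indent=2))
```

An MCP client library would normally construct and send this framing for you; the sketch only makes the wire shape concrete.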


MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/soriat/soria-mcp'
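The same request can be sketched with Python's standard library; the endpoint URL is taken verbatim from the curl example above, and the snippet stays network-free by only constructing the request (an actual `urlopen(req)` call would fetch the server metadata).

```python
from urllib.request import Request

# Equivalent of the curl example, built but not sent.
url = "https://glama.ai/api/mcp/v1/servers/soriat/soria-mcp"
req = Request(url, method="GET")

print(req.get_method(), req.full_url)
```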

If you have feedback or need assistance with the MCP directory API, please join our Discord server.