sampleLLM
Generate text samples from a language model by providing a prompt and an optional token limit, using MCP's sampling feature as exposed by the Genkit MCP server.
Instructions
Samples from an LLM using MCP's sampling feature.
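
The listing does not include a usage example, so here is a minimal sketch of how the tool might be called, assuming the standard @modelcontextprotocol/sdk client API; the server launch command and client name are placeholders, not part of this listing:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Placeholder command for launching the Genkit MCP server; substitute your actual setup.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["your-genkit-mcp-server"], // hypothetical package name
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Invoke the sampleLLM tool with the two documented arguments.
const result = await client.callTool({
  name: "sampleLLM",
  arguments: {
    prompt: "Write a haiku about the Model Context Protocol",
    maxTokens: 100,
  },
});

console.log(result.content);
```

Because sampleLLM relies on MCP sampling, the connected client must advertise the sampling capability and answer the server's sampling/createMessage requests; without that, the tool call is expected to fail.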
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| maxTokens | No | Maximum number of tokens to generate | |
| prompt | Yes | The prompt to send to the LLM | |
Input Schema (JSON Schema)
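The JSON Schema itself was not captured in this listing; the sketch below is reconstructed from the table above, so the exact types and wording are assumptions:

```json
{
  "type": "object",
  "properties": {
    "prompt": {
      "type": "string",
      "description": "The prompt to send to the LLM"
    },
    "maxTokens": {
      "type": "number",
      "description": "Maximum number of tokens to generate"
    }
  },
  "required": ["prompt"]
}
```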
Related Tools
- @66julienmartin/MCP-server-Qwen_Max
- @PhialsBasement/KoboldCPP-MCP-Server
- @DMontgomery40/deepseek-mcp-server