Glama

sampleLLM

Generate text responses from prompts using the Model Context Protocol's sampling feature to interact with language models.

Instructions

Samples from an LLM using MCP's sampling feature

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| prompt | Yes | The prompt to send to the LLM | — |
| maxTokens | No | Maximum number of tokens to generate | — |
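Under the hood, a tool like this issues an MCP `sampling/createMessage` request back to the client, passing the `prompt` and `maxTokens` inputs along. A minimal sketch of that JSON-RPC request is below; the method name and message shape follow the MCP sampling specification, while the default token limit and request `id` are illustrative assumptions, not values from this server.

```python
import json


def build_sampling_request(prompt: str, max_tokens: int = 100) -> dict:
    """Build a JSON-RPC request for MCP's sampling/createMessage method.

    The default max_tokens of 100 is an assumption for illustration;
    the actual default used by this server is not documented above.
    """
    return {
        "jsonrpc": "2.0",
        "id": 1,  # illustrative request id
        "method": "sampling/createMessage",
        "params": {
            "messages": [
                {"role": "user", "content": {"type": "text", "text": prompt}}
            ],
            "maxTokens": max_tokens,
        },
    }


request = build_sampling_request("Write a haiku about MCP", max_tokens=64)
print(json.dumps(request, indent=2))
```

The client receives this request, runs the prompt through its configured language model, and returns the generated text, which the server then surfaces as the tool's result.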


MCP directory API

All information about listed MCP servers is available via our MCP directory API. For example:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/epicweb-dev/epic-me-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.