
EpicMe MCP

by epicweb-dev

sampleLLM

Generate text responses from an LLM by submitting a prompt and an optional token limit, using the Model Context Protocol (MCP) sampling feature on the EpicMe MCP server.

Instructions

Samples from an LLM using MCP's sampling feature
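
With MCP sampling, the server does not call an LLM itself; it sends a sampling request back to the connected client, which runs the completion and returns the result. As a rough illustration, the sketch below builds the `sampling/createMessage` JSON-RPC request a server would issue; the field names follow the MCP specification, but the helper function name is hypothetical and the exact envelope a given server emits may differ.

```python
import json

def build_sampling_request(prompt, max_tokens=100, request_id=1):
    """Sketch of the sampling/createMessage request an MCP server sends
    to the client when a tool like sampleLLM needs an LLM completion.
    Field names follow the MCP sampling specification; this helper is
    illustrative, not part of the EpicMe server."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "sampling/createMessage",
        "params": {
            # Sampling requests carry chat-style messages, each with a
            # role and a typed content block.
            "messages": [
                {"role": "user", "content": {"type": "text", "text": prompt}}
            ],
            "maxTokens": max_tokens,
        },
    }

print(json.dumps(build_sampling_request("Write a haiku about journaling"), indent=2))
```

The client is free to apply its own model choice and user-approval flow before fulfilling such a request; the server only describes what it wants sampled.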

Input Schema

Name       Required  Description                            Default
maxTokens  No        Maximum number of tokens to generate   100
prompt     Yes       The prompt to send to the LLM          —

Input Schema (JSON Schema)

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "additionalProperties": false,
  "properties": {
    "maxTokens": {
      "default": 100,
      "description": "Maximum number of tokens to generate",
      "type": "number"
    },
    "prompt": {
      "description": "The prompt to send to the LLM",
      "type": "string"
    }
  },
  "required": ["prompt"],
  "type": "object"
}
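
To invoke the tool, a client wraps these arguments in a standard MCP `tools/call` request. The sketch below builds such a request against the schema above; the JSON-RPC envelope shape comes from the MCP specification, while the helper function itself is a hypothetical illustration (only `prompt` is required, and `maxTokens` defaults to 100 server-side when omitted).

```python
import json

REQUIRED = {"prompt"}  # per the input schema above

def build_sample_llm_call(prompt, max_tokens=None, request_id=1):
    """Build a JSON-RPC 2.0 tools/call request for the sampleLLM tool.
    Illustrative helper; the tool and argument names come from the
    published input schema."""
    arguments = {"prompt": prompt}
    # maxTokens is optional and defaults to 100 on the server side,
    # so include it only when the caller overrides it.
    if max_tokens is not None:
        arguments["maxTokens"] = max_tokens
    missing = REQUIRED - arguments.keys()
    if missing:
        raise ValueError(f"missing required argument(s): {missing}")
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": "sampleLLM", "arguments": arguments},
    }

print(json.dumps(build_sample_llm_call("Write a haiku about journaling", max_tokens=150), indent=2))
```

Because `additionalProperties` is `false` in the schema, any argument beyond `prompt` and `maxTokens` would be rejected by a conforming server.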

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/epicweb-dev/epic-me-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.