
text_generation

Generate text completions using AI models to extend prompts, create content, or answer questions through DeepInfra's API.

Instructions

Generate text completion using DeepInfra OpenAI-compatible API.

Input Schema

Name    Required  Description  Default
model   No                     meta-llama/Llama-2-7b-chat-hf (via MODEL_TEXT_GENERATION env var)
prompt  Yes
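The schema above can also be expressed as a JSON Schema object. The following sketch is an inferred rendering, not taken from the server's source: it assumes both fields are strings and that only prompt is required, as the table indicates.

```python
# Hypothetical JSON Schema for the tool's input, inferred from the table
# above: "prompt" is required, "model" is optional. The string types are
# an assumption, not stated in the listing.
INPUT_SCHEMA = {
    "type": "object",
    "properties": {
        "model": {"type": "string"},
        "prompt": {"type": "string"},
    },
    "required": ["prompt"],
}

def validate(args: dict) -> bool:
    """Minimal check that required keys are present (illustration only)."""
    return all(key in args for key in INPUT_SCHEMA["required"])
```

With this sketch, `validate({"prompt": "Hello"})` passes, while a call supplying only `model` does not.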

Implementation Reference

  • Conditional registration and implementation of the 'text_generation' tool handler. Uses DeepInfra's OpenAI-compatible completions API to generate text from a prompt using a configurable model.
    if "all" in ENABLED_TOOLS or "text_generation" in ENABLED_TOOLS:

        @app.tool()
        async def text_generation(prompt: str) -> str:
            """Generate text completion using DeepInfra OpenAI-compatible API."""
            model = DEFAULT_MODELS["text_generation"]
            try:
                response = await client.completions.create(
                    model=model,
                    prompt=prompt,
                    max_tokens=256,
                    temperature=0.7,
                )
                if response.choices:
                    return response.choices[0].text
                else:
                    return "No text generated"
            except Exception as e:
                return f"Error generating text: {type(e).__name__}: {str(e)}"
  • Configuration for the default model used by the text_generation tool.
    "text_generation": os.getenv("MODEL_TEXT_GENERATION", "meta-llama/Llama-2-7b-chat-hf"),
  • Function signature and docstring defining the input schema (prompt: str) and output (str) for the text_generation tool.
    async def text_generation(prompt: str) -> str:
        """Generate text completion using DeepInfra OpenAI-compatible API."""
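The `client` referenced in the handler is presumably an async OpenAI-compatible client pointed at DeepInfra's endpoint; it is not shown in this listing. The default-model configuration line is shown, though, and its environment-override behavior can be sketched and checked in isolation. The override value below is hypothetical, chosen only to illustrate the pattern.

```python
import os

# Hypothetical override: any model name DeepInfra serves could go here.
os.environ["MODEL_TEXT_GENERATION"] = "meta-llama/Meta-Llama-3-8B-Instruct"

# Same lookup pattern as the configuration line above: the environment
# variable wins; otherwise the hard-coded Llama-2 default is used.
DEFAULT_MODELS = {
    "text_generation": os.getenv("MODEL_TEXT_GENERATION", "meta-llama/Llama-2-7b-chat-hf"),
}
```

Because the lookup happens at import time, the variable must be set before the server process starts for an override to take effect.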

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/phuihock/mcp-deeinfra'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.