LibreModel MCP Server

chat

Engage in real-time conversation with LibreModel (Gigi). Each call sends a message along with tunable generation parameters such as temperature, top-k/top-p sampling, and a token limit. Ideal for interactive chat with a local LLM instance via the LibreModel MCP Server.

Instructions

Have a conversation with LibreModel (Gigi)

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| max_tokens | No | Maximum tokens to generate | 512 |
| message | Yes | Your message to LibreModel | (none) |
| system_prompt | No | Optional system prompt to prefix the conversation | "" (empty) |
| temperature | No | Sampling temperature (0.0-2.0) | 0.7 |
| top_k | No | Top-k sampling parameter | 40 |
| top_p | No | Nucleus sampling parameter | 0.95 |
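
For example, the following is a minimal valid arguments object: only message is required, and every other parameter falls back to its schema default (the message text is illustrative).

{
  "message": "Hello, Gigi! Introduce yourself in one sentence."
}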

Input Schema (JSON Schema)

{ "$schema": "http://json-schema.org/draft-07/schema#", "additionalProperties": false, "properties": { "max_tokens": { "default": 512, "description": "Maximum tokens to generate", "maximum": 2048, "minimum": 1, "type": "number" }, "message": { "description": "Your message to LibreModel", "type": "string" }, "system_prompt": { "default": "", "description": "Optional system prompt to prefix the conversation", "type": "string" }, "temperature": { "default": 0.7, "description": "Sampling temperature (0.0-2.0)", "maximum": 2, "minimum": 0, "type": "number" }, "top_k": { "default": 40, "description": "Top-k sampling parameter", "minimum": 1, "type": "number" }, "top_p": { "default": 0.95, "description": "Nucleus sampling parameter", "maximum": 1, "minimum": 0, "type": "number" } }, "required": [ "message" ], "type": "object" }

    MCP directory API

    We provide all the information about MCP servers via our MCP directory API.

    curl -X GET 'https://glama.ai/api/mcp/v1/servers/openconstruct/llama-mcp-server'

    If you have feedback or need assistance with the MCP directory API, please join our Discord server.