chat
Engage in real-time conversations with LibreModel (Gigi), with control over the message input and sampling parameters such as temperature, top-k/top-p, and token limits. Ideal for interactive chat with a local LLM instance via the LibreModel MCP Server.
Instructions
Have a conversation with LibreModel (Gigi)
Input Schema
| Name | Required | Description | Default |
| --- | --- | --- | --- |
| max_tokens | No | Maximum number of tokens to generate | |
| message | Yes | Your message to LibreModel | |
| system_prompt | No | Optional system prompt to prefix the conversation | |
| temperature | No | Sampling temperature (0.0-2.0) | |
| top_k | No | Top-k sampling parameter | |
| top_p | No | Nucleus (top-p) sampling parameter | |
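As a sketch, a client could invoke this tool with a JSON-RPC `tools/call` request whose arguments follow the input schema above. The example below builds such a request in Python; the request `id`, the sample argument values, and the transport framing are assumptions, not part of the schema.

```python
import json

# Hypothetical tools/call request for the "chat" tool.
# Argument names match the input schema; values are illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,  # assumed request id
    "method": "tools/call",
    "params": {
        "name": "chat",
        "arguments": {
            "message": "Hello, Gigi!",            # required
            "system_prompt": "You are helpful.",  # optional
            "temperature": 0.7,                   # optional, 0.0-2.0
            "max_tokens": 256,                    # optional
        },
    },
}

# Serialize for sending over the MCP transport (e.g. stdio).
payload = json.dumps(request)
print(payload)
```

Only `message` is required; the remaining arguments can be omitted to fall back to the server's defaults.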