chat
Have real-time conversations with LibreModel (Gigi), controlling the message, sampling temperature, token limit, and other parameters. Useful for interactive chat with a local LLM instance via the LibreModel MCP Server.
Instructions
Have a conversation with LibreModel (Gigi)
Input Schema
Name | Required | Description | Default |
---|---|---|---|
max_tokens | No | Maximum tokens to generate (1–2048) | 512 |
message | Yes | Your message to LibreModel | |
system_prompt | No | Optional system prompt to prefix the conversation | "" (empty) |
temperature | No | Sampling temperature (0.0–2.0) | 0.7 |
top_k | No | Top-k sampling parameter (≥ 1) | 40 |
top_p | No | Nucleus sampling parameter (0.0–1.0) | 0.95 |
Input Schema (JSON Schema)
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "additionalProperties": false,
  "properties": {
    "max_tokens": {
      "default": 512,
      "description": "Maximum tokens to generate",
      "maximum": 2048,
      "minimum": 1,
      "type": "number"
    },
    "message": {
      "description": "Your message to LibreModel",
      "type": "string"
    },
    "system_prompt": {
      "default": "",
      "description": "Optional system prompt to prefix the conversation",
      "type": "string"
    },
    "temperature": {
      "default": 0.7,
      "description": "Sampling temperature (0.0-2.0)",
      "maximum": 2,
      "minimum": 0,
      "type": "number"
    },
    "top_k": {
      "default": 40,
      "description": "Top-k sampling parameter",
      "minimum": 1,
      "type": "number"
    },
    "top_p": {
      "default": 0.95,
      "description": "Nucleus sampling parameter",
      "maximum": 1,
      "minimum": 0,
      "type": "number"
    }
  },
  "required": [
    "message"
  ],
  "type": "object"
}
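To illustrate how the schema's defaults and bounds behave, here is a minimal client-side sketch (not part of the LibreModel MCP Server; the function name `prepare_chat_args` is hypothetical). It applies the same defaults and numeric ranges the JSON Schema above declares before the arguments would be sent in a tool call:

```python
# Hypothetical helper mirroring the "chat" tool's JSON Schema:
# defaults and bounds are copied from the schema above.

DEFAULTS = {
    "max_tokens": 512,
    "system_prompt": "",
    "temperature": 0.7,
    "top_k": 40,
    "top_p": 0.95,
}

BOUNDS = {
    "max_tokens": (1, 2048),
    "temperature": (0.0, 2.0),
    "top_k": (1, float("inf")),  # schema sets a minimum only
    "top_p": (0.0, 1.0),
}

def prepare_chat_args(arguments: dict) -> dict:
    """Fill in schema defaults and enforce the declared numeric ranges."""
    if not isinstance(arguments.get("message"), str):
        raise ValueError("'message' is required and must be a string")
    args = {**DEFAULTS, **arguments}
    for key, (lo, hi) in BOUNDS.items():
        if not (lo <= args[key] <= hi):
            raise ValueError(f"{key}={args[key]} is outside [{lo}, {hi}]")
    return args

# Only the required field is supplied; everything else falls back to defaults.
args = prepare_chat_args({"message": "Hello, Gigi!"})
```

Note that because the schema sets `"additionalProperties": false`, the real server would also reject any keys beyond the six listed above.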