kobold_complete
Generate text completions through the Kobold MCP Server's OpenAI-compatible API endpoint. Provide a prompt and tune parameters such as max_tokens, temperature, top_p, and stop sequences to control the generated text.
Instructions
Text completion (OpenAI-compatible)
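The sketch below shows one way to invoke this tool from a client, using the official `mcp` Python SDK over stdio. The launch command, script path, and choice of SDK are assumptions, not part of this server's documentation; substitute however you actually run the Kobold MCP Server. The argument names come from the input schema below.

```python
# Minimal sketch: calling kobold_complete through an MCP client session.
# Assumes the official `mcp` Python SDK; the server command and path below
# are placeholders (assumptions) -- replace with your actual launch command.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    server = StdioServerParameters(
        command="node",                       # placeholder launch command (assumption)
        args=["path/to/kobold-mcp-server"],   # placeholder script path (assumption)
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "kobold_complete",
                arguments={
                    "prompt": "Write a haiku about winter.",
                    "max_tokens": 64,
                    "temperature": 0.7,
                },
            )
            print(result.content)


asyncio.run(main())
```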
Input Schema
Name | Required | Description | Default
---|---|---|---
apiUrl | No | Base URL of the KoboldCpp API | http://localhost:5001
max_tokens | No | Maximum number of tokens to generate |
prompt | Yes | Text prompt to complete |
stop | No | Stop sequence(s) that end generation |
temperature | No | Sampling temperature |
top_p | No | Nucleus (top-p) sampling threshold |
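For reference, here is a rough sketch of the raw request this tool presumably forwards to the KoboldCpp backend. The `/v1/completions` route is an assumption based on the standard OpenAI-compatible API layout; the base URL is the apiUrl default from the table above.

```python
# Rough sketch of the OpenAI-compatible completion request the tool wraps.
# The /v1/completions path is an assumption (standard OpenAI-compatible layout);
# parameter names and the apiUrl default come from the input schema table.
import requests

api_url = "http://localhost:5001"  # apiUrl (default)

payload = {
    "prompt": "Once upon a time,",  # required
    "max_tokens": 128,              # optional: cap on generated tokens
    "temperature": 0.8,             # optional: sampling temperature
    "top_p": 0.95,                  # optional: nucleus sampling threshold
    "stop": ["\n\n"],               # optional: stop sequence(s)
}

response = requests.post(f"{api_url}/v1/completions", json=payload, timeout=120)
response.raise_for_status()
# Standard OpenAI-style response shape: generated text in choices[0].text
print(response.json()["choices"][0]["text"])
```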
Other Tools from Kobold MCP Server
- kobold_abort
- kobold_chat
- kobold_detokenize
- kobold_generate
- kobold_img2img
- kobold_interrogate
- kobold_last_logprobs
- kobold_max_context_length
- kobold_max_length
- kobold_model_info
- kobold_perf_info
- kobold_sd_models
- kobold_sd_samplers
- kobold_token_count
- kobold_transcribe
- kobold_tts
- kobold_txt2img
- kobold_version
- kobold_web_search
Related Tools
- @pyroprompts/any-chat-completions-mcp
- @PhialsBasement/KoboldCPP-MCP-Server
- @mzxrai/mcp-openai