kobold_complete

Generate text completions from a KoboldAI language model through an OpenAI-compatible API endpoint.

Instructions

Text completion (OpenAI-compatible)

Input Schema

| Name        | Required | Description | Default               |
| ----------- | -------- | ----------- | --------------------- |
| apiUrl      | No       |             | http://localhost:5001 |
| prompt      | Yes      |             |                       |
| max_tokens  | No       |             |                       |
| temperature | No       |             |                       |
| top_p       | No       |             |                       |
| stop        | No       |             |                       |
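For illustration, arguments for a `kobold_complete` call might look like the sketch below. Only the field names come from the schema above; every value is an example, and `apiUrl` falls back to the documented default when omitted.

```typescript
// Hypothetical arguments for a kobold_complete tool call; values are
// examples, only the field names and types come from the input schema.
interface CompletionArgs {
  apiUrl?: string;      // optional, defaults to http://localhost:5001
  prompt: string;       // required
  max_tokens?: number;
  temperature?: number;
  top_p?: number;
  stop?: string[];
}

const args: CompletionArgs = {
  prompt: "Once upon a time",
  max_tokens: 64,
  temperature: 0.7,
  top_p: 0.9,
  stop: ["\n\n"],
};

// Apply the documented default when the caller omits apiUrl.
const apiUrl = args.apiUrl ?? "http://localhost:5001";
```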

Implementation Reference

  • Generic handler for all POST-based tools, including kobold_complete. It validates the arguments against the tool's Zod schema, splits off apiUrl, forwards the remaining fields to the KoboldAI API endpoint via makeRequest, and returns the JSON response.
    if (postEndpoints[name]) {
        const { endpoint, schema } = postEndpoints[name];
        const parsed = schema.safeParse(args);
        if (!parsed.success) {
            throw new Error(`Invalid arguments: ${parsed.error}`);
        }
    
        // Separate the target server URL from the payload forwarded to KoboldAI.
        const { apiUrl, ...requestData } = parsed.data;
        const result = await makeRequest(`${apiUrl}${endpoint}`, 'POST', requestData);
        return {
            content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
            isError: false,
        };
    }
  • Zod schema defining the input parameters for the kobold_complete tool, including prompt, max_tokens, temperature, top_p, and stop sequences.
    const CompletionSchema = BaseConfigSchema.extend({
        prompt: z.string(),
        max_tokens: z.number().optional(),
        temperature: z.number().optional(),
        top_p: z.number().optional(),
        stop: z.array(z.string()).optional(),
    });
  • src/index.ts:266-268 (registration)
    Registers the kobold_complete tool in the ListTools response, providing its name, description, and input schema.
    name: "kobold_complete",
    description: "Text completion (OpenAI-compatible)",
    inputSchema: zodToJsonSchema(CompletionSchema),
  • Maps the kobold_complete tool to its KoboldAI API endpoint '/v1/completions' and references the input schema for validation.
    kobold_complete: { endpoint: '/v1/completions', schema: CompletionSchema },
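The lookup can be sketched as follows. Only the kobold_complete entry is confirmed by the source; resolveUrl is an illustrative helper, not a function from the codebase.

```typescript
// Minimal sketch of the endpoint dispatch table: adding a tool only means
// adding an entry here, and the generic POST handler does the rest.
const postEndpoints: Record<string, { endpoint: string }> = {
  kobold_complete: { endpoint: "/v1/completions" },
};

// Illustrative helper: build the full URL the handler passes to makeRequest.
function resolveUrl(apiUrl: string, name: string): string {
  const entry = postEndpoints[name];
  if (!entry) throw new Error(`Unknown tool: ${name}`);
  return `${apiUrl}${entry.endpoint}`;
}
```

For example, `resolveUrl("http://localhost:5001", "kobold_complete")` yields `http://localhost:5001/v1/completions`.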
  • Utility function that performs HTTP requests to the KoboldAI API, used by the tool handler to proxy requests and handle responses.
    async function makeRequest(url: string, method = 'GET', body: Record<string, unknown> | null = null) {
        const options: RequestInit = {
            method,
            headers: body ? { 'Content-Type': 'application/json' } : undefined,
        };
        
        if (body && method !== 'GET') {
            options.body = JSON.stringify(body);
        }
    
        const response = await fetch(url, options);
        if (!response.ok) {
            throw new Error(`KoboldAI API error: ${response.statusText}`);
        }
        
        return response.json();
    }
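A self-contained usage sketch, with fetch stubbed out because a live KoboldAI server is assumed in production. The stub and its canned response are hypothetical; makeRequest is copied from above so the example runs on its own.

```typescript
// Copy of makeRequest from the implementation reference, for self-containment.
async function makeRequest(url: string, method = 'GET', body: Record<string, unknown> | null = null) {
    const options: RequestInit = {
        method,
        headers: body ? { 'Content-Type': 'application/json' } : undefined,
    };
    if (body && method !== 'GET') {
        options.body = JSON.stringify(body);
    }
    const response = await fetch(url, options);
    if (!response.ok) {
        throw new Error(`KoboldAI API error: ${response.statusText}`);
    }
    return response.json();
}

// Stub fetch to record the request and return a canned OpenAI-style payload
// (hypothetical data, standing in for a real KoboldAI server).
const seen: { url?: string; method?: string } = {};
globalThis.fetch = (async (url: unknown, init?: RequestInit) => {
    seen.url = String(url);
    seen.method = init?.method;
    return new Response(JSON.stringify({ choices: [{ text: " there was a model." }] }), {
        status: 200,
        headers: { 'Content-Type': 'application/json' },
    });
}) as typeof fetch;

const result = await makeRequest('http://localhost:5001/v1/completions', 'POST', {
    prompt: 'Once upon a time',
    max_tokens: 16,
});
```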

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/PhialsBasement/KoboldCPP-MCP-Server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.