
ollama_generate

Generate AI text responses using local Ollama models for tasks requiring natural language processing within macOS applications.

Instructions

Generates a response using a local Ollama model. Use for tasks that require AI text processing.

Input Schema

Name     Required  Description                                               Default
model    No        Ollama model name (e.g., 'llama3.2', 'deepseek-r1:8b')    llama3.2
prompt   Yes       Prompt for the model                                      (none)
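
For illustration, here is a minimal client-side call to this tool, assuming the official @modelcontextprotocol/sdk package and a stdio launch of the server; the launch command ("node build/index.js") and client name are hypothetical and should be adjusted to the actual build output.

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // Hypothetical launch command; point this at the server's built entry file.
    const transport = new StdioClientTransport({
      command: "node",
      args: ["build/index.js"],
    });

    const client = new Client({ name: "example-client", version: "1.0.0" });
    await client.connect(transport);

    // 'prompt' is required; 'model' is optional and defaults to 'llama3.2'.
    const result = await client.callTool({
      name: "ollama_generate",
      arguments: { prompt: "Summarize MCP in one sentence." },
    });
    console.log(result.content);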

Implementation Reference

  • Handler function for the 'ollama_generate' tool in the TypeScript MCP server. Sends a POST request to the Ollama API's /api/generate endpoint with the model and prompt, handles errors, and returns a formatted MCP response (see the note after this list on the non-streaming request).
    private async ollamaGenerate(prompt: string, model: string = "llama3.2") {
      try {
        const response = await fetch(`${OLLAMA_API_URL}/api/generate`, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ model, prompt, stream: false }),
        });
        if (!response.ok) {
          const errorText = await response.text();
          throw new Error(`Ollama API error: ${response.status} ${errorText}`);
        }
        const data = (await response.json()) as { response?: string };
        return {
          content: [
            {
              type: "text",
              text: data.response || "No response from model",
            },
          ],
        };
      } catch (error) {
        // Check whether the Ollama server is reachable at all
        if (error instanceof TypeError && error.message.includes("fetch")) {
          throw new Error(
            `Failed to connect to Ollama server (${OLLAMA_API_URL}). Make sure Ollama is running: ollama serve`
          );
        }
        throw new Error(
          `Ollama error: ${error instanceof Error ? error.message : String(error)}`
        );
      }
    }
  • Handler function for the 'ollama_generate' tool in the Python MCP server. Sends a POST request to the Ollama API's /api/generate endpoint using the requests library, with a 30-second timeout, and maps connection and other errors to exceptions (also covered in the note after this list).
    def ollama_generate(prompt: str, model: str = "llama3.2") -> str:
        """Generates response via Ollama API"""
        try:
            response = requests.post(
                f"{OLLAMA_API_URL}/api/generate",
                json={"model": model, "prompt": prompt, "stream": False},
                timeout=30,
            )
            response.raise_for_status()
            data = response.json()
            return data.get("response", "No response from model")
        except requests.exceptions.ConnectionError:
            raise Exception(
                f"Failed to connect to Ollama server ({OLLAMA_API_URL}). "
                "Make sure Ollama is running: ollama serve"
            )
        except Exception as e:
            raise Exception(f"Ollama error: {str(e)}")
  • Input schema definition for the 'ollama_generate' tool in the tools list response, with properties for prompt (required) and model (optional, default 'llama3.2').
    {
      name: "ollama_generate",
      description:
        "Generates a response using a local Ollama model. Use for tasks that require AI text processing",
      inputSchema: {
        type: "object",
        properties: {
          model: {
            type: "string",
            description:
              "Ollama model name (e.g., 'llama3.2', 'deepseek-r1:8b'). Default 'llama3.2'",
            default: "llama3.2",
          },
          prompt: {
            type: "string",
            description: "Prompt for the model",
          },
        },
        required: ["prompt"],
      },
    },
  • Input schema definition for the 'ollama_generate' tool in the Python get_tools() function, matching the TypeScript version.
    "name": "ollama_generate", "description": "Generates response using local Ollama model. Use for tasks requiring AI text processing", "inputSchema": { "type": "object", "properties": { "model": { "type": "string", "description": "Ollama model name (e.g., 'llama3.2', 'deepseek-r1:8b'). Default 'llama3.2'", "default": "llama3.2", }, "prompt": { "type": "string", "description": "Prompt for the model", }, }, "required": ["prompt"], },
  • src/index.ts:329-333 (registration)
    Dispatch/registration case in the CallToolRequestSchema handler that routes 'ollama_generate' calls to the ollamaGenerate method (a note on argument validation follows this list).
    case "ollama_generate": return await this.ollamaGenerate( args?.prompt as string, (args?.model as string) || "llama3.2" );

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/TrueOleg/MCP-expirements'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.