
ollama_chat

Chat with local AI models using conversation messages, supporting system prompts, multi-turn dialogues, and tool calling for interactive AI applications.

Instructions

Chat with a model using conversation messages. Supports system messages, multi-turn conversations, tool calling, and generation options.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| model | Yes | Name of the model to use | |
| messages | Yes | Array of chat messages | |
| tools | No | Tools that the model can call (optional). Provide as JSON array of tool objects. | |
| options | No | Generation options (optional). Provide as JSON object with settings like temperature, top_p, etc. | |
| format | No | Response format: `json` or `markdown` | json |
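A complete call to `ollama_chat` might be shaped like the sketch below. The model name, tool definition, and option values are illustrative, not taken from the source; the important detail, per the schema above, is that `tools` and `options` are passed as JSON *strings*, which the server parses before calling Ollama.

```typescript
// Hypothetical arguments for the ollama_chat tool (values are examples only).
const args = {
  model: 'llama3.2', // assumed model name
  messages: [
    { role: 'system', content: 'You are a concise assistant.' },
    { role: 'user', content: 'What is the capital of France?' },
  ],
  // Tools are supplied as a JSON string, not a raw array:
  tools: JSON.stringify([
    {
      type: 'function',
      function: {
        name: 'get_weather',
        description: 'Get current weather for a city',
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
          required: ['city'],
        },
      },
    },
  ]),
  // Options are likewise a JSON string:
  options: JSON.stringify({ temperature: 0.2, top_p: 0.9 }),
  format: 'json',
};

// The server parses these string fields during validation:
const parsedTools = JSON.parse(args.tools);
const parsedOptions = JSON.parse(args.options);
console.log(parsedTools.length, parsedOptions.temperature);
```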

Implementation Reference

  • Core implementation of the chat functionality that calls the Ollama chat API, processes responses, handles tool calls, and formats output.
```typescript
export async function chatWithModel(
  ollama: Ollama,
  model: string,
  messages: ChatMessage[],
  options: GenerationOptions,
  format: ResponseFormat,
  tools?: Tool[]
): Promise<string> {
  // Determine format parameter for the Ollama API
  let ollamaFormat: 'json' | undefined = undefined;
  if (format === ResponseFormat.JSON) {
    ollamaFormat = 'json';
  }

  const response = await ollama.chat({
    model,
    messages,
    tools,
    options,
    format: ollamaFormat,
    stream: false,
  });

  // Extract content with fallback
  const content = response.message.content || '';
  const tool_calls = response.message.tool_calls;

  // If the response includes tool calls, include them in the output
  if (tool_calls && tool_calls.length > 0) {
    const fullResponse = { content, tool_calls };
    return formatResponse(JSON.stringify(fullResponse), format);
  }

  return formatResponse(content, format);
}
```
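The tool-call branch above can be exercised in isolation with mocked data. This sketch reimplements just the split between a tool-call response and a plain-content response; the types and the identity formatting are illustrative simplifications, not the project's actual `formatResponse`.

```typescript
// Simplified mock of an Ollama chat message (no server involved).
interface MockMessage {
  content: string;
  tool_calls?: { function: { name: string; arguments: Record<string, unknown> } }[];
}

// Mirrors the branching in chatWithModel: tool calls are wrapped in a JSON
// envelope alongside the content; otherwise the content is returned directly.
function renderChatResult(message: MockMessage): string {
  const content = message.content || '';
  if (message.tool_calls && message.tool_calls.length > 0) {
    return JSON.stringify({ content, tool_calls: message.tool_calls });
  }
  return content;
}

const withTools = renderChatResult({
  content: '',
  tool_calls: [{ function: { name: 'get_weather', arguments: { city: 'Paris' } } }],
});
console.log(withTools); // JSON envelope containing content and tool_calls

const plain = renderChatResult({ content: 'Paris.' });
console.log(plain); // prints "Paris."
```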
  • Tool definition export for 'ollama_chat' including name, description, input schema for MCP, and handler function. Automatically discovered and registered by the autoloader.
```typescript
export const toolDefinition: ToolDefinition = {
  name: 'ollama_chat',
  description:
    'Chat with a model using conversation messages. Supports system messages, multi-turn conversations, tool calling, and generation options.',
  inputSchema: {
    type: 'object',
    properties: {
      model: {
        type: 'string',
        description: 'Name of the model to use',
      },
      messages: {
        type: 'array',
        description: 'Array of chat messages',
        items: {
          type: 'object',
          properties: {
            role: {
              type: 'string',
              enum: ['system', 'user', 'assistant'],
            },
            content: { type: 'string' },
            images: {
              type: 'array',
              items: { type: 'string' },
            },
          },
          required: ['role', 'content'],
        },
      },
      tools: {
        type: 'string',
        description:
          'Tools that the model can call (optional). Provide as JSON array of tool objects.',
      },
      options: {
        type: 'string',
        description:
          'Generation options (optional). Provide as JSON object with settings like temperature, top_p, etc.',
      },
      format: {
        type: 'string',
        enum: ['json', 'markdown'],
        default: 'json',
      },
    },
    required: ['model', 'messages'],
  },
  handler: async (ollama: Ollama, args: Record<string, unknown>, format: ResponseFormat) => {
    const validated = ChatInputSchema.parse(args);
    return chatWithModel(
      ollama,
      validated.model,
      validated.messages,
      validated.options || {},
      format,
      validated.tools.length > 0 ? validated.tools : undefined
    );
  },
};
```
  • Zod input validation schema (ChatInputSchema) for the ollama_chat tool, used in the handler for parsing and validating arguments.
```typescript
/**
 * Schema for the ollama_chat tool
 */
export const ChatInputSchema = z.object({
  model: z.string().min(1),
  messages: z.array(ChatMessageSchema).min(1),
  tools: parseJsonOrDefault([]).pipe(z.array(ToolSchema)),
  options: parseJsonOrDefault({}).pipe(GenerationOptionsSchema),
  format: ResponseFormatSchema.default('json'),
  stream: z.boolean().default(false),
});
```
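The `parseJsonOrDefault` helper is not shown in this excerpt; in the schema it appears to produce a Zod schema that accepts a JSON string and falls back to a default. A simplified plain-function analogue of that parsing behavior, written as an assumption rather than the project's actual helper, might look like:

```typescript
// Hypothetical stand-in for parseJsonOrDefault: accept a JSON string, an
// already-parsed value, or nothing, falling back to the supplied default.
function parseJsonOrDefault<T>(input: unknown, fallback: T): T {
  if (input === undefined || input === null || input === '') {
    return fallback; // absent input uses the default ([] for tools, {} for options)
  }
  if (typeof input !== 'string') {
    return input as T; // already a parsed object/array; pass through
  }
  try {
    return JSON.parse(input) as T;
  } catch {
    return fallback; // malformed JSON falls back (the real helper may reject instead)
  }
}

const opts = parseJsonOrDefault('{"temperature": 0.7}', {});
const tools = parseJsonOrDefault(undefined, []);
console.log(opts, tools);
```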
