
ollama_chat

Chat with local language models using conversation messages, system prompts, multi-turn dialogues, and tool calling capabilities.

Instructions

Chat with a model using conversation messages. Supports system messages, multi-turn conversations, tool calling, and generation options.

Input Schema

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| model | Yes | Name of the model to use | |
| messages | Yes | Array of chat messages | |
| tools | No | Tools that the model can call (optional). Provide as a JSON array of tool objects. | |
| options | No | Generation options (optional). Provide as a JSON object with settings such as temperature, top_p, etc. | |
| format | No | Output format for the response, either `json` or `markdown`. | json |
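The schema above can be illustrated with a minimal sketch of an arguments object for an `ollama_chat` call. The model name and message contents here are purely illustrative; note that the optional `tools` and `options` fields are passed as JSON strings, per the input schema.

```typescript
// Hypothetical example arguments for an ollama_chat invocation.
const args = {
  model: "llama3.2", // illustrative model name
  messages: [
    { role: "system", content: "You are a concise assistant." },
    { role: "user", content: "What is the capital of France?" },
  ],
  // Optional fields are supplied as JSON strings, as the schema requires.
  options: JSON.stringify({ temperature: 0.2, top_p: 0.9 }),
  format: "json",
};

console.log(args.messages.length); // 2
```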

Implementation Reference

  • Core implementation of the ollama_chat tool that executes the Ollama chat API call, handles responses including tool calls, and applies response formatting.
```typescript
export async function chatWithModel(
  ollama: Ollama,
  model: string,
  messages: ChatMessage[],
  options: GenerationOptions,
  format: ResponseFormat,
  tools?: Tool[]
): Promise<string> {
  // Determine format parameter for the Ollama API
  let ollamaFormat: 'json' | undefined = undefined;
  if (format === ResponseFormat.JSON) {
    ollamaFormat = 'json';
  }

  const response = await ollama.chat({
    model,
    messages,
    tools,
    options,
    format: ollamaFormat,
    stream: false,
  });

  // Extract content with fallback
  const content = response.message.content ?? '';
  const tool_calls = response.message.tool_calls;

  // If the response includes tool calls, include them in the output
  if (tool_calls && tool_calls.length > 0) {
    const fullResponse = { content, tool_calls };
    return formatResponse(JSON.stringify(fullResponse), format);
  }

  return formatResponse(content, format);
}
```
  • ToolDefinition export that registers the 'ollama_chat' tool, including its name, description, input schema, and handler function which performs input validation.
```typescript
export const toolDefinition: ToolDefinition = {
  name: 'ollama_chat',
  description:
    'Chat with a model using conversation messages. Supports system messages, multi-turn conversations, tool calling, and generation options.',
  inputSchema: {
    type: 'object',
    properties: {
      model: {
        type: 'string',
        description: 'Name of the model to use',
      },
      messages: {
        type: 'array',
        description: 'Array of chat messages',
        items: {
          type: 'object',
          properties: {
            role: { type: 'string', enum: ['system', 'user', 'assistant'] },
            content: { type: 'string' },
            images: { type: 'array', items: { type: 'string' } },
          },
          required: ['role', 'content'],
        },
      },
      tools: {
        type: 'string',
        description: 'Tools that the model can call (optional). Provide as JSON array of tool objects.',
      },
      options: {
        type: 'string',
        description: 'Generation options (optional). Provide as JSON object with settings like temperature, top_p, etc.',
      },
      format: {
        type: 'string',
        enum: ['json', 'markdown'],
        default: 'json',
      },
    },
    required: ['model', 'messages'],
  },
  handler: async (ollama: Ollama, args: Record<string, unknown>, format: ResponseFormat) => {
    const validated = ChatInputSchema.parse(args);
    return chatWithModel(
      ollama,
      validated.model,
      validated.messages,
      validated.options || {},
      format,
      validated.tools.length > 0 ? validated.tools : undefined
    );
  },
};
```
  • Zod schema (ChatInputSchema) for validating inputs to the ollama_chat tool, including model, messages, tools, options, used in the handler for parsing args.
```typescript
/** Schema for ollama_chat tool */
export const ChatInputSchema = z.object({
  model: z.string().min(1),
  messages: z.array(ChatMessageSchema).min(1),
  tools: parseJsonOrDefault([]).pipe(z.array(ToolSchema)),
  options: parseJsonOrDefault({}).pipe(GenerationOptionsSchema),
  format: ResponseFormatSchema.default('json'),
  stream: z.boolean().default(false),
});
```


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/rawveg/ollama-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.