
mcp_ollama_chat_completion

Generates responses from Ollama models through an OpenAI-compatible chat completion API. Part of Ontology MCP, which integrates AI models with ontology data queries.

Instructions

Generates responses using an OpenAI-compatible chat completion API.

Input Schema

Name         Required  Description                        Default
messages     Yes       Array of conversation messages
model        Yes       Name of the Ollama model to use
temperature  No        Sampling temperature (0-2)
timeout      No        Timeout in milliseconds            60000

Input Schema (JSON Schema)

{ "properties": { "messages": { "description": "대화 메시지 배열", "items": { "properties": { "content": { "type": "string" }, "role": { "enum": [ "system", "user", "assistant" ], "type": "string" } }, "required": [ "role", "content" ], "type": "object" }, "type": "array" }, "model": { "description": "사용할 Ollama 모델 이름", "type": "string" }, "temperature": { "description": "샘플링 온도(0-2)", "maximum": 2, "minimum": 0, "type": "number" }, "timeout": { "description": "타임아웃(밀리초 단위, 기본값: 60000)", "minimum": 1000, "type": "number" } }, "required": [ "model", "messages" ], "type": "object" }

Implementation Reference

  • MCP tool handler that delegates to ollamaService.chatCompletion and wraps the result in the ToolResponse format (a client-side invocation sketch follows this list).
    async handler(args: any): Promise<ToolResponse> {
      const result = await ollamaService.chatCompletion(args);
      return {
        content: [{ type: 'text' as const, text: result }],
      };
    }
  • Input schema defining the parameters for the mcp_ollama_chat_completion tool: model, messages, temperature, timeout.
    inputSchema: {
      type: 'object',
      properties: {
        model: { type: 'string', description: 'Name of the Ollama model to use' },
        messages: {
          type: 'array',
          items: {
            type: 'object',
            properties: {
              role: { type: 'string', enum: ['system', 'user', 'assistant'] },
              content: { type: 'string' },
            },
            required: ['role', 'content'],
          },
          description: 'Array of conversation messages',
        },
        temperature: { type: 'number', description: 'Sampling temperature (0-2)', minimum: 0, maximum: 2 },
        timeout: { type: 'number', description: 'Timeout in milliseconds (default: 60000)', minimum: 1000 },
      },
      required: ['model', 'messages'],
    },
  • Core OllamaService.chatCompletion method: POSTs to the /api/chat endpoint and formats the response as OpenAI-compatible chat completion JSON.
    async chatCompletion(args: {
      model: string;
      messages: Array<{ role: string; content: string }>;
      temperature?: number;
      timeout?: number;
    }): Promise<string> {
      try {
        // Recent Ollama APIs support the chat message format directly
        const response = await axios.post<OllamaChatResponse>(
          this.getApiUrl('chat'),
          {
            model: args.model,
            messages: args.messages,
            stream: false,
            temperature: args.temperature,
          },
          { timeout: args.timeout || DEFAULT_TIMEOUT }
        );

        // Format the response as an OpenAI-compatible chat completion
        return JSON.stringify(
          {
            id: 'chatcmpl-' + Date.now(),
            object: 'chat.completion',
            created: Math.floor(Date.now() / 1000),
            model: args.model,
            choices: [
              {
                index: 0,
                message: {
                  role: 'assistant',
                  content: response.data.message.content,
                },
                finish_reason: 'stop',
              },
            ],
          },
          null,
          2
        );
      } catch (error) {
        if (axios.isAxiosError(error)) {
          throw new McpError(
            ErrorCode.InternalError,
            `Ollama API error: ${error.response?.data?.error || error.message}`
          );
        }
        throw new McpError(ErrorCode.InternalError, `Chat completion failed: ${formatError(error)}`);
      }
    }
  • src/index.ts:24-54 (registration)
    MCP server capabilities declaration registering mcp_ollama_chat_completion as a supported tool.
    capabilities: {
      tools: {
        mcp_sparql_execute_query: true,
        mcp_sparql_update: true,
        mcp_sparql_list_repositories: true,
        mcp_sparql_list_graphs: true,
        mcp_sparql_get_resource_info: true,
        mcp_ollama_run: true,
        mcp_ollama_show: true,
        mcp_ollama_pull: true,
        mcp_ollama_list: true,
        mcp_ollama_rm: true,
        mcp_ollama_chat_completion: true,
        mcp_ollama_status: true,
        mcp_http_request: true,
        mcp_openai_chat: true,
        mcp_openai_image: true,
        mcp_openai_tts: true,
        mcp_openai_transcribe: true,
        mcp_openai_embedding: true,
        mcp_gemini_generate_text: true,
        mcp_gemini_chat_completion: true,
        mcp_gemini_list_models: true,
        mcp_gemini_generate_images: false,
        mcp_gemini_generate_image: false,
        mcp_gemini_generate_videos: false,
        mcp_gemini_generate_multimodal_content: false,
        mcp_imagen_generate: false,
        mcp_gemini_create_image: false,
        mcp_gemini_edit_image: false,
      },
    },
  • src/index.ts:73-123 (registration)
    Generic MCP CallToolRequestSchema handler that locates the tool by name in the tools array and executes its handler.
    server.setRequestHandler(CallToolRequestSchema, async (request: any, _extra: any) => {
      try {
        const tool = tools.find(t => t.name === request.params.name);
        if (!tool) {
          return {
            content: [{ type: "text" as const, text: `Unknown tool: ${request.params.name}` }],
          } as ToolResponse;
        }

        // Validate arguments only when the tool declares required parameters
        if (tool.inputSchema.required && tool.inputSchema.required.length > 0) {
          const args = request.params.arguments || {};
          const missingArgs = tool.inputSchema.required.filter(
            arg => !(arg in args)
          );
          if (missingArgs.length > 0) {
            return {
              content: [{ type: "text" as const, text: `Missing required arguments: ${missingArgs.join(', ')}` }],
            } as ToolResponse;
          }
        }

        // Execute the tool, using a type assertion for the arguments object
        const response = await tool.handler(request.params.arguments || {} as any);

        // Attach metadata when the request provides it
        if (request.params._meta) {
          return { ...response, _meta: request.params._meta };
        }
        return response;
      } catch (error) {
        console.error('Tool execution error:', error);
        return {
          content: [{
            type: "text" as const,
            text: error instanceof Error ? error.message : 'An unexpected error occurred',
          }],
        } as ToolResponse;
      }
    }) as any; // type assertion for MCP SDK compatibility
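
To see the pieces above working together, here is a minimal client-side sketch. It assumes the server is compiled to build/index.js and communicates over stdio, and it uses the Client and StdioClientTransport classes from the @modelcontextprotocol/sdk TypeScript package; the entry-point path and model name are assumptions, not values taken from the repository.

  import { Client } from '@modelcontextprotocol/sdk/client/index.js';
  import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

  // Spawn the Ontology MCP server over stdio (entry-point path is an assumption)
  const transport = new StdioClientTransport({
    command: 'node',
    args: ['build/index.js'],
  });

  const client = new Client({ name: 'example-client', version: '1.0.0' });
  await client.connect(transport);

  // Invoke the tool; the handler returns the OpenAI-compatible completion as a JSON string
  const result = await client.callTool({
    name: 'mcp_ollama_chat_completion',
    arguments: {
      model: 'llama3.2', // placeholder model name
      messages: [{ role: 'user', content: 'Hello!' }],
    },
  });

  // result.content[0].text holds the serialized chat.completion object
  console.log((result as any).content[0].text);

On success, the text payload is the JSON produced by OllamaService.chatCompletion above: an object with id, object: 'chat.completion', created, and model fields, plus a single choices entry whose message carries the assistant's reply.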
