mcp_ollama_pull
Download models from the Ollama registry to integrate with Ontology MCP, enabling AI-driven querying and manipulation of ontology data via SPARQL endpoints.
Instructions
Downloads a model from the Ollama registry.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| name | Yes | Name of the model to download | (none) |
Input Schema (JSON Schema)
```json
{
  "properties": {
    "name": {
      "description": "Name of the model to download",
      "type": "string"
    }
  },
  "required": [
    "name"
  ],
  "type": "object"
}
```
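A client can check arguments against this schema before dispatching the call. A minimal sketch; the `validatePullArgs` helper below is hypothetical and not part of Ontology MCP:

```typescript
// Hypothetical client-side check mirroring the mcp_ollama_pull input schema:
// an object with a required, non-empty string "name".
function validatePullArgs(args: unknown): args is { name: string } {
  if (typeof args !== 'object' || args === null) return false;
  const name = (args as Record<string, unknown>)['name'];
  return typeof name === 'string' && name.length > 0;
}
```

For example, `validatePullArgs({ name: 'llama3' })` passes, while `validatePullArgs({})` fails because the required `name` property is missing.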
Implementation Reference
- src/tools/index.ts:344-353 (handler) — MCP tool handler that invokes `ollamaService.pullModel` and returns the result as a `ToolResponse`.

```typescript
async handler(args: any): Promise<ToolResponse> {
  const result = await ollamaService.pullModel(args);
  return {
    content: [
      { type: 'text' as const, text: result }
    ]
  };
}
```
- src/tools/index.ts:334-343 (schema) — Input schema for the `mcp_ollama_pull` tool, requiring a model name.

```typescript
inputSchema: {
  type: 'object',
  properties: {
    name: {
      type: 'string',
      description: 'Name of the model to download'
    }
  },
  required: ['name']
},
```
- src/index.ts:33 (registration) — Tool listed in the MCP server capabilities declaration.

```typescript
mcp_ollama_pull: true,
```
- Core helper in `OllamaService` that performs the actual model pull via the Ollama API with a streaming response.

```typescript
async pullModel(args: { name: string }): Promise<string> {
  try {
    const response = await axios.post(
      this.getApiUrl('pull'),
      { name: args.name },
      { responseType: 'stream' }
    );

    // Collect the download progress stream as text
    let result = '';
    for await (const chunk of response.data) {
      result += chunk.toString();
    }
    return result;
  } catch (error) {
    if (axios.isAxiosError(error)) {
      throw new McpError(
        ErrorCode.InternalError,
        `Ollama API error: ${error.response?.data?.error || error.message}`
      );
    }
    throw new McpError(ErrorCode.InternalError, `Failed to download model: ${formatError(error)}`);
  }
}
```
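Ollama's `/api/pull` endpoint streams progress as newline-delimited JSON objects, so the raw text collected above can be post-processed into individual status messages. A sketch under that assumption; the `parsePullProgress` helper is illustrative and not part of `OllamaService`:

```typescript
// Parse newline-delimited JSON progress output from an Ollama pull stream
// into the sequence of "status" fields, skipping blank or malformed lines.
function parsePullProgress(raw: string): string[] {
  const statuses: string[] = [];
  for (const line of raw.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed) continue;
    try {
      const obj = JSON.parse(trimmed) as { status?: string };
      if (typeof obj.status === 'string') statuses.push(obj.status);
    } catch {
      // Ignore chunks that are not complete JSON lines.
    }
  }
  return statuses;
}
```

Returning only the final status (e.g. `success`) instead of the full concatenated stream would make the tool's text response considerably shorter for large models.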