mcp_ollama_pull

Download models from the Ollama registry to integrate with Ontology MCP, enabling AI-driven querying and manipulation of ontology data via SPARQL endpoints.

Instructions

Downloads a model from the Ollama registry.

Input Schema

| Name | Required | Description                   | Default |
| ---- | -------- | ----------------------------- | ------- |
| name | Yes      | Name of the model to download |         |
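Given this schema, an MCP client invokes the tool with a standard `tools/call` request. A minimal sketch, assuming the model name `llama3` as an example argument:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "mcp_ollama_pull",
    "arguments": { "name": "llama3" }
  }
}
```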

Implementation Reference

  • MCP tool handler function that invokes ollamaService.pullModel and returns the result as ToolResponse.
    async handler(args: any): Promise<ToolResponse> {
      const result = await ollamaService.pullModel(args);
      return {
        content: [
          {
            type: 'text' as const,
            text: result
          }
        ]
      };
    }
  • Input schema for mcp_ollama_pull tool requiring a model name.
    inputSchema: {
      type: 'object',
      properties: {
        name: {
          type: 'string',
          description: 'Name of the model to download'
        }
      },
      required: ['name']
    },
  • src/index.ts:33-33 (registration)
    Tool listed in MCP server capabilities for availability declaration.
    mcp_ollama_pull: true,
  • Core helper function in OllamaService that handles the actual model pull via Ollama API with streaming response.
    async pullModel(args: { name: string }): Promise<string> {
      try {
        const response = await axios.post(
          this.getApiUrl('pull'),
          {
            name: args.name,
          },
          {
            responseType: 'stream',
          }
        );
    
        // Collect the download progress stream as text
        let result = '';
        for await (const chunk of response.data) {
          result += chunk.toString();
        }
        return result;
      } catch (error) {
        if (axios.isAxiosError(error)) {
          throw new McpError(
            ErrorCode.InternalError,
            `Ollama API error: ${error.response?.data?.error || error.message}`
          );
        }
        throw new McpError(ErrorCode.InternalError, `Failed to download the model: ${formatError(error)}`);
      }
    }
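The `result` string returned above is the raw concatenation of Ollama's streaming `/api/pull` response, which is newline-delimited JSON: one object per line, each with a `status` field and optional `completed`/`total` byte counts. A minimal sketch of turning that raw text into a readable progress summary; `summarizePull` is a hypothetical helper, not part of the service above:

```typescript
interface PullProgress {
  status: string;
  completed?: number;
  total?: number;
}

// Parse the NDJSON stream text and return each distinct status
// once, in order of first appearance.
function summarizePull(raw: string): string[] {
  const statuses: string[] = [];
  for (const line of raw.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed) continue;
    try {
      const event = JSON.parse(trimmed) as PullProgress;
      if (statuses[statuses.length - 1] !== event.status) {
        statuses.push(event.status);
      }
    } catch {
      // Ignore partial or garbled lines at chunk boundaries
    }
  }
  return statuses;
}

const raw =
  '{"status":"pulling manifest"}\n' +
  '{"status":"downloading","completed":512,"total":1024}\n' +
  '{"status":"downloading","completed":1024,"total":1024}\n' +
  '{"status":"success"}\n';
console.log(summarizePull(raw));
// → ["pulling manifest", "downloading", "success"]
```

Collapsing consecutive duplicate statuses keeps the summary short, since Ollama emits many `downloading` events as bytes arrive.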
