Ollama MCP Server

by NightTrek

list

Retrieve available local AI models from Ollama to manage and run them within MCP-powered applications.

Instructions

List models

Input Schema

No arguments
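
Because the tool takes no arguments, invoking it from an MCP client only requires the tool name and an empty arguments object. The following is a minimal sketch using the TypeScript MCP SDK client; the server command and build path are placeholder assumptions for illustration:

    import { Client } from '@modelcontextprotocol/sdk/client/index.js';
    import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

    // Spawn the Ollama MCP server over stdio (command and path are placeholders).
    const transport = new StdioClientTransport({
      command: 'node',
      args: ['build/index.js'],
    });

    const client = new Client(
      { name: 'example-client', version: '1.0.0' },
      { capabilities: {} }
    );
    await client.connect(transport);

    // Call the 'list' tool with no arguments.
    const result = await client.callTool({ name: 'list', arguments: {} });
    console.log(result.content); // [{ type: 'text', text: '<output of ollama list>' }]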

Implementation Reference

  • The main handler for the 'list' tool. It runs the 'ollama list' command via execAsync, captures stdout/stderr, and returns the output as text content; failures are wrapped in an McpError. A sketch of the execAsync and formatError helpers this excerpt assumes follows this list.
    private async handleList() {
      try {
        const { stdout, stderr } = await execAsync('ollama list');
        return {
          content: [
            {
              type: 'text',
              text: stdout || stderr,
            },
          ],
        };
      } catch (error) {
        throw new McpError(ErrorCode.InternalError, `Failed to list models: ${formatError(error)}`);
      }
    }
  • The declaration of the 'list' tool returned in the ListTools response, specifying its name, description, and an empty inputSchema (no parameters required).
    {
      name: 'list',
      description: 'List models',
      inputSchema: {
        type: 'object',
        properties: {},
        additionalProperties: false,
      },
    },
  • src/index.ts:268-269 (registration)
    Registration/dispatch in the CallToolRequestSchema handler's switch statement, which calls the handleList() method when the tool name is 'list'. A sketch of the surrounding handler registration, covering both the declaration above and this dispatch, follows this list.
    case 'list':
      return await this.handleList();
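
The handleList excerpt above relies on two helpers defined elsewhere in src/index.ts: execAsync and formatError. A minimal sketch of what they plausibly look like; the formatError body in particular is an assumption, not the project's actual code:

    import { exec } from 'node:child_process';
    import { promisify } from 'node:util';

    // Promisified child_process.exec, used by handleList to run the Ollama CLI.
    const execAsync = promisify(exec);

    // Hypothetical error formatter; the real helper may include more detail.
    function formatError(error: unknown): string {
      return error instanceof Error ? error.message : String(error);
    }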
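
The tool declaration and the dispatch case above sit inside the server's request handlers. A minimal sketch of that registration, assuming the standard TypeScript MCP SDK Server API and that the code runs inside the server class's setup method (as the this.handleList() call suggests); the other tools are omitted:

    import {
      CallToolRequestSchema,
      ErrorCode,
      ListToolsRequestSchema,
      McpError,
    } from '@modelcontextprotocol/sdk/types.js';

    // Advertise the 'list' tool (among others) in the ListTools response.
    this.server.setRequestHandler(ListToolsRequestSchema, async () => ({
      tools: [
        {
          name: 'list',
          description: 'List models',
          inputSchema: { type: 'object', properties: {}, additionalProperties: false },
        },
        // ...declarations for the other Ollama tools
      ],
    }));

    // Dispatch incoming tool calls by name; 'list' routes to handleList().
    this.server.setRequestHandler(CallToolRequestSchema, async (request) => {
      switch (request.params.name) {
        case 'list':
          return await this.handleList();
        // ...cases for the other tools
        default:
          throw new McpError(ErrorCode.MethodNotFound, `Unknown tool: ${request.params.name}`);
      }
    });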

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/NightTrek/Ollama-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.