ollama_embed

Generate numerical vector embeddings from text inputs to enable semantic search, similarity analysis, and machine learning applications.

Instructions

Generate embeddings for text input. Returns numerical vector representations.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| model | Yes | Name of the model to use | |
| input | Yes | Text input. For batch processing, provide a JSON-encoded array of strings, e.g., ["text1", "text2"] | |
| format | No | Output format: json or markdown | json |
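
For illustration, the objects below show arguments a client might pass for a single string and for a batch. The model name nomic-embed-text is only an example; substitute any embedding model available to your Ollama instance.

    // Hypothetical arguments for a single-string embedding request.
    const singleInputArgs = {
      model: 'nomic-embed-text', // example model name, not prescribed by the tool
      input: 'The quick brown fox jumps over the lazy dog',
      format: 'json',
    };

    // Batch processing: the array is passed as a JSON-encoded string,
    // which the input schema parses back into an array of strings.
    const batchInputArgs = {
      model: 'nomic-embed-text',
      input: JSON.stringify(['first document', 'second document']),
    };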

Implementation Reference

  • Core handler function that generates embeddings using Ollama's embed API and formats the response.

    export async function embedWithModel(
      ollama: Ollama,
      model: string,
      input: string | string[],
      format: ResponseFormat
    ): Promise<string> {
      const response = await ollama.embed({
        model,
        input,
      });
      return formatResponse(JSON.stringify(response), format);
    }
  • Zod input validation schema for the ollama_embed tool, handling string or JSON array input.

    export const EmbedInputSchema = z.object({
      model: z.string().min(1),
      input: z.string().transform((val, ctx) => {
        const trimmed = val.trim();
        // If it looks like a JSON array, try to parse it
        if (trimmed.startsWith('[') && trimmed.endsWith(']')) {
          try {
            const parsed = JSON.parse(trimmed);
            if (Array.isArray(parsed)) {
              // Validate all elements are strings
              const allStrings = parsed.every((item) => typeof item === 'string');
              if (allStrings) {
                return parsed as string[];
              } else {
                ctx.addIssue({
                  code: z.ZodIssueCode.custom,
                  message: 'Input is a JSON array but contains non-string elements',
                });
                return z.NEVER;
              }
            }
          } catch (e) {
            // Failed to parse as JSON, treat as plain string
          }
        }
        // Return as plain string
        return trimmed;
      }),
      format: ResponseFormatSchema.default('json'),
    });
  • Tool definition and registration exporting the name, description, JSON input schema, and handler for the autoloader (a usage sketch follows this list).

    export const toolDefinition: ToolDefinition = {
      name: 'ollama_embed',
      description:
        'Generate embeddings for text input. Returns numerical vector representations.',
      inputSchema: {
        type: 'object',
        properties: {
          model: {
            type: 'string',
            description: 'Name of the model to use',
          },
          input: {
            type: 'string',
            description:
              'Text input. For batch processing, provide a JSON-encoded array of strings, e.g., ["text1", "text2"]',
          },
          format: {
            type: 'string',
            enum: ['json', 'markdown'],
            default: 'json',
          },
        },
        required: ['model', 'input'],
      },
      handler: async (ollama: Ollama, args: Record<string, unknown>, format: ResponseFormat) => {
        const validated = EmbedInputSchema.parse(args);
        return embedWithModel(ollama, validated.model, validated.input, format);
      },
    };
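
As noted above, here is a minimal usage sketch tying these pieces together. It assumes the EmbedInputSchema and embedWithModel exports shown in this section, a local Ollama server on the default port, a hypothetical module path, and an example model name; it is a sketch, not part of the published API.

    import { Ollama } from 'ollama';
    // Hypothetical import path for the exports shown above.
    import { EmbedInputSchema, embedWithModel } from './tools/embed';

    const ollama = new Ollama({ host: 'http://127.0.0.1:11434' });

    // The schema accepts either a plain string or a JSON-encoded array of strings;
    // here the batch form is parsed back into string[].
    const validated = EmbedInputSchema.parse({
      model: 'nomic-embed-text', // example model name
      input: '["first document", "second document"]',
    });

    // embedWithModel wraps ollama.embed() and serializes the response
    // (format defaults to 'json' via the schema).
    const result = await embedWithModel(
      ollama,
      validated.model,
      validated.input,
      validated.format
    );
    console.log(result); // JSON string containing the returned embeddings

Because the transform returns either a string or a string array, embedWithModel can pass the validated value straight through to ollama.embed, which accepts both forms.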

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/rawveg/ollama-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.