ollama-client.d.ts • 1.11 kB
export interface OllamaModel {
  name: string;
  model: string;
  modified_at: string;
  size: number;
  digest: string;
  details: {
    parent_model: string;
    format: string;
    family: string;
    families: string[];
    parameter_size: string;
    quantization_level: string;
  };
}

export interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

export interface ChatResponse {
  model: string;
  created_at: string;
  message: ChatMessage;
  done: boolean;
  total_duration?: number;
  load_duration?: number;
  prompt_eval_duration?: number;
  eval_duration?: number;
  eval_count?: number;
}

export declare class OllamaClient {
  private baseUrl;
  constructor(baseUrl?: string);
  listModels(): Promise<OllamaModel[]>;
  chat(model: string, messages: ChatMessage[]): Promise<ChatResponse>;
  pullModel(model: string): Promise<void>;
  deleteModel(model: string): Promise<void>;
  generateResponse(model: string, prompt: string): Promise<string>;
}
//# sourceMappingURL=ollama-client.d.ts.map
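The declaration file only describes the client's shape, not its implementation. A minimal sketch of how `chat` might call Ollama's HTTP API is shown below, assuming the standard Ollama endpoint `POST /api/chat` on `http://localhost:11434`; `buildChatRequest` and `chat` here are hypothetical helpers, not part of the declared class.

```typescript
// Assumption: ChatMessage from the declaration above maps directly onto the
// message objects accepted by Ollama's /api/chat endpoint.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Hypothetical helper: serialize the request body. stream: false asks Ollama
// for one complete JSON response instead of newline-delimited chunks.
function buildChatRequest(model: string, messages: ChatMessage[]): string {
  return JSON.stringify({ model, messages, stream: false });
}

// Hypothetical sketch of the chat() method's behavior.
async function chat(
  baseUrl: string,
  model: string,
  messages: ChatMessage[]
): Promise<unknown> {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: buildChatRequest(model, messages),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  return res.json();
}
```

The same pattern extends to the other declared methods (`GET /api/tags` for `listModels`, `POST /api/pull` for `pullModel`, and so on).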

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/etnlbck/ollama-mcp'
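The same request can be made from TypeScript. This sketch assumes the URL pattern shown in the curl example, `https://glama.ai/api/mcp/v1/servers/<owner>/<server>`, and that the endpoint returns JSON; `buildServerUrl` and `fetchServerInfo` are hypothetical helper names.

```typescript
// Hypothetical helper: compose a directory API URL from the owner and server
// names, following the pattern in the curl example above.
function buildServerUrl(owner: string, server: string): string {
  const base = 'https://glama.ai/api/mcp/v1/servers';
  return `${base}/${encodeURIComponent(owner)}/${encodeURIComponent(server)}`;
}

// Hypothetical helper: fetch a server's directory entry, assuming the
// endpoint responds with a JSON body.
async function fetchServerInfo(owner: string, server: string): Promise<unknown> {
  const res = await fetch(buildServerUrl(owner, server));
  if (!res.ok) throw new Error(`Directory API returned ${res.status}`);
  return res.json();
}
```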

If you have feedback or need assistance with the MCP directory API, please join our Discord server.