
Ollama MCP Server

by NightTrek
commit_message.txt (782 B)
feat: add streaming support for chat completions

Implemented real-time streaming capability for the chat completion API to:
- Enable progressive output of long responses
- Improve user experience with immediate feedback
- Reduce perceived latency for large generations
- Support interactive applications

The streaming is implemented using the Server-Sent Events (SSE) protocol:
- Added SSE transport handler in OllamaServer
- Modified chat_completion tool to support streaming
- Configured proper response headers and event formatting
- Added streaming parameter to API schema

Testing confirmed successful streaming of:
- Short responses (sonnets)
- Long responses (technical articles)
- Various content types and lengths

Resolves: #123 (Add streaming support for chat completions)
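The commit describes the server forwarding Ollama's streamed output to clients over SSE. As a rough illustration of the upstream half of that flow, the sketch below consumes Ollama's streaming chat endpoint, which returns newline-delimited JSON chunks when `stream` is enabled. The function name `streamChatCompletion`, the model name, and the default port are assumptions for the example and are not taken from this repository's code.

```typescript
// Minimal sketch, assuming a local Ollama instance on the default port.
// Ollama's /api/chat endpoint streams newline-delimited JSON chunks,
// each carrying a partial assistant message and a "done" flag.
async function streamChatCompletion(prompt: string): Promise<void> {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // illustrative model name
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });

  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let buffered = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });

    // Each complete line is one JSON chunk with a partial message.
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) {
        process.stdout.write(chunk.message.content); // progressive output
      }
      if (chunk.done) return;
    }
  }
}

streamChatCompletion("Write a short sonnet about streaming.");
```

In the server itself, each decoded chunk would then be re-emitted to the client as an SSE event rather than written to stdout, which is what enables the progressive output the commit message describes.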

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/NightTrek/Ollama-mcp'
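The same lookup can be made programmatically. This is a minimal sketch using `fetch`; it makes no assumptions about the shape of the returned JSON beyond it being JSON.

```typescript
// Fetch this server's directory entry from the Glama MCP API and print it.
async function getServerInfo(): Promise<void> {
  const res = await fetch(
    "https://glama.ai/api/mcp/v1/servers/NightTrek/Ollama-mcp"
  );
  if (!res.ok) {
    throw new Error(`Request failed: ${res.status} ${res.statusText}`);
  }
  const info = await res.json();
  console.log(JSON.stringify(info, null, 2));
}

getServerInfo();
```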

If you have feedback or need assistance with the MCP directory API, please join our Discord server.