Ollama MCP Server

by NewAITees
pyproject.toml

[project]
name = "ollama-mcp-server"
version = "0.1.0"
description = "MCP that communicate with ollama"
readme = "README.md"
requires-python = ">=3.13"
dependencies = [
    "mcp>=1.3.0",
    "requests>=2.28.0",
    "aiohttp>=3.8.0",
    "pydantic>=2.0.0",
    "pydantic-settings>=2.0.0",
]

[[project.authors]]
name = "Kai Kogure"
email = "weizard@gmail.com"

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project.scripts]
ollama-mcp-server = "ollama_mcp_server.__main__:run"
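The [project.scripts] entry registers an ollama-mcp-server console script, so an MCP client can spawn the server as a subprocess over stdio. The following is a minimal sketch using the MCP Python SDK's client session, assuming the package is installed and the entry point is on PATH; the available tools depend on the server's implementation.

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server via its console-script entry point (assumed on PATH).
    params = StdioServerParameters(command="ollama-mcp-server")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            # Perform the MCP initialization handshake.
            await session.initialize()
            # List whatever tools this server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())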

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/NewAITees/ollama-MCP-server'
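The same endpoint can be queried from Python with requests; a minimal sketch, assuming the endpoint returns JSON (the exact response schema is not documented here):

import requests

# Fetch this server's directory entry; the JSON shape is an assumption.
url = "https://glama.ai/api/mcp/v1/servers/NewAITees/ollama-MCP-server"
resp = requests.get(url, timeout=10)
resp.raise_for_status()
print(resp.json())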

If you have feedback or need assistance with the MCP directory API, please join our Discord server.