Apify MCP Server Template

by Anwesh43
test_services.py (593 B)
from services.ollama_service import OllamaService
from dotenv import load_dotenv
from models.message import Message

load_dotenv()

ollamaService = OllamaService()

# print(ollamaService.getModels())
# print(ollamaService.getModelDetails("gemma3:12b"))
# print(ollamaService.generateResponse("gemma3:12b", "Give a short story involving boat and old man in 100 words"))

messages = [
    Message(role="system", content="You are a helpful ai agent that will help in calculation"),
    Message(role="user", content="What is 2 + 2"),
]
print(ollamaService.chatResponse("gemma3:12b", messages=messages))
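The script above imports `Message` and `OllamaService` from modules that are not shown on this page. A minimal sketch of what they might look like, assuming the standard Ollama REST API (`POST /api/chat` with `"stream": false`); the names mirror the script, but the repo's actual implementation may differ:

```python
# Hypothetical sketch of the imported helpers; the real repo code is not
# shown here. Assumes a local Ollama server on the default port 11434.
import json
import urllib.request
from dataclasses import dataclass


@dataclass
class Message:
    role: str      # "system", "user", or "assistant"
    content: str


class OllamaService:
    def __init__(self, base_url: str = "http://localhost:11434"):
        self.base_url = base_url

    def chatResponse(self, model: str, messages: list[Message]) -> str:
        # Serialize messages into the shape the Ollama chat endpoint expects.
        payload = json.dumps({
            "model": model,
            "messages": [{"role": m.role, "content": m.content} for m in messages],
            "stream": False,
        }).encode()
        req = urllib.request.Request(
            f"{self.base_url}/api/chat",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["message"]["content"]
```

With `"stream": false`, Ollama returns the whole reply in one JSON object, so the service can hand back `message.content` directly instead of assembling streamed chunks.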


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Anwesh43/ollama-apify-mcp'
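The same request can be made from Python. A small sketch, assuming only the endpoint path shown in the curl example above (the response fields are not documented here, so the parsed JSON is left uninspected):

```python
# Hypothetical client for the Glama MCP directory API, based on the
# curl example above. Only the URL pattern is taken from this page.
import json
import urllib.request

API_BASE = "https://glama.ai/api/mcp/v1/servers"


def server_url(owner: str, repo: str) -> str:
    """Build the directory API URL for a given MCP server."""
    return f"{API_BASE}/{owner}/{repo}"


url = server_url("Anwesh43", "ollama-apify-mcp")
# Uncomment to perform the actual network call:
# data = json.loads(urllib.request.urlopen(url).read())
```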

If you have feedback or need assistance with the MCP directory API, please join our Discord server.