ollama_client.py
import os

import httpx

# Base URL of the local Ollama server; override via the OLLAMA_BASE_URL env var.
OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")


async def generate(prompt: str):
    # Generous timeout: local model inference can take well over a minute.
    async with httpx.AsyncClient(timeout=120) as client:
        r = await client.post(
            f"{OLLAMA_BASE_URL}/api/generate",
            json={
                "model": "llama3",
                "prompt": prompt,
                "stream": False,  # return the full completion in one JSON body
            },
        )
        r.raise_for_status()
        return r.json()["response"]
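
A minimal sketch of how this client might be driven from a script, assuming the file above is importable as ollama_client; the prompt text is just an illustration:

import asyncio

from ollama_client import generate


async def main():
    # Hypothetical prompt; any string works.
    reply = await generate("Explain what an MCP server is in one sentence.")
    print(reply)


asyncio.run(main())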


MCP directory API

We provide all the information about MCP servers in the directory via our MCP API:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/lhmpaiPublic/McpLLMServer'
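
The same endpoint can be queried programmatically. A minimal sketch using httpx, matching the client code above; since the response schema is not documented here, the JSON is printed as-is:

import httpx

# Same endpoint as the curl example above.
url = "https://glama.ai/api/mcp/v1/servers/lhmpaiPublic/McpLLMServer"

resp = httpx.get(url, timeout=30)
resp.raise_for_status()
# Dump the raw JSON; the exact field layout is an assumption left to the API docs.
print(resp.json())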

If you have feedback or need assistance with the MCP directory API, please join our Discord server.