ollama_llm.py
import requests

def ollama_chat(prompt: str) -> str:
    # Send a non-streaming generate request to a local Ollama server.
    body = {"model": "llama3.1:8b", "prompt": prompt, "stream": False}
    r = requests.post("http://localhost:11434/api/generate", json=body, timeout=120)
    r.raise_for_status()
    # The generate endpoint returns the completed text under the "response" key.
    return r.json().get("response", "")
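A quick usage sketch, assuming an Ollama server is running locally on the default port and the llama3.1:8b model has already been pulled; the example prompt is only illustrative.

if __name__ == "__main__":
    # Assumes `ollama serve` is running and `ollama pull llama3.1:8b` has completed.
    print(ollama_chat("Explain what an MCP server is in one sentence."))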

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/jamalexfo/mcp-api-tools'
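The same endpoint can also be queried from Python; the sketch below assumes it returns a JSON object describing the server, so the exact fields should be checked against the actual response.

import requests

# Minimal sketch: fetch this server's entry from the MCP directory API.
url = "https://glama.ai/api/mcp/v1/servers/jamalexfo/mcp-api-tools"
resp = requests.get(url, timeout=30)
resp.raise_for_status()
server_info = resp.json()
print(server_info)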

If you have feedback or need assistance with the MCP directory API, please join our Discord server.