main.py

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from dotenv import load_dotenv
import os

from mcp_tools.tools import chat_with_llm

load_dotenv()

app = FastAPI(title="MCP Ollama Server")

# -----------------------
# CORS
# -----------------------
# Read allowed origins from the CORS_ORIGINS env var (comma-separated),
# dropping empty entries so an unset variable doesn't produce [""].
origins = [o.strip() for o in os.getenv("CORS_ORIGINS", "").split(",") if o.strip()]

app.add_middleware(
    CORSMiddleware,
    allow_origins=origins,
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)


# Simple health-check route
@app.get("/aa")
async def test():
    return {"msg": "hello"}


# -----------------------
# MCP Tool Endpoint
# -----------------------
@app.post("/mcp/tools/llm_chat")
async def llm_chat(payload: dict):
    """MCP-compatible tool endpoint: forwards the prompt to the LLM helper."""
    prompt = payload.get("prompt")
    return await chat_with_llm(prompt)
```
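The imported chat_with_llm helper from mcp_tools/tools.py is not shown on this page. As a minimal sketch, assuming the server talks to a local Ollama instance through its /api/generate endpoint (the URL, env var names, and model name here are illustrative, not the repository's actual code), it could look like this:

```python
# mcp_tools/tools.py -- hypothetical sketch, not the repository's actual code
import os

import httpx

# Assumptions: Ollama running on its default local port; model name is illustrative.
OLLAMA_URL = os.getenv("OLLAMA_URL", "http://localhost:11434")
OLLAMA_MODEL = os.getenv("OLLAMA_MODEL", "llama3")


async def chat_with_llm(prompt: str) -> dict:
    """Send the prompt to Ollama's /api/generate endpoint and return its reply."""
    if not prompt:
        return {"error": "missing 'prompt' in payload"}
    async with httpx.AsyncClient(timeout=60.0) as client:
        resp = await client.post(
            f"{OLLAMA_URL}/api/generate",
            json={"model": OLLAMA_MODEL, "prompt": prompt, "stream": False},
        )
        resp.raise_for_status()
        data = resp.json()
    # With stream=False, Ollama returns the generated text in the "response" field
    return {"response": data.get("response")}
```

With the app started (for example via uvicorn main:app and assuming uvicorn's default port 8000), the tool endpoint can be exercised directly:

```bash
curl -X POST 'http://localhost:8000/mcp/tools/llm_chat' \
  -H 'Content-Type: application/json' \
  -d '{"prompt": "Hello"}'
```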


MCP directory API

We provide all the information about MCP servers via our MCP API.

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/lhmpaiPublic/McpLLMServer'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.