
AI-Powered Jira MCP Server

by vkhanna2004
server.py (1.25 kB)
from fastapi import FastAPI
from fastapi.responses import RedirectResponse
from fastapi.staticfiles import StaticFiles
from pydantic import BaseModel

from mcp_server.llm_orchestrator import orchestrator

app = FastAPI()

# Serve the UI at /static (not at the root, so API routes stay reachable)
app.mount("/static", StaticFiles(directory="static"), name="static")


class ChatRequest(BaseModel):
    message: str


@app.post("/chat")
async def chat(request: ChatRequest):
    # Hand the user's message to the LLM orchestrator and return its result
    result = orchestrator.process_message(request.message)
    return {"result": result}


@app.get("/")
def root():
    # Redirect the root path to the static UI
    return RedirectResponse(url="/static/index.html")

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/vkhanna2004/jira-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.