
MCP Connection Hub

by claymore666
openai.py (2.74 kB)
""" OpenAI-compatible API endpoints for MCP Connection Hub. This module provides API endpoints compatible with the OpenAI API format, allowing integration with OpenWebUI and other tools designed for OpenAI. """ from fastapi import APIRouter, Depends, HTTPException, Body from pydantic import BaseModel, Field from typing import List, Dict, Any, Optional from app.core.auth import get_api_key router = APIRouter() # Model definitions (will be expanded in future milestones) class FunctionDefinition(BaseModel): """Function definition compatible with OpenAI format.""" name: str description: str parameters: Dict[str, Any] class Message(BaseModel): """Message format compatible with OpenAI chat API.""" role: str content: str name: Optional[str] = None function_call: Optional[Dict[str, Any]] = None class CompletionRequest(BaseModel): """Request format for chat completions API.""" model: str messages: List[Message] temperature: Optional[float] = 1.0 top_p: Optional[float] = 1.0 n: Optional[int] = 1 stream: Optional[bool] = False max_tokens: Optional[int] = None presence_penalty: Optional[float] = 0 frequency_penalty: Optional[float] = 0 functions: Optional[List[FunctionDefinition]] = None function_call: Optional[Any] = None class CompletionResponse(BaseModel): """Response format for chat completions API.""" id: str object: str = "chat.completion" created: int model: str choices: List[Dict[str, Any]] usage: Dict[str, int] # Chat completion endpoint (placeholder implementation) @router.post("/chat/completions", response_model=CompletionResponse) async def create_chat_completion( request: CompletionRequest = Body(...), api_key: str = Depends(get_api_key) ): """ Create a chat completion compatible with OpenAI API format. This is a placeholder that will be implemented in future milestones. """ # This is a placeholder implementation # Actual implementation will be added in future milestones raise HTTPException( status_code=501, detail="Not implemented yet - will be available in future milestone" ) # Models listing endpoint (placeholder implementation) @router.get("/models") async def list_models( api_key: str = Depends(get_api_key) ): """ List available models compatible with OpenAI API format. This is a placeholder that will be implemented in future milestones. """ # This is a placeholder implementation # Actual implementation will be added in future milestones raise HTTPException( status_code=501, detail="Not implemented yet - will be available in future milestone" )

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/claymore666/MCPConnectionHub'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.