ask-openai

by thadius83

Ask direct questions to OpenAI assistant models through MCP integration and get concise or detailed responses in Claude Desktop workflows.

Instructions

Ask my assistant models a direct question

Input Schema

Name  | Required | Description   | Default
query | Yes      | Ask assistant | —
model | No       | —             | o3-mini
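
For example, a client invoking this tool might send arguments shaped like the following (a hypothetical payload, written here as a Python dict; the query text is illustrative):

    arguments = {
        "query": "Explain the difference between asyncio tasks and futures.",
        "model": "gpt-4o-mini",  # optional; the server falls back to "o3-mini"
    }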

Implementation Reference

  • Executes the core tool logic by calling the OpenAI chat completions API with predefined system prompts and the user query.

    async def ask_openai(self, query: str, model: str = "o3-mini") -> str:
        try:
            messages = [
                {
                    "role": "developer",
                    "content": "You are a helpful assistant that provides clear and accurate technical responses."
                },
                {
                    "role": "system",
                    "content": "Ensure responses are well-structured and technically precise."
                },
                {"role": "user", "content": query}
            ]
            response = await self.client.chat.completions.create(
                messages=messages,
                model=model
            )
            return response.choices[0].message.content
        except Exception as e:
            logger.error(f"Failed to query OpenAI: {str(e)}")
            raise
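
As a minimal sketch of exercising this method on its own (assuming the openai package is installed and OPENAI_API_KEY is set; the LLMConnector class is shown at the end of this list, and this standalone runner is illustrative, not part of the project):

    import asyncio
    import os

    async def demo():
        # Hypothetical standalone run; the MCP server normally drives this call.
        connector = LLMConnector(openai_api_key=os.environ["OPENAI_API_KEY"])
        answer = await connector.ask_openai("Summarize PEP 8 in two sentences.", model="gpt-4o-mini")
        print(answer)

    asyncio.run(demo())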
  • Input schema definition for the 'ask-openai' tool, specifying the required 'query' and optional 'model'.

    inputSchema={
        "type": "object",
        "properties": {
            "query": {"type": "string", "description": "Ask assistant"},
            "model": {"type": "string", "default": "o3-mini", "enum": ["o3-mini", "gpt-4o-mini"]}
        },
        "required": ["query"]
    }
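
To illustrate what this schema accepts and rejects, a sketch using the third-party jsonschema package (an assumption for demonstration; the server itself does not perform this validation, it relies on the MCP client):

    from jsonschema import validate, ValidationError

    schema = {
        "type": "object",
        "properties": {
            "query": {"type": "string", "description": "Ask assistant"},
            "model": {"type": "string", "default": "o3-mini", "enum": ["o3-mini", "gpt-4o-mini"]},
        },
        "required": ["query"],
    }

    validate({"query": "What is MCP?"}, schema)  # passes; "model" is optional
    try:
        validate({"model": "o3-mini"}, schema)   # fails: required "query" is missing
    except ValidationError as e:
        print(e.message)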
  • Registers the 'ask-openai' tool with the MCP server, including its name, description, and schema.

    @server.list_tools()
    async def handle_list_tools() -> list[types.Tool]:
        return [
            types.Tool(
                name="ask-openai",
                description="Ask my assistant models a direct question",
                inputSchema={
                    "type": "object",
                    "properties": {
                        "query": {"type": "string", "description": "Ask assistant"},
                        "model": {"type": "string", "default": "o3-mini", "enum": ["o3-mini", "gpt-4o-mini"]}
                    },
                    "required": ["query"]
                }
            )
        ]
  • MCP server tool-call handler that dispatches 'ask-openai' calls to the LLMConnector and returns the response as TextContent.

    @server.call_tool()
    async def handle_tool_call(name: str, arguments: dict | None) -> list[types.TextContent]:
        try:
            if not arguments:
                raise ValueError("No arguments provided")
            if name == "ask-openai":
                response = await connector.ask_openai(
                    query=arguments["query"],
                    model=arguments.get("model", "o3-mini")
                )
                return [types.TextContent(type="text", text=response)]
            raise ValueError(f"Unknown tool: {name}")
        except Exception as e:
            logger.error(f"Tool call failed: {str(e)}")
            return [types.TextContent(type="text", text=f"Error: {str(e)}")]
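
From the client side, a call to this handler looks roughly like the following sketch, assuming the official mcp Python SDK over stdio; the launch command and module name are placeholders, not taken from the repository:

    import asyncio
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def demo():
        # Placeholder command; adjust to however this server is installed.
        params = StdioServerParameters(command="python", args=["-m", "mcp_server_openai"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                result = await session.call_tool("ask-openai", {"query": "What is MCP?"})
                print(result.content[0].text)

    asyncio.run(demo())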
  • Helper class that initializes the AsyncOpenAI client used by the tool handler.

    class LLMConnector:
        def __init__(self, openai_api_key: str):
            self.client = AsyncOpenAI(api_key=openai_api_key)
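
Wiring these pieces together, a hedged sketch of how such a server is typically bootstrapped over stdio with the mcp SDK; the server name and environment-variable handling are assumptions, not taken from the repository:

    import asyncio
    import logging
    import os

    import mcp.types as types
    from mcp.server import Server
    from mcp.server.stdio import stdio_server
    from openai import AsyncOpenAI

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger(__name__)

    server = Server("mcp-server-openai")  # assumed server name
    connector = LLMConnector(openai_api_key=os.environ["OPENAI_API_KEY"])

    # handle_list_tools and handle_tool_call from the snippets above are
    # registered against this `server` instance.

    async def run():
        async with stdio_server() as (read_stream, write_stream):
            await server.run(read_stream, write_stream, server.create_initialization_options())

    if __name__ == "__main__":
        asyncio.run(run())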

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/thadius83/mcp-server-openai'
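
The same lookup from Python, as a sketch using the requests library (the response is JSON; its exact fields are not documented here):

    import requests

    resp = requests.get("https://glama.ai/api/mcp/v1/servers/thadius83/mcp-server-openai")
    resp.raise_for_status()
    print(resp.json())  # server metadata record; field names depend on the API version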

If you have feedback or need assistance with the MCP directory API, please join our Discord server.