# server_info

Retrieve server status and configuration details for the MCP AI Bridge, so users can monitor which OpenAI and Google Gemini connections are configured.

## Instructions

Get server status and configuration.

## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| *No arguments* | | | |
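Because the tool takes no arguments, invoking it over MCP only requires the tool name. A minimal sketch of the JSON-RPC 2.0 `tools/call` payload (the `id` value is arbitrary and chosen here for illustration):

```javascript
// Minimal JSON-RPC 2.0 request for calling the server_info tool.
// MCP's tools/call method takes the tool name plus an arguments object,
// which is empty here because server_info declares no parameters.
const request = {
  jsonrpc: '2.0',
  id: 1,
  method: 'tools/call',
  params: {
    name: 'server_info',
    arguments: {},
  },
};

// Serialized form, as it would be written to the transport.
const payload = JSON.stringify(request);
```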
## Implementation Reference
- **src/index.js:271-302 (handler)** — The `handleServerInfo` method implements the `server_info` tool, returning a formatted text response with the server name, version, AI client configurations (OpenAI/Gemini), rate limits, and security settings.

  ```javascript
  handleServerInfo() {
    const info = {
      name: process.env.MCP_SERVER_NAME || 'AI Bridge',
      version: process.env.MCP_SERVER_VERSION || '1.0.0',
      openai: {
        configured: !!this.openai,
        models: this.openai ? MODELS.OPENAI : [],
      },
      gemini: {
        configured: !!this.gemini,
        models: this.gemini ? MODELS.GEMINI : [],
      },
      rateLimits: {
        maxRequests: DEFAULTS.RATE_LIMIT.MAX_REQUESTS,
        windowMs: DEFAULTS.RATE_LIMIT.WINDOW_MS,
      },
      security: {
        inputValidation: true,
        rateLimiting: true,
        promptMaxLength: DEFAULTS.PROMPT.MAX_LENGTH,
      },
    };
    return {
      content: [
        {
          type: 'text',
          text: `🤖 AI BRIDGE SERVER INFO:\n\n${JSON.stringify(info, null, 2)}`,
        },
      ],
    };
  }
  ```
- **src/index.js:152-155 (schema)** — Input schema for the `server_info` tool: an empty object with no required parameters.

  ```javascript
  inputSchema: {
    type: 'object',
    properties: {},
  },
  ```
- **src/index.js:149-156 (registration)** — Registers the `server_info` tool in the tools list used for `ListToolsRequestSchema` responses.

  ```javascript
  tools.push({
    name: 'server_info',
    description: 'Get server status and configuration',
    inputSchema: {
      type: 'object',
      properties: {},
    },
  });
  ```
- **src/index.js:178-179 (registration)** — Switch case in the `CallToolRequestSchema` handler that dispatches `server_info` calls to the handler method.

  ```javascript
  case 'server_info':
    return this.handleServerInfo();
  ```
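The handler embeds the `info` object as pretty-printed JSON inside a single text content block, so a client can recover the structured data by stripping the banner line and parsing the remainder. A sketch of that round-trip, using illustrative values rather than a live server's configuration:

```javascript
// Illustrative info object in the shape handleServerInfo builds
// (values here are placeholders, not a real server's state).
const info = {
  name: 'AI Bridge',
  version: '1.0.0',
  openai: { configured: false, models: [] },
};

// Response shaped like the tool's return value: one text content block
// containing a banner line, a blank line, then the JSON body.
const response = {
  content: [
    {
      type: 'text',
      text: `🤖 AI BRIDGE SERVER INFO:\n\n${JSON.stringify(info, null, 2)}`,
    },
  ],
};

// Client side: drop the banner (everything before the first blank line)
// and parse the JSON body back into structured data.
const body = response.content[0].text.split('\n\n').slice(1).join('\n\n');
const parsed = JSON.parse(body);
```

This works because `JSON.stringify(info, null, 2)` never emits a blank line, so the first `\n\n` in the text reliably separates the banner from the JSON body.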