# MCP Server with OpenAI Integration

A production-ready Model Context Protocol server implemented in TypeScript. It connects to OpenAI's API for LLM calls, with support for custom base URLs (Azure/OpenAI proxies), configurable timeouts, and token usage metrics. The repository includes four runnable demos:
- **OpenAI connectivity demo** – prove the API key works end-to-end via `npm run demo:openai`.
- **MCP tool demo** – spawn the server and call tools through an MCP client using `npm run demo:tool`.
- **Extensibility demo** – hot-load third-party tools from disk via `npm run demo:ext` or `MCP_TOOL_MODULES`.
- **Browser UI demo** – launch an interactive web page that exercises the OpenAI call and knowledge-search tool with `npm run demo:ui`.
The codebase focuses on clean abstractions, schema validation, and commercial readiness (logging, config safety, tests).
## Requirements

- Node.js 18+ (Node 20 recommended to avoid optional engine warnings).
- npm 9+.
- A valid `OPENAI_API_KEY` with access to the desired models.
## Quick start

Install dependencies with `npm install`. To run the TypeScript entry directly during development, use `npx tsx src/index.ts`.
Environment variables
Variable | Description |
| Required. API key for OpenAI. |
| Override base URL for Azure/OpenAI proxies. |
| Timeout (ms) applied to OpenAI API calls. Defaults to
. |
| Name advertised to MCP clients. |
|
→
. Defaults to
. |
| Comma-separated absolute paths to extra tool modules (see extensibility demo). |
| Reserved for future transports; defaults to
. |
| Optional port for the browser UI demo. Defaults to
. |
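Environment loading is centralized in `src/config/env.ts`. A minimal sketch of the validation pattern, using only the variables documented above (the field names and error messages here are illustrative, not the repo's actual code):

```typescript
// Sketch of centralized, validated env loading (plain TypeScript, no schema library).
interface AppConfig {
  openaiApiKey: string;
  toolModules: string[]; // parsed from comma-separated MCP_TOOL_MODULES
  uiDemoPort: number;    // parsed from UI_DEMO_PORT, defaulting to 4399
}

function loadConfig(env: Record<string, string | undefined>): AppConfig {
  const openaiApiKey = env.OPENAI_API_KEY;
  if (!openaiApiKey) {
    throw new Error("OPENAI_API_KEY is required");
  }
  const toolModules = (env.MCP_TOOL_MODULES ?? "")
    .split(",")
    .map((p) => p.trim())
    .filter((p) => p.length > 0);
  const uiDemoPort = Number(env.UI_DEMO_PORT ?? 4399);
  if (!Number.isInteger(uiDemoPort) || uiDemoPort <= 0) {
    throw new Error("UI_DEMO_PORT must be a positive integer");
  }
  return { openaiApiKey, toolModules, uiDemoPort };
}
```

Validating once at startup like this means the rest of the codebase can consume a typed config object instead of reading `process.env` directly.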
## Demo workflows
### 1. OpenAI connectivity

Run `npm run demo:openai` to verify credentials and model access. The script outputs the model reply plus token usage metrics via Pino logs.
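The token numbers come from the standard `usage` object on an OpenAI chat-completion response. A minimal sketch of extracting them for logging, using a mocked response rather than the demo's actual code:

```typescript
// Shape of the `usage` block on an OpenAI chat completion response.
interface TokenUsage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}

// Pull usage metrics off a response for structured logging.
// Falls back to zeros when the API omits `usage` (e.g. some streamed responses).
function usageMetrics(response: { usage?: TokenUsage }): TokenUsage {
  return (
    response.usage ?? { prompt_tokens: 0, completion_tokens: 0, total_tokens: 0 }
  );
}

// Mocked response, roughly what the demo receives from the SDK:
const metrics = usageMetrics({
  usage: { prompt_tokens: 12, completion_tokens: 30, total_tokens: 42 },
});
// e.g. logger.info({ metrics }, "openai call complete");
```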
### 2. MCP tool invocation

Run `npm run demo:tool` to spawn the compiled MCP server (`node dist/index.js`) and connect with the official MCP client SDK.
Set `MCP_DEMO_SERVER_COMMAND` / `MCP_DEMO_SERVER_ARGS` if you want the client to launch a different command (for example `npx tsx src/index.ts`). The script lists tools and invokes `knowledge_search` end-to-end.
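The override can be sketched as a simple fallback (illustrative only; whitespace-splitting of the args override is an assumption, not necessarily how the demo parses it):

```typescript
// Resolve which command the demo client spawns, honoring the
// MCP_DEMO_SERVER_COMMAND / MCP_DEMO_SERVER_ARGS overrides and
// defaulting to the compiled server, `node dist/index.js`.
function resolveServerCommand(env: Record<string, string | undefined>): {
  command: string;
  args: string[];
} {
  const command = env.MCP_DEMO_SERVER_COMMAND ?? "node";
  // Assumed: override args are whitespace-separated, e.g. "tsx src/index.ts".
  const args = env.MCP_DEMO_SERVER_ARGS
    ? env.MCP_DEMO_SERVER_ARGS.split(/\s+/).filter(Boolean)
    : ["dist/index.js"];
  return { command, args };
}
```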
### 3. Extensibility via plugins

The repo ships with `src/examples/plugins/stockQuoteTool.ts`. After `npm run build`, the compiled module lives at `dist/examples/plugins/stockQuoteTool.js`.
Load it either through the demo script (`npm run demo:ext`) or by setting `MCP_TOOL_MODULES` to the module's absolute path before starting the server.
The server automatically registers every tool exported from the referenced module(s).
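The exact plugin contract lives in `src/mcp/registry.ts`; the sketch below shows one plausible shape for an exported tool in the spirit of `stockQuoteTool.ts`. The property names and the `stock_quote` tool itself are assumptions for illustration, not the registry's actual interface:

```typescript
// Hypothetical tool shape -- the real contract is defined by the registry.
interface ToolDefinition {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>; // JSON Schema for the tool's arguments
  handler: (args: Record<string, unknown>) => Promise<unknown>;
}

// A toy quote tool with canned data and no network access.
export const stockQuoteTool: ToolDefinition = {
  name: "stock_quote",
  description: "Return a canned quote for a ticker symbol.",
  inputSchema: {
    type: "object",
    properties: { symbol: { type: "string" } },
    required: ["symbol"],
  },
  handler: async (args) => {
    const symbol = String(args.symbol).toUpperCase();
    return { symbol, price: 123.45, currency: "USD" }; // placeholder values
  },
};
```

Because the server registers every exported tool, a single plugin module can bundle several related tools.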
### 4. Browser UI walkthrough

Run `npm run demo:ui` to launch a lightweight HTTP server that serves `public/ui-demo.html`.
Visit http://localhost:4399 (or the port set via `UI_DEMO_PORT`) to:

- Send prompts directly to OpenAI using the configured API key.
- Call the built-in `knowledge_search` tool through a REST façade.
Responses render inline so you can validate both flows without leaving the browser.
## Tooling

- TypeScript strict mode with `tsc` for builds.
- Vitest for unit testing (`npm test`).
- ESLint + Prettier for linting/formatting (`npm run lint`, `npm run format`).
- Pino structured logging with pretty printing in development.
## Test & quality gates

Coverage reports are emitted under `coverage/` via V8 instrumentation.
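The V8 coverage setup corresponds to a Vitest config along these lines (a sketch; the repo's actual `vitest.config.ts` may differ in reporters and thresholds):

```typescript
// vitest.config.ts -- enable V8-instrumented coverage under coverage/.
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    coverage: {
      provider: "v8",              // use V8's built-in instrumentation
      reportsDirectory: "coverage", // matches the coverage/ directory above
      reporter: ["text", "html"],
    },
  },
});
```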
## Project structure

- `src/config/env.ts` – centralized, validated environment loading.
- `src/clients/openaiClient.ts` – resilient OpenAI wrapper implementing the `LLMProvider` contract.
- `src/mcp/registry.ts` – tool lifecycle management + dynamic module loading.
- `src/mcp/server.ts` – MCP server wiring, tool adapters, and plugin APIs.
- `src/demos/*` – runnable scripts covering the three required scenarios.
- `src/examples/plugins/*` – sample plugin(s) for extensibility demos.
- `tests/*` – Vitest coverage for critical units.
For a deeper architectural overview, read `docs/architecture.md`.