by code-wgl
# MCP Server with OpenAI Integration

A production-ready [Model Context Protocol](https://modelcontextprotocol.io/) server implemented in TypeScript. The server provides:

1. **OpenAI connectivity demo** – prove the API key works end-to-end via `npm run demo:openai`.
2. **MCP tool demo** – spawn the server and call tools through an MCP client using `npm run demo:tool`.
3. **Extensibility demo** – hot-load third-party tools from disk via `npm run demo:ext` or `MCP_TOOL_MODULES`.
4. **Browser UI demo** – launch an interactive web page that exercises the OpenAI call and knowledge-search tool with `npm run demo:ui`.

The codebase focuses on clean abstractions, schema validation, and commercial readiness (logging, config safety, tests).

## Requirements

- Node.js 18+ (Node 20 recommended to avoid optional engine warnings).
- npm 9+.
- A valid `OPENAI_API_KEY` with access to the desired models.

## Quick start

```bash
npm install
cp .env.example .env  # fill in OPENAI_API_KEY
npm run build
npm start             # runs the compiled MCP server on stdio
```

To run the TypeScript entry directly during development:

```bash
npm run dev
```

## Environment variables

| Variable | Description |
| --- | --- |
| `OPENAI_API_KEY` | **Required.** API key for OpenAI. |
| `OPENAI_BASE_URL` | Override base URL for Azure/OpenAI proxies. |
| `OPENAI_TIMEOUT_MS` | Timeout (ms) applied to OpenAI API calls. Defaults to `20000`. |
| `MCP_SERVER_NAME` | Name advertised to MCP clients. |
| `LOG_LEVEL` | `fatal` → `trace`. Defaults to `info`. |
| `MCP_TOOL_MODULES` | Comma-separated absolute paths to extra tool modules (see extensibility demo). |
| `MCP_PORT` | Reserved for future transports; defaults to `7337`. |
| `UI_DEMO_PORT` | Optional port for the browser UI demo. Defaults to `4399`. |

## Demo workflows

### 1. OpenAI connectivity

Verifies credentials and model access:

```bash
npm run demo:openai
```

Outputs the model reply plus token usage metrics via Pino logs.

### 2. MCP tool invocation

Spawns the compiled MCP server (`node dist/index.js`) and connects with the official MCP client SDK:

```bash
npm run build
npm run demo:tool
```

Set `MCP_DEMO_SERVER_COMMAND` / `MCP_DEMO_SERVER_ARGS` if you want the client to launch a different command (for example `npx tsx src/index.ts`). The script lists tools and invokes `knowledge_search` end-to-end.

### 3. Extensibility via plugins

Ships with `src/examples/plugins/stockQuoteTool.ts`. After `npm run build` the compiled module lives at `dist/examples/plugins/stockQuoteTool.js`. Load it either through the demo script:

```bash
npm run build
npm run demo:ext
```

or by setting an environment variable before starting the server:

```bash
export MCP_TOOL_MODULES=$(pwd)/dist/examples/plugins/stockQuoteTool.js
npm start
```

The server automatically registers every tool exported from the referenced module(s).

### 4. Browser UI walkthrough

Launch a lightweight HTTP server that serves `public/ui-demo.html`:

```bash
npm run demo:ui
```

Visit `http://localhost:4399` (or `UI_DEMO_PORT`) to:

- Send prompts directly to OpenAI using the configured API key.
- Call the built-in `knowledge_search` tool through a REST façade.

Responses render inline so you can validate both flows without leaving the browser.

## Tooling

- **TypeScript** strict mode with `tsc` for builds.
- **Vitest** for unit testing (`npm test`).
- **ESLint + Prettier** for linting/formatting (`npm run lint`, `npm run format`).
- **Pino** structured logging with pretty printing in development.

## Test & quality gates

```bash
npm run lint
npm test
```

Coverage reports are emitted under `coverage/` via V8 instrumentation.

## Project structure

- `src/config/env.ts` – centralized, validated environment loading.
- `src/clients/openaiClient.ts` – resilient OpenAI wrapper implementing the `LLMProvider` contract.
- `src/mcp/registry.ts` – tool lifecycle management + dynamic module loading.
- `src/mcp/server.ts` – MCP server wiring, tool adapters, and plugin APIs.
- `src/demos/*` – runnable scripts covering the demo scenarios above.
- `src/examples/plugins/*` – sample plugin(s) for extensibility demos.
- `tests/*` – Vitest coverage for critical units.

For a deeper architectural overview, read `docs/architecture.md`.
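## Appendix: sample `.env`

For reference, a filled-in `.env` based on the environment-variable table might look like the following. All values are placeholders, and only `OPENAI_API_KEY` is required:

```bash
# Required: your OpenAI API key (placeholder value shown)
OPENAI_API_KEY=sk-your-key-here

# Optional overrides; defaults match the environment-variable table
OPENAI_TIMEOUT_MS=20000
LOG_LEVEL=info
UI_DEMO_PORT=4399
```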
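## Appendix: plugin module sketch

A third-party tool module loaded via `MCP_TOOL_MODULES` could look roughly like the sketch below. This is a hypothetical shape, not the actual contract: the real exported interface is defined by `src/mcp/registry.ts`, and the field names (`name`, `description`, `inputSchema`, `handler`) are assumptions for illustration.

```typescript
// Hypothetical tool contract — the real one lives in src/mcp/registry.ts.
export interface ToolDefinition {
  name: string;
  description: string;
  // JSON Schema describing the tool's arguments (the project uses schema validation).
  inputSchema: Record<string, unknown>;
  // Receives validated arguments and returns the tool's text result.
  handler: (args: Record<string, unknown>) => Promise<string>;
}

// A minimal echo tool; the server would register every tool the module exports.
export const echoTool: ToolDefinition = {
  name: "echo",
  description: "Returns the input text unchanged.",
  inputSchema: {
    type: "object",
    properties: { text: { type: "string" } },
    required: ["text"],
  },
  handler: async (args) => String(args.text),
};
```

Compile the module with `npm run build` (or point at any compiled `.js` file) and reference its absolute path in `MCP_TOOL_MODULES`.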