OpenAI MCP Server

by bhjo0930

Advanced OpenAI GPT-5 MCP Server with multiple tools, intelligent reasoning, and comprehensive AI utilities for Claude Code integration.

✨ Features

🎯 Multiple AI Tools

  • 5 Specialized Tools: From basic GPT calls to advanced batch processing

  • Modular Architecture: Each tool is independently developed and maintained

  • Extensible Framework: Easy to add new tools and capabilities

🧠 Advanced AI Capabilities

  • GPT-5 by Default: Latest reasoning model with intelligent fallback to GPT-4o

  • Advanced Reasoning: Support for reasoning_effort and verbosity parameters

  • Hybrid Intelligence: GPT-5 reasoning + GPT-4o content generation

  • Task-Specific Optimization: Specialized system prompts for different domains

🚀 Professional Features

  • Token Analysis & Optimization: Analyze and optimize text for token efficiency

  • Context Window Management: Smart context optimization with multiple strategies

  • Batch Processing: Process multiple prompts in parallel with concurrency control

  • Model Management: List and compare available OpenAI models

🔧 Technical Excellence

  • stdio Transport: No ports needed, simple integration

  • Claude Context Integration: Leverages Claude session context

  • NPX Ready: Install and run with npx

  • TypeScript: Full type safety and modern JavaScript features

  • Error Handling: Comprehensive error handling and fallback mechanisms

Quick Start

Installation

# Global install
npm install -g openai-mcp-server

# Or use npx (no installation needed)
npx openai-mcp-server

Setup

  1. Get your OpenAI API key from https://platform.openai.com/api-keys

  2. Set the environment variable:

export OPENAI_API_KEY="your-api-key-here"

  3. Add to Claude Code:

claude mcp add --transport stdio openai-gpt5 \
  "OPENAI_API_KEY=your-key-here npx openai-mcp-server"
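
You can confirm that the server was registered with Claude Code's MCP listing command (the exact output varies by version):

claude mcp list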

Available Tools

The server provides 5 specialized tools:

1. call_gpt5 - Enhanced GPT Model Calling

Call OpenAI GPT models with optimized system prompts and advanced reasoning.

Key Parameters:

  • prompt (string, required): Your question or request

  • taskType (enum, required): analysis, generation, reasoning, coding

  • domain (string, optional): Specific domain like "security", "performance", "architecture"

  • reasoningEffort (enum, optional): GPT-5 reasoning depth - "minimal", "low", "medium", "high"

  • verbosity (enum, optional): GPT-5 response detail level - "low", "medium", "high"

  • model (string, optional): Override model ("gpt-5", "gpt-4o", "gpt-4")
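
For example, a hypothetical coding-focused call that overrides the model and keeps the response terse (same claude mcp call pattern as in the Usage Examples below; the prompt is illustrative):

claude mcp call openai-gpt5 call_gpt5 '{
  "prompt": "Refactor this function to remove the N+1 query pattern",
  "taskType": "coding",
  "domain": "performance",
  "reasoningEffort": "medium",
  "verbosity": "low",
  "model": "gpt-5"
}'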

2. list_models - Model Information

List available OpenAI models with capabilities and metadata.

Parameters:

  • includeDetails (boolean, optional): Include detailed model information
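
For example (the exact response format depends on the server version):

claude mcp call openai-gpt5 list_models '{"includeDetails": true}'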

3. analyze_token_usage - Token Optimization

Analyze text for token usage and get optimization suggestions.

Parameters:

  • text (string, required): Text to analyze

  • model (string, optional): Model for token counting

  • includeOptimization (boolean, optional): Include optimization suggestions

Features:

  • Token count estimation

  • Text composition analysis

  • Cost calculation

  • Optimization recommendations
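
The tokenizer and pricing tables the tool actually uses are not documented here, but the kind of estimate it produces can be sketched roughly in TypeScript (the chars-per-token heuristic and the price constant below are assumptions, not the server's real values):

// Illustrative sketch only: assumed 4-characters-per-token heuristic and price.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4); // rough heuristic for English text
}

function estimateCostUsd(tokens: number, pricePerMillionTokens = 2.5): number {
  return (tokens / 1_000_000) * pricePerMillionTokens; // assumed input-token price
}

const sample = "Analyze this code for security vulnerabilities ...";
const tokens = estimateTokens(sample);
console.log({ tokens, estimatedCostUsd: estimateCostUsd(tokens) });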

4. optimize_context_window - Context Management

Optimize long context for efficient token usage while preserving important information.

Parameters:

  • context (string, required): Context text to optimize

  • maxTokens (number, required): Maximum tokens for optimized context

  • preservationStrategy (enum, optional): Strategy for preserving context

    • important_first: Preserve sentences with keywords and importance indicators

    • recent_first: Preserve recent content with keyword protection

    • semantic: Preserve semantically similar content

    • balanced: Balance importance and recency (default)

  • preserveKeywords (array, optional): Keywords to preserve

Use Cases:

  • Large document summarization

  • Chat history optimization

  • Context window management
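
As a rough illustration of how a strategy such as balanced might work, the sketch below scores sentences by keyword hits and recency, then keeps the highest-scoring ones within the token budget. It is an assumption about the general approach, not the server's actual implementation:

// Hypothetical "balanced" strategy: weight keyword importance and recency,
// then keep the best sentences that fit within the token budget.
function optimizeContext(
  context: string,
  maxTokens: number,
  preserveKeywords: string[] = []
): string {
  const sentences = context.split(/(?<=[.!?])\s+/);
  const estimateTokens = (s: string) => Math.ceil(s.length / 4); // rough heuristic

  const scored = sentences.map((sentence, index) => {
    const keywordHits = preserveKeywords.filter((k) =>
      sentence.toLowerCase().includes(k.toLowerCase())
    ).length;
    const recency = index / Math.max(sentences.length - 1, 1); // later = more recent
    return { sentence, index, score: keywordHits * 2 + recency };
  });

  // Greedily keep the highest-scoring sentences within budget, then restore order.
  const kept: { sentence: string; index: number }[] = [];
  let used = 0;
  for (const item of [...scored].sort((a, b) => b.score - a.score)) {
    const cost = estimateTokens(item.sentence);
    if (used + cost <= maxTokens) {
      kept.push(item);
      used += cost;
    }
  }
  return kept
    .sort((a, b) => a.index - b.index)
    .map((i) => i.sentence)
    .join(" ");
}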

5. process_batch_prompts - Batch Processing

Process multiple prompts efficiently with parallel execution support.

Parameters:

  • prompts (array, required): Array of prompts to process

  • taskType (enum, optional): Task type for system prompt optimization

  • parallel (boolean, optional): Process prompts in parallel (default: true)

  • maxConcurrency (number, optional): Maximum concurrent requests (1-10, default: 5)

  • model (string, optional): Model to use for all prompts

Features:

  • Parallel processing with concurrency control

  • Automatic retry and error handling

  • Performance metrics and cost estimation

  • Progress tracking
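
The parallel execution with concurrency control can be pictured with a small worker-pool sketch; callModel is a placeholder for the actual OpenAI request, and the pattern, not the exact code, is what this illustrates:

// Hypothetical worker pool: up to maxConcurrency prompts in flight at once,
// with per-prompt error capture so one failure does not abort the batch.
async function processBatch(
  prompts: string[],
  callModel: (prompt: string) => Promise<string>,
  maxConcurrency = 5
): Promise<string[]> {
  const results: string[] = new Array(prompts.length);
  let next = 0;

  async function worker(): Promise<void> {
    while (next < prompts.length) {
      const index = next++; // single-threaded event loop: no race here
      try {
        results[index] = await callModel(prompts[index]);
      } catch (err) {
        results[index] = `Error: ${String(err)}`;
      }
    }
  }

  const workers = Array.from(
    { length: Math.min(maxConcurrency, prompts.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results;
}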

Usage Examples

# Basic GPT-5 call with reasoning
claude mcp call openai-gpt5 call_gpt5 '{
  "prompt": "Analyze this code for security vulnerabilities",
  "taskType": "analysis",
  "domain": "security",
  "reasoningEffort": "high"
}'

# Token analysis
claude mcp call openai-gpt5 analyze_token_usage '{
  "text": "Your text here...",
  "includeOptimization": true
}'

# Batch processing
claude mcp call openai-gpt5 process_batch_prompts '{
  "prompts": ["Question 1", "Question 2", "Question 3"],
  "parallel": true,
  "maxConcurrency": 3
}'

# Context optimization
claude mcp call openai-gpt5 optimize_context_window '{
  "context": "Very long text...",
  "maxTokens": 1000,
  "preservationStrategy": "important_first",
  "preserveKeywords": ["key", "important"]
}'

Environment Variables

Create a .env file or set environment variables:

# Required
OPENAI_API_KEY=your_openai_api_key_here

# Optional
OPENAI_MODEL=gpt-5                          # Default model (gpt-5, gpt-4o, gpt-4)
OPENAI_BASE_URL=https://api.openai.com/v1   # Custom API endpoint
DEBUG=false                                 # Enable debug logging

Development

# Clone and install
git clone <repository>
cd openai-mcp-server
npm install

# Build
npm run build

# Development with auto-reload
npm run dev

# Test
npm test

Integration with Claude Code

Once configured, Claude can automatically use this comprehensive MCP server to:

🎯 Core AI Capabilities

  1. Enhanced Analysis: Deep code analysis with GPT-5 reasoning capabilities

  2. Alternative Perspectives: Get different AI viewpoints on complex problems

  3. Creative Problem Solving: Leverage GPT's creativity for brainstorming and innovation

  4. Specialized Domain Expertise: Task-specific optimized prompts for security, performance, architecture

🚀 Advanced Features

  1. Token Optimization: Analyze and optimize prompts for cost-effectiveness

  2. Context Management: Handle large documents and conversations efficiently

  3. Batch Operations: Process multiple requests simultaneously for productivity

  4. Model Selection: Choose optimal models based on task requirements

💡 Smart Integration

Claude decides intelligently which tools to use, and when, based on:

  • Task complexity and type

  • Content length and optimization needs

  • Batch processing opportunities

  • Resource and cost considerations

🔧 Professional Workflows

  • Development: Code analysis, review, and optimization

  • Content: Large document processing and summarization

  • Research: Multi-query analysis and comparison

  • Optimization: Token usage and cost management

License

MIT


