MCP Server (Model Context Protocol)
A generic, modular server for implementing the Model Context Protocol (MCP). This server provides a framework for controlling and interacting with various models through a standardized API.
Features
- Modular architecture for easy extension
- Dynamic module loading
- Core model management functionality
- Standardized API for model context
- Simple configuration system
- Logging utilities
- Enhanced module structure with proper separation of concerns
- Package.json support for modules with dependency management
- Comprehensive testing infrastructure with Mocha and Chai
- Powerful module search functionality
- Module metadata display in API responses
- Integration with real AI model providers (OpenAI, Stability AI, Anthropic, Hugging Face)
- Support for text generation, image generation, and speech-to-text models
- Streaming inference support for compatible models
Getting Started
Prerequisites
- Node.js 18.x or higher
- pnpm 10.x or higher
This project uses ES Modules (ESM) exclusively. All imports use the `import` syntax rather than `require()`.
Installation
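With the repository cloned locally, install dependencies using pnpm:

```shell
pnpm install
```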
Running the Server
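Assuming the package defines a conventional `start` script, the server can be launched with:

```shell
pnpm start
```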
The server will start on http://localhost:3000 by default.
Configuration
Copy the sample environment file and edit it with your API keys:
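For example:

```shell
cp sample.env .env
```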
At minimum, you'll need to add API keys for the model providers you want to use:
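For example, in `.env` (placeholder values):

```
OPENAI_API_KEY=your-openai-key
STABILITY_API_KEY=your-stability-key
ANTHROPIC_API_KEY=your-anthropic-key
HUGGINGFACE_API_KEY=your-huggingface-key
```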
You can get these API keys from:
- OpenAI: https://platform.openai.com/api-keys
- Stability AI: https://platform.stability.ai/account/keys
- Anthropic: https://console.anthropic.com/settings/keys
Testing the Server
The repository includes comprehensive testing using Mocha and Chai:
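Assuming the conventional `test` script, run the suite with:

```shell
pnpm test
```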
The testing infrastructure includes:
- Core server tests for module loading, routing, and other core functionality
- Module-specific tests for each module's functionality
- Support for ES modules in tests
- Mocking and stubbing utilities with Sinon
Tests are organized in a structured way:
- Core tests in `/test/core/`
- Module tests in each module's `test/` directory
This comprehensive testing ensures code quality and makes it easier to detect regressions when making changes.
Pre-commit Hooks
The repository includes pre-commit hooks using Husky and lint-staged:
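A typical `lint-staged` configuration for this setup might look like the following; the exact globs are illustrative, not copied from the repository:

```json
{
  "lint-staged": {
    "*.js": ["eslint --fix", "prettier --write"],
    "*.{json,md}": ["prettier --write"]
  }
}
```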
The pre-commit hooks:
- Run ESLint on JavaScript files
- Run Prettier on all staged files
This ensures that all code committed to the repository follows coding standards and maintains code quality. The test suite is continuously being improved to provide better coverage and reliability, and will be enabled in the pre-commit hook once it's more stable.
Docker Support
The repository includes Docker support for easy containerization and deployment:
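A sketch of building and running the container; the image name and the in-container modules path (`/app/mcp_modules`) are assumptions, so adjust them to match the repository's Dockerfile:

```shell
# Build the image (the tag is illustrative)
docker build -t mcp-server .

# Run it, publishing port 3000 and mounting the modules directory as a volume
docker run -p 3000:3000 \
  -v "$(pwd)/mcp_modules:/app/mcp_modules" \
  --env-file .env \
  mcp-server
```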
The Docker configuration:
- Uses Node.js 20 Alpine as the base image
- Exposes port 3000
- Mounts the modules directory as a volume for easy module management
- Includes health checks
Standard MCP Methods
The MCP server implements a standardized set of methods that all MCP servers should provide:
Server Information
- `GET /` - Basic server information
- `GET /status` - Detailed server status
- `GET /health` - Health check endpoint
- `GET /metrics` - Server metrics
Model Management
- `GET /models` - List available models
- `GET /model/:modelId` - Get model information
- `POST /model/:modelId/activate` - Activate a specific model
- `POST /model/deactivate` - Deactivate the current model
- `GET /model/active` - Get information about the active model
Inference
- `POST /model/infer` - Perform inference with the active model
- `POST /model/:modelId/infer` - Perform inference with a specific model
Supported Models
The MCP server supports the following model types:
| Model Type | Provider | Capabilities | Example IDs |
| --- | --- | --- | --- |
| GPT Models | OpenAI | Text generation | gpt-4, gpt-3.5-turbo |
| Whisper | OpenAI | Speech-to-text | whisper, whisper-1 |
| Stable Diffusion | Stability AI | Image generation | stable-diffusion-xl-1024-v1-0 |
| Claude Models | Anthropic | Text generation | claude-3-opus, claude-3-sonnet |
| Custom Models | Hugging Face | Various | (any Hugging Face model ID) |
Inference Examples
Text generation with GPT-4:
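A minimal sketch using Node 18's built-in `fetch`; the request body shape (`input` plus a `parameters` object) is an assumed example, not the server's documented schema:

```javascript
const BASE_URL = 'http://localhost:3000';

// Build the JSON body for a text-generation request (assumed shape).
function buildTextRequest(prompt, parameters = {}) {
  return { input: prompt, parameters };
}

// POST the request to a specific model's inference endpoint.
async function generateText(prompt) {
  const res = await fetch(`${BASE_URL}/model/gpt-4/infer`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildTextRequest(prompt, { max_tokens: 100 })),
  });
  return res.json();
}
```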
Image generation with Stable Diffusion:
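A sketch along the same lines; the body shape and the image parameters are assumptions:

```javascript
const BASE_URL = 'http://localhost:3000';

// Build an image-generation request (assumed shape and parameters).
function buildImageRequest(prompt, width = 1024, height = 1024) {
  return { input: prompt, parameters: { width, height } };
}

async function generateImage(prompt) {
  const res = await fetch(
    `${BASE_URL}/model/stable-diffusion-xl-1024-v1-0/infer`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(buildImageRequest(prompt)),
    }
  );
  return res.json();
}
```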
Streaming text generation:
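A streaming sketch; the `stream` flag and the response arriving as a readable body are assumptions about the server's behavior:

```javascript
const BASE_URL = 'http://localhost:3000';

// Request body with an assumed stream flag.
function buildStreamingRequest(prompt) {
  return { input: prompt, parameters: { stream: true } };
}

// Read the response body incrementally (Node 18+: fetch bodies are
// async-iterable web streams).
async function streamText(prompt, onChunk) {
  const res = await fetch(`${BASE_URL}/model/gpt-4/infer`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildStreamingRequest(prompt)),
  });
  const decoder = new TextDecoder();
  for await (const chunk of res.body) {
    onChunk(decoder.decode(chunk, { stream: true }));
  }
}
```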
Module Management
- `GET /modules` - List installed modules
- `GET /modules/:moduleId` - Get module information
- `GET /modules/search/:query` - Search modules by any field in their package.json or metadata
Tools and Resources
- `GET /tools` - List available tools
- `GET /resources` - List available resources
For detailed information about these methods, see MCP Standard Methods.
Configuration
Configuration is loaded from environment variables and stored in `src/core/config.js`. The easiest way to configure the server is to edit the `.env` file in the project root.
Environment Variables
Key environment variables include:
| Variable | Description | Default |
| --- | --- | --- |
| PORT | Server port | 3000 |
| HOST | Server host | localhost |
| NODE_ENV | Environment (development/production) | development |
| OPENAI_API_KEY | OpenAI API key | (required for OpenAI models) |
| STABILITY_API_KEY | Stability AI API key | (required for Stable Diffusion) |
| ANTHROPIC_API_KEY | Anthropic API key | (required for Claude models) |
| HUGGINGFACE_API_KEY | Hugging Face API key | (required for Hugging Face models) |
See `sample.env` for a complete list of configuration options.
Examples
The repository includes several examples to help you get started:
- Client Example: `examples/client.js` demonstrates how to interact with the MCP server from a client application.
- Custom Module Example: `examples/custom-module/` shows how to create a custom module that adds a calculator tool to the server.
To run the client example:
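Using Node directly from the project root:

```shell
node examples/client.js
```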
To use the custom module example, copy it to the modules directory:
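For example:

```shell
cp -r examples/custom-module mcp_modules/custom-module
```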
Creating Modules
Modules are the primary way to extend the MCP server. Each module is a self-contained package that can add new functionality to the server.
Module Structure
Modules now follow an enhanced structure with better organization:
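An assumed layout, consistent with the paths mentioned elsewhere in this README (the `src/` directory is illustrative):

```
mcp_modules/
  my-module/
    package.json    # module metadata and dependencies
    index.js        # entry point exporting register()
    src/            # implementation (illustrative)
    test/           # module tests
```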
Each module should include a `package.json` file with:
- Name, version, description
- Author and license information
- Dependencies and dev dependencies
- Scripts (especially for testing)
- Keywords and other metadata
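A sketch of such a `package.json`; all values are illustrative:

```json
{
  "name": "my-module",
  "version": "1.0.0",
  "description": "An example MCP server module",
  "author": "Your Name",
  "license": "ISC",
  "keywords": ["mcp", "module"],
  "scripts": {
    "test": "mocha test/"
  },
  "dependencies": {},
  "devDependencies": {}
}
```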
This structure provides better separation of concerns, makes testing easier, and improves module discoverability.
Module Implementation
The main module file (`index.js`) must export a `register` function that will be called when the module is loaded:
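A minimal sketch; the exact `register(app)` signature and the Hono-style context API (`c.json`) are assumptions based on this README's description, not the server's documented contract:

```javascript
// index.js - the server is expected to call register(app) with its app
// instance when the module loads (assumed signature).
export function register(app) {
  // Add a route contributed by this module (Hono-style handler assumed).
  app.get('/example/hello', c => c.json({ message: 'Hello from the example module' }));
}
```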
Example Modules
- A simple example module is provided in `mcp_modules/example/` to demonstrate how to create a module.
- A more complex example with a calculator tool is provided in `examples/custom-module/`.
- A health check module is provided in `mcp_modules/health-check/` for system monitoring.
- A template for creating new modules is available in `mcp_modules/template/`.
Creating New Modules
You can create a new module using the provided script:
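The invocation might look like the following; the script name is hypothetical, so check `package.json` for the actual script provided by the repository:

```shell
pnpm run create-module my-module
```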
The script will:
- Create a new module directory in `mcp_modules/`
- Copy the template files
- Replace placeholders with your module information
- Provide next steps for implementing your module
Module Search
The MCP server includes a powerful search functionality that allows you to find modules based on any information in their package.json or metadata.
Search Endpoints
- `GET /modules/search/:query` - Search for modules containing the specified query string in any field
Search Examples
JavaScript Example
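A sketch of calling the search endpoint from Node 18+; the response shape is not assumed here, only the URL pattern documented above:

```javascript
const BASE_URL = 'http://localhost:3000';

// Build the search URL, escaping the query for use in a path segment.
function searchUrl(query) {
  return `${BASE_URL}/modules/search/${encodeURIComponent(query)}`;
}

async function searchModules(query) {
  const res = await fetch(searchUrl(query));
  return res.json();
}
```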
The search is comprehensive and will find matches in any field, including nested objects like dependencies, keywords, and other metadata.
Model Providers
The MCP server integrates with several AI model providers:
OpenAI
OpenAI provides GPT models for text generation and Whisper for speech-to-text. Set `OPENAI_API_KEY` in your `.env` file to use these models.
Stability AI
Stability AI provides Stable Diffusion for image generation. Set `STABILITY_API_KEY` to use it.
Anthropic
Anthropic provides Claude models for text generation. Set `ANTHROPIC_API_KEY` to use them.
Hugging Face
Hugging Face provides access to thousands of open-source models. Set `HUGGINGFACE_API_KEY` to use them.
Documentation
- MCP Standard Methods: Documentation of the standard methods that all MCP servers should implement.
- MCP Interface: TypeScript interface definitions for the MCP protocol.
- Architecture: Overview of the MCP server architecture.
License
ISC