Provides containerized deployment of the TianGong AI MCP Server, allowing for isolated and portable execution of the server with Docker.
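A minimal containerized run might look like the sketch below, assuming the repository ships a Dockerfile and the settings live in a local .env file; the image tag tiangong-ai-mcp is an illustrative placeholder, not a published image name.

```bash
# Build a local image from the repository root (tag is illustrative)
docker build -t tiangong-ai-mcp .

# Run the server with secrets injected from .env; -i keeps STDIN open
# so an MCP client can use the STDIO transport against the container
docker run -i --rm --env-file .env tiangong-ai-mcp
```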
Enables configuration of the TianGong AI MCP Server through environment variables stored in .env files, providing a secure way to manage API keys and other settings.
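A .env file for this purpose could resemble the following sketch; the variable names are placeholders, so check the project's documentation for the keys the server actually reads, and keep the file out of version control.

```bash
# .env (illustrative keys and dummy values only; never commit real credentials)
OPENAI_API_KEY=sk-...
TIANGONG_API_KEY=...
PORT=3000
```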
Supports execution of the TianGong AI MCP Server in Node.js environments, leveraging the JavaScript runtime for cross-platform compatibility.
Facilitates installation and dependency management for the TianGong AI MCP Server through the npm package registry.
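A local Node.js setup could follow the usual npm workflow sketched below; the build and start script names are assumptions and may differ in the project's package.json.

```bash
# Install dependencies and compile the TypeScript sources
npm install
npm run build

# Launch the server (many MCP servers default to the STDIO transport)
npm start
```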
Enables version management for Node.js when running the TianGong AI MCP Server, ensuring compatibility with specific Node.js versions.
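With nvm, the Node.js release can be pinned per project as sketched here; the version number is an example rather than the project's stated requirement.

```bash
# Install and activate a specific Node.js release for the current shell
nvm install 22
nvm use 22

# Optionally record the version in .nvmrc so `nvm use` picks it up automatically
echo "22" > .nvmrc
nvm use
```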
Hybrid server: able to function both locally and remotely, depending on the configuration or use case.
A Model Context Protocol (MCP) server that supports the STDIO, SSE, and Streamable HTTP transports for AI model interactions.
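For the HTTP-based transports, an MCP client opens a session with a JSON-RPC initialize request. The curl sketch below probes a Streamable HTTP endpoint, assuming the server listens on port 3000 at a /mcp path; both values are placeholders, while the request body follows the MCP specification.

```bash
curl -s -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
          "protocolVersion": "2025-03-26",
          "capabilities": {},
          "clientInfo": { "name": "curl-probe", "version": "0.1.0" }
        }
      }'
```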
Related MCP Servers
- A Model Context Protocol server implementation that enables connection between OpenAI APIs and MCP clients for coding assistance, with features like CLI interaction, web API integration, and a tool-based architecture. (Python)
- A high-performance Model Context Protocol (MCP) server designed for large language models, enabling real-time communication between AI models and applications, with support for session management and intelligent tool registration. (Python, MIT License)
- A server that implements the Model Context Protocol, providing a standardized way to connect AI models to different data sources and tools. (TypeScript, MIT License)
- An all-in-one Model Context Protocol (MCP) server that connects your coding AI to numerous databases, data warehouses, data pipelines, and cloud services, streamlining development workflows through seamless integrations. (Python)