
TianGong-AI-MCP

中文 | English

The TianGong AI Model Context Protocol (MCP) Local Server supports the Streamable HTTP protocol.
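The Streamable HTTP transport carries JSON-RPC messages over HTTP POST. Once the server is running (see the start instructions below), a handshake can be sketched with curl. This is a hedged sketch: the port 9279 is taken from the Docker section of this README, but the `/mcp` endpoint path and the `protocolVersion` value are assumptions and may differ for this server.

```shell
# Sketch of an MCP Streamable HTTP initialize request.
# Assumptions: port 9279 (published in the Docker section); the /mcp path
# and protocolVersion are NOT confirmed by this README.
curl -s -X POST http://localhost:9279/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"curl","version":"0.0.0"}}}' \
  || echo 'server not reachable'
```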

Starting MCP Server

Streamable HTTP Server

# Install the MCP server package globally
npm install -g @tiangong-ai/mcp-server-local

# Load environment variables from .env and launch the Streamable HTTP server
npx dotenv -e .env -- \
npx -y -p @tiangong-ai/mcp-server-local tiangong-ai-mcp-http
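The command above loads environment variables from a local `.env` file. The variables this server actually requires are not listed in this README, so the sketch below uses placeholder names only; replace them with the real keys for your deployment.

```shell
# Hypothetical .env sketch. Variable names here are placeholders, NOT the
# real keys required by @tiangong-ai/mcp-server-local (not documented here).
PORT=9279
API_KEY=your-api-key-here
```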

# Optional: install the pm2 process manager to run the server in the background
npm i -g pm2

# Start the server under pm2 with timestamped logs
pm2 start "npx --no-install tiangong-ai-mcp-http" --name tiangong-mcp-local --time

# Restart, stop, or follow the logs of the managed process
pm2 restart tiangong-mcp-local
pm2 stop tiangong-mcp-local
pm2 logs tiangong-mcp-local

# Remove the process from pm2's process list
pm2 delete tiangong-mcp-local

# Show the status of all pm2-managed processes
pm2 status
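For scripting, `pm2 jlist` (pm2's JSON process list) is easier to parse than `pm2 status`. A minimal liveness sketch, guarded so it degrades gracefully on machines without pm2:

```shell
# Check whether the tiangong-mcp-local process is registered with pm2.
# Guarded: prints a message instead of failing when pm2 is not installed.
if command -v pm2 >/dev/null 2>&1; then
  pm2 jlist | grep -q 'tiangong-mcp-local' \
    && echo 'tiangong-mcp-local is registered with pm2' \
    || echo 'tiangong-mcp-local is not registered'
else
  echo 'pm2 is not installed'
fi
```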

Using Docker

# Build MCP server image using Dockerfile (optional)
docker build -t linancn/tiangong-ai-mcp-server-local:0.0.1 .

# Pull MCP server image
docker pull linancn/tiangong-ai-mcp-server-local:0.0.1

# Start MCP server using Docker
docker run -d \
    --name tiangong-ai-mcp-server-local \
    --publish 9279:9279 \
    --env-file .env \
    linancn/tiangong-ai-mcp-server-local:0.0.1
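To inspect the container started above, the standard docker CLI applies. A non-destructive status sketch, guarded so it degrades gracefully on machines without docker:

```shell
# Show whether the container is running and tail its recent logs.
# Guarded: prints a message when docker (or its daemon) is unavailable.
if command -v docker >/dev/null 2>&1; then
  docker ps --filter name=tiangong-ai-mcp-server-local 2>/dev/null \
    || echo 'docker daemon not reachable'
  docker logs --tail 20 tiangong-ai-mcp-server-local 2>/dev/null || true
else
  echo 'docker is not installed'
fi
```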