THU Agent by CyberCraze
Integrates with OpenAI-compatible APIs via the THU lab proxy, providing access to models like DeepSeek, Qwen, GLM, Kimi, and MiniMax for interactive terminal coding assistance and conversational AI capabilities.
No rate limit for THU students!
Interactive terminal coding agent powered by the THU lab proxy OpenAI-compatible API.
The agent runs in your current terminal, works in your current directory, can inspect files, propose shell commands, and wait for your approval before running them.
Platform Use
Linux
Use the built executable:

```
./dist/thu-agent
```

Linux executable path: `dist/thu-agent`

To run it globally, copy or symlink it into a directory on your PATH, for example:

```
sudo install -m 755 dist/thu-agent /usr/local/bin/thu-agent
```

Then run:

```
thu-agent
```

Windows
Use the Windows executable after building it on Windows:

```
.\dist\thu-agent.exe
```

Windows executable path: `dist\thu-agent.exe`

To run it globally on Windows, add the repo dist directory to your PATH, or copy the executable into a directory already on your PATH.

Example PowerShell command to add the current repo dist directory for your user:

```
[Environment]::SetEnvironmentVariable(
    "Path",
    $env:Path + ";C:\Users\USER\Downloads\THU-deepseek-glm-api-mcp-server\dist",
    "User"
)
```

Then open a new terminal and run:

```
thu-agent.exe
```

Build it from Windows with:

```
powershell -ExecutionPolicy Bypass -File .\build_agent_windows.ps1
```

macOS
There is no packaged macOS binary in this repo.

Run the Python entrypoint directly:

```
python3 agent.py
```

If you want a global command on macOS, create a small wrapper in /usr/local/bin or another directory on your PATH:

```
sudo ln -sf "/absolute/path/to/agent.py" /usr/local/bin/thu-agent.py
```

Or run the repo-local command directly from a shell alias.
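For the alias route, a line like the following in your shell profile works (the path is the same placeholder used above; substitute your actual checkout location):

```shell
# In ~/.zshrc or ~/.bashrc; replace the placeholder with your real checkout path
alias thu-agent='python3 /absolute/path/to/agent.py'
```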
API Setup
The agent uses the THU lab proxy.
Create an API key first at:
https://lab.cs.tsinghua.edu.cn/ai-platform/c/new

Base URL:
https://lab.cs.tsinghua.edu.cn/ai-platform/api/v1

Set your key with an environment variable:

```
export THU_LAB_PROXY_API_KEY='your_proxy_key_here'
export THU_LAB_PROXY_BASE_URL='https://lab.cs.tsinghua.edu.cn/ai-platform/api/v1'
```

On Windows PowerShell:

```
$env:THU_LAB_PROXY_API_KEY='your_proxy_key_here'
$env:THU_LAB_PROXY_BASE_URL='https://lab.cs.tsinghua.edu.cn/ai-platform/api/v1'
```

You can also launch the agent and paste the key when prompted. The agent saves it into a per-user global config file for reuse.
Config location:
Linux and macOS: `~/.thu-cybercraze-agent/.env`

Windows: `%USERPROFILE%\.thu-cybercraze-agent\.env`
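The agent writes that .env file itself; assuming it stores the same variable names used above, its contents would look roughly like:

```shell
# Assumed layout of ~/.thu-cybercraze-agent/.env -- the agent generates this file
THU_LAB_PROXY_API_KEY='your_proxy_key_here'
THU_LAB_PROXY_BASE_URL='https://lab.cs.tsinghua.edu.cn/ai-platform/api/v1'
```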
Start the Agent
From the repo root:

```
./dist/thu-agent
```

Or with Python:

```
python3 agent.py
```

You can also pass the model and key directly:

```
python3 agent.py --model deepseek-v3.2 --api-key "$THU_LAB_PROXY_API_KEY"
```

Model Selection
The startup picker shows the models currently wired into the agent.
Default model: `deepseek-v3.2`

Current supported models:

- qwen3-max-thinking
- qwen3-max
- glm-5
- glm-5-thinking
- glm-4.7-thinking
- kimi-k2.5
- kimi-k2.5-thinking
- minimax-m2.5
- minimax-m2.5-thinking
- qwen3.5-plus
- qwen3.5-plus-thinking
- qwen3.5-mini
- deepseek-v3.2-thinking
- deepseek-v3.2
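The picker's fallback-to-default logic can be sketched in a few lines of Python; the list and default mirror the ones above, but the function name is illustrative, not the agent's actual API:

```python
# Illustrative sketch of a startup model picker (not the agent's real code).
SUPPORTED_MODELS = [
    "qwen3-max-thinking", "qwen3-max",
    "glm-5", "glm-5-thinking", "glm-4.7-thinking",
    "kimi-k2.5", "kimi-k2.5-thinking",
    "minimax-m2.5", "minimax-m2.5-thinking",
    "qwen3.5-plus", "qwen3.5-plus-thinking", "qwen3.5-mini",
    "deepseek-v3.2-thinking", "deepseek-v3.2",
]
DEFAULT_MODEL = "deepseek-v3.2"

def pick_model(choice: str) -> str:
    """Return the chosen model, falling back to the default on empty input."""
    choice = choice.strip()
    if not choice:
        return DEFAULT_MODEL
    if choice not in SUPPORTED_MODELS:
        raise ValueError(f"unsupported model: {choice}")
    return choice
```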
In-Agent Commands
Slash commands available in the session:
- /help
- /sessions
- /load <id|name>
- /fork <id|name> [new-name]
- /new [name]
- /delete <id|name>
- /update
- /model
- /key
- /pwd
- /alwaysRun
- /exit
At startup, the agent compares its embedded version with the GitHub VERSION file. If a newer release exists, it shows a short reminder to run /update.
/update behavior:

- Linux: clones the GitHub repo to a temporary directory, rebuilds the binary, installs it to the current executable path or /usr/local/bin/thu-agent, and removes the temporary clone. If the install target needs elevated permissions, run the agent with appropriate privileges or update manually.
- Windows: stages a post-exit replacement of the running .exe after rebuilding from a temporary clone, then exits so the replacement can complete.
While the agent is thinking or running a command, press Ctrl+C to cancel the current operation and return to the prompt without exiting the whole session.
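That cancel-without-exit behavior is the standard try/except KeyboardInterrupt pattern; a minimal sketch (not the agent's actual code):

```python
# Sketch: cancel the current operation on Ctrl+C without leaving the session.
def run_step(operation):
    """Run one operation; return its result, or None if the user cancelled."""
    try:
        return operation()
    except KeyboardInterrupt:
        # Ctrl+C raises KeyboardInterrupt; swallow it and return to the prompt.
        print("cancelled, back to prompt")
        return None
```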
Typical Workflow

1. Start the agent.
2. Choose a model or press Enter for the default.
3. Reuse the saved API key or paste a new one.
4. Type requests at the `>` prompt.
5. Approve commands when the agent asks.
Example prompts:
- list the files in this directory
- write a hello world script in python
- inspect this project and explain how to run it
- create a small bash script that prints the current date
Command Approval
By default, the agent asks before running each command.
To auto-approve commands for the current session:

```
/alwaysRun
```

Use that carefully.
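The approval gate can be pictured as a simple check before each command; a sketch assuming a session flag toggled by /alwaysRun (names are illustrative):

```python
# Sketch of per-command approval; `always_run` stands in for the /alwaysRun flag.
def should_run(command: str, always_run: bool, ask=input) -> bool:
    """Return True if the command may execute, defaulting to 'no' on Enter."""
    if always_run:
        return True
    answer = ask(f"run `{command}`? [y/N] ")
    return answer.strip().lower() in ("y", "yes")
```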
Build
Linux build
```
bash build_agent.sh
```

Result: `dist/thu-agent`

This build uses the current Python environment and PyInstaller, with extra excludes plus strip/optimize enabled to keep the binary smaller.
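A PyInstaller invocation of that shape would look roughly like the following; the exact flag set is an assumption, and the real one lives in build_agent.sh:

```shell
# Illustrative PyInstaller command; see build_agent.sh for the actual flags.
pyinstaller --onefile --strip \
  --exclude-module tkinter \
  --exclude-module unittest \
  --name thu-agent agent.py
```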
Windows build
Run this on Windows, not inside WSL:
```
py -3 -m pip install pyinstaller
powershell -ExecutionPolicy Bypass -File .\build_agent_windows.ps1
```

Result: `dist\thu-agent.exe`

macOS run path

macOS users should run the Python entrypoint directly:

```
python3 agent.py
```

Direct API Test
You can test the proxy directly:
```
curl --location --request POST \
  'https://lab.cs.tsinghua.edu.cn/ai-platform/api/v1/chat/completions' \
  --header 'Content-Type: application/json' \
  --header "authorization: Bearer $THU_LAB_PROXY_API_KEY" \
  --data-raw '{
    "model": "deepseek-v3.2",
    "messages": [{"role": "user", "content": "Reply with exactly: ok"}],
    "temperature": 0.2,
    "repetition_penalty": 1.1,
    "stream": false
  }'
```

Notes
- The Linux binary is already buildable from this repo.
- The Windows .exe must be built from a Windows Python environment.
- macOS users should run agent.py directly unless they package it themselves.
- The MCP server code in server.py still uses the older backend and is separate from the interactive agent in agent.py.
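The direct API test shown earlier can also be run from Python instead of curl; a sketch assuming the standard `openai` client package is installed (it cannot be verified without a live key, so treat it as illustrative):

```python
# Python equivalent of the curl proxy test above, via the OpenAI-compatible client.
import os
from openai import OpenAI

def test_proxy(prompt: str = "Reply with exactly: ok") -> str:
    """Send one chat completion through the THU lab proxy and return the reply."""
    client = OpenAI(
        api_key=os.environ["THU_LAB_PROXY_API_KEY"],
        base_url=os.environ.get(
            "THU_LAB_PROXY_BASE_URL",
            "https://lab.cs.tsinghua.edu.cn/ai-platform/api/v1",
        ),
    )
    resp = client.chat.completions.create(
        model="deepseek-v3.2",
        messages=[{"role": "user", "content": prompt}],
        temperature=0.2,
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    print(test_proxy())
```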