Why this server?
This server can be run locally with Ngrok tunneling or hosted in an Ubuntu 24 Docker container.
Why this server?
This server connects Blender with local AI models via the Model Context Protocol, allowing users to control Blender using natural language prompts for 3D modeling tasks.
Why this server?
This server executes code inside a Docker container.
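The general pattern behind this kind of server is easy to sketch: hand the snippet to a throwaway container via the Docker CLI and return whatever it prints. The image name, network restriction, and helper name below are illustrative assumptions, not this project's actual implementation.

```python
import subprocess

def run_in_container(code: str, image: str = "python:3.12-slim", timeout: int = 30) -> str:
    """Execute a Python snippet inside a disposable Docker container.

    The container is removed afterwards (--rm) and gets no network access,
    so the snippet stays isolated from the host. Image and limits are
    illustrative defaults, not this server's actual configuration.
    """
    result = subprocess.run(
        ["docker", "run", "--rm", "--network", "none", image, "python", "-c", code],
        capture_output=True, text=True, timeout=timeout,
    )
    return result.stdout if result.returncode == 0 else result.stderr

if __name__ == "__main__":
    print(run_in_container("print(2 + 2)"))
```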
Why this server?
A Model Context Protocol server that enables Claude to access Deepseek models running in Docker.
Why this server?
A simple MCP Server with shell execution capabilities that can be run locally with Ngrok tunneling or hosted in an Ubuntu 24 Docker container.
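As a rough sketch of what a shell-execution MCP server looks like, the official MCP Python SDK's FastMCP helper can expose a shell tool in a few lines; the server and tool names here are illustrative and not taken from the project above.

```python
import subprocess
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("shell-exec")  # server name is illustrative

@mcp.tool()
def run_shell(command: str, timeout: int = 30) -> str:
    """Run a shell command and return its combined stdout/stderr."""
    result = subprocess.run(
        command, shell=True, capture_output=True, text=True, timeout=timeout
    )
    return result.stdout + result.stderr

if __name__ == "__main__":
    # Defaults to the stdio transport; an HTTP-based transport would be the
    # piece exposed through an Ngrok tunnel or a Docker-hosted deployment.
    mcp.run()
```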
Why this server?
An MCP server that can run Kubernetes commands against a given kubeconfig path and provide an interpretation of the output.
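A hedged sketch of the core idea, wrapping the kubectl CLI's `--kubeconfig` flag in a FastMCP tool; the tool name and defaults are assumptions rather than this server's real interface, and interpreting the output is left to the LLM client.

```python
import os
import shlex
import subprocess
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("kubectl")  # illustrative name

@mcp.tool()
def kubectl(args: str, kubeconfig: str = "~/.kube/config") -> str:
    """Run a kubectl command against the cluster described by `kubeconfig`.

    `args` is everything after `kubectl`, e.g. "get pods -n default".
    The calling LLM is expected to summarise or interpret the raw output.
    """
    cmd = ["kubectl", "--kubeconfig", os.path.expanduser(kubeconfig)] + shlex.split(args)
    result = subprocess.run(cmd, capture_output=True, text=True, timeout=60)
    return result.stdout if result.returncode == 0 else result.stderr

if __name__ == "__main__":
    mcp.run()
```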
Why this server?
The first open-source MCP server that enables AI to fully control remote macOS systems.
Why this server?
A server that enables LLMs to execute Python code in a specified Conda environment, providing access to the libraries and dependencies required for efficient code execution.
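Conda's `conda run -n <env>` subcommand is a plausible building block for this kind of server: it runs the snippet with that environment's interpreter and installed packages. The sketch below assumes that approach and uses illustrative names, not the project's actual code.

```python
import subprocess
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("conda-python")  # illustrative name

@mcp.tool()
def run_python(code: str, env_name: str = "base") -> str:
    """Execute a Python snippet inside the named Conda environment.

    `conda run -n <env>` resolves that environment's interpreter, so the
    snippet can import whatever libraries are installed there.
    """
    result = subprocess.run(
        ["conda", "run", "-n", env_name, "python", "-c", code],
        capture_output=True, text=True, timeout=120,
    )
    return result.stdout + result.stderr

if __name__ == "__main__":
    mcp.run()
```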
Why this server?
A server implementation that enables remote Python code execution in Unreal Engine environments, featuring automatic Unreal node discovery and management through a multicast network.
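Multicast discovery in general looks like the sketch below: nodes announce themselves on a shared group address, and the server records which peers answered. The group address, port, and announcement handling here are placeholders, not the values or format this server actually uses.

```python
import socket
import struct

MCAST_GROUP = "239.0.0.1"   # placeholder multicast group, not the project's real value
MCAST_PORT = 6766           # placeholder port

def discover_nodes(wait_seconds: float = 2.0) -> list[str]:
    """Listen on a multicast group and collect addresses of announcing nodes."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", MCAST_PORT))
    # Join the multicast group on all interfaces.
    mreq = struct.pack("4sl", socket.inet_aton(MCAST_GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    sock.settimeout(wait_seconds)

    nodes = []
    try:
        while True:
            _data, (addr, _port) = sock.recvfrom(4096)
            nodes.append(addr)  # each announcement reveals a reachable node
    except socket.timeout:
        pass
    finally:
        sock.close()
    return nodes
```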
Why this server?
A powerful MCP (Model Context Protocol) server that enables seamless container and Compose stack management through Claude AI.
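A minimal sketch of how container and Compose stack management might be exposed as MCP tools, again via FastMCP wrapping the Docker CLI; the tool names and flags shown are assumptions, not this project's actual API.

```python
import subprocess
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("docker-manager")  # illustrative name

def _run(cmd: list[str]) -> str:
    """Run a Docker CLI command and return its output or error text."""
    result = subprocess.run(cmd, capture_output=True, text=True, timeout=120)
    return result.stdout if result.returncode == 0 else result.stderr

@mcp.tool()
def list_containers(show_all: bool = False) -> str:
    """List running containers (or all containers when show_all is True)."""
    return _run(["docker", "ps"] + (["--all"] if show_all else []))

@mcp.tool()
def compose_up(project_dir: str) -> str:
    """Bring up the Compose stack in the given project directory, detached."""
    return _run(["docker", "compose", "--project-directory", project_dir, "up", "-d"])

if __name__ == "__main__":
    mcp.run()
```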