# Teradata MCP Server

## Using with Open WebUI

[Open WebUI](https://github.com/open-webui/open-webui) is a user-friendly, self-hosted AI platform designed to operate entirely offline, supporting various LLM runners such as Ollama. It provides a convenient way to interact with LLMs and MCP servers from an intuitive GUI. It can be integrated with this MCP server using the REST endpoints.

Run the MCP server as a [REST server](./Rest_API.md), then install and start Open WebUI:

```
python -m venv ./env
source ./env/bin/activate

pip install open-webui
open-webui serve
```

Access the UI at http://localhost:8080.

To add the MCP tools, navigate to Settings > Tools > Add Connection, and enter your mcpo server connection details (e.g. `localhost:8001`, password = `top-secret` if you have executed the command line in the mcpo section).

You should be able to see the tools in the Chat Control Valves section on the right and get your models to use them.

You can now access the OpenAPI docs at: [http://localhost:8002/docs](http://localhost:8002/docs)

For more details on mcpo, see: https://github.com/open-webui/mcpo
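As a quick sanity check before connecting Open WebUI, the sketch below shows one way to start mcpo in front of the MCP server and confirm the proxied OpenAPI schema is reachable. The port and `top-secret` API key mirror the connection details above; the `teradata-mcp-server` command and the `uvx` launcher are assumptions here, so substitute the exact command line from the mcpo section of [Rest_API.md](./Rest_API.md).

```
# Minimal sketch: proxy the MCP server over REST with mcpo (port and key match the
# Open WebUI connection details above). The server command after "--" is an
# assumption; use the command documented in Rest_API.md.
uvx mcpo --port 8001 --api-key "top-secret" -- teradata-mcp-server

# Verify the proxy is up by fetching its OpenAPI schema; pass the key in case
# the proxy requires it for this route.
curl -H "Authorization: Bearer top-secret" http://localhost:8001/openapi.json
```

If the schema comes back, Open WebUI should accept the same host, port, and key under Settings > Tools > Add Connection.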
