
Azure OpenAI

by kimtth
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile

# Use a Python image with uv pre-installed
FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim AS uv

# Set the working directory
WORKDIR /app

# Copy pyproject.toml (and the lock file, if available)
COPY pyproject.toml ./

# Copy the entire app directory
COPY . .

# Install the project's dependencies using uv
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-install-project --no-dev --no-editable

# Build the server
RUN --mount=type=cache,target=/root/.cache/uv \
    uv run fastmcp dev ./server/browser_navigator_server.py:app

FROM python:3.12-slim-bookworm

WORKDIR /app

COPY --from=uv /root/.local /root/.local
# The slim base image has no "app" user, so the virtual environment is copied with default (root) ownership
COPY --from=uv /app/.venv /app/.venv

# Place executables in the environment at the front of the path
ENV PATH="/app/.venv/bin:$PATH"

# Azure OpenAI settings from .env are supplied at runtime (e.g. docker run --env-file .env);
# the ENV instruction cannot expand shell commands at build time.

# Set the entrypoint command
ENTRYPOINT ["uv", "run", "fastmcp", "dev", "./server/browser_navigator_server.py:app"]

# Expose the necessary port
EXPOSE 5173
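
A minimal sketch of building and running the image. The image tag mcp-aoai-web-browsing and the presence of a local .env file holding the Azure OpenAI settings are assumptions, not part of the generated Dockerfile; --env-file injects those variables at container start and -p publishes the port named in EXPOSE:

docker build -t mcp-aoai-web-browsing .
docker run --rm --env-file .env -p 5173:5173 mcp-aoai-web-browsing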

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/kimtth/mcp-aoai-web-browsing'
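
Assuming the endpoint returns JSON (typical for a directory API, though not stated above), the response can be pretty-printed with jq for easier reading:

curl -s 'https://glama.ai/api/mcp/v1/servers/kimtth/mcp-aoai-web-browsing' | jq .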

If you have feedback or need assistance with the MCP directory API, please join our Discord server.