Azure OpenAI

  • Cloud Platforms
Python
security: - (not tested)
license: F (not found)
quality: - (not tested)

A minimal server/client application implementation utilizing the Model Context Protocol (MCP) and Azure OpenAI.

  1. Tools
  2. Prompts
  3. Resources
  4. Server Configuration
  5. README.md

Prompts

Interactive templates invoked by user choice

Name | Description

No prompts

Resources

Contextual data attached and managed by the client

Name | Description

No resources

Tools

Functions exposed to the LLM to take actions

Name | Description

No tools

Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default
AZURE_OPEN_AI_API_KEY | Yes | The API key for Azure OpenAI | -
AZURE_OPEN_AI_ENDPOINT | Yes | The endpoint URL for Azure OpenAI | -
AZURE_OPEN_AI_API_VERSION | Yes | The API version for Azure OpenAI | -
AZURE_OPEN_AI_DEPLOYMENT_MODEL | Yes | The deployment model for Azure OpenAI | -
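
Since all four variables are required, it can help to validate them at startup. A minimal sketch (the helper name is hypothetical, not part of this repository):

```python
import os

REQUIRED_VARS = [
    "AZURE_OPEN_AI_API_KEY",
    "AZURE_OPEN_AI_ENDPOINT",
    "AZURE_OPEN_AI_API_VERSION",
    "AZURE_OPEN_AI_DEPLOYMENT_MODEL",
]

def load_azure_openai_config(env=None):
    """Collect the required settings, failing fast when any are missing."""
    env = os.environ if env is None else env
    missing = [name for name in REQUIRED_VARS if not env.get(name)]
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
    return {name: env[name] for name in REQUIRED_VARS}
```

Failing fast with an explicit list of missing names is friendlier than letting an Azure OpenAI call fail later with an opaque authentication error.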
README.md

MCP Server & Client implementation for using Azure OpenAI

  • A minimal server/client application implementation utilizing the Model Context Protocol (MCP) and Azure OpenAI.
    1. The MCP server is built with FastMCP.
    2. The client, generated with the help of ChatGPT, calls the MCP server.
    3. The tool invoked on the server is implemented using Playwright (a web testing and automation framework created by Microsoft).
    4. The MCP server's tool descriptions are converted to the OpenAI function-calling format.
    5. The bridge that performs this conversion customises the implementation from the MCP-LLM Bridge. To ensure a stable connection, the server object is passed directly into the bridge.
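
As a rough illustration of the conversion step, an MCP tool definition (name, description, inputSchema) can be wrapped in the OpenAI function-calling ("tools") shape like this (a sketch only; the actual MCP-LLM Bridge implementation differs):

```python
def mcp_tool_to_openai(name, description, input_schema):
    """Wrap an MCP tool definition in the OpenAI function-calling format."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {
                "type": "object",
                "properties": input_schema.get("properties", {}),
                "required": input_schema.get("required", []),
            },
        },
    }

spec = mcp_tool_to_openai(
    "playwright_navigate",
    "Navigate to a URL.",
    {"properties": {"url": {"title": "Url", "type": "string"}}, "required": ["url"]},
)
```

The resulting dictionary can be passed in the `tools` parameter of an OpenAI chat-completions request, letting the model decide when to call the MCP tool.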

Model Context Protocol (MCP)

MCP (Model Context Protocol) is an open protocol that enables secure, controlled interactions between AI applications and local or remote resources.

Related Projects

  • FastMCP: The fast, Pythonic way to build MCP servers.
  • Chat MCP: MCP client
  • MCP-LLM Bridge: MCP implementation that enables communication between MCP servers and OpenAI-compatible LLMs

MCP Playwright

Configuration

During the development phase (December 2024), this Python project should be initialized with 'uv'. Other dependency-management tools, such as 'pip' and 'poetry', are not yet fully supported by the CLI.

  1. Rename .env.template to .env, then fill in the values in .env for Azure OpenAI:
    AZURE_OPEN_AI_ENDPOINT=
    AZURE_OPEN_AI_API_KEY=
    AZURE_OPEN_AI_DEPLOYMENT_MODEL=
    AZURE_OPEN_AI_API_VERSION=
  2. Execute chatgui.py
    • The sample screen shows the client launching a browser to navigate to the URL.
    <img alt="chatgui" src="doc/chatgui_gpt_generate.png" width="300"/>

w.r.t. 'stdio'

stdio is a transport layer (raw data flow), while JSON-RPC is an application-level protocol (structured communication). They are distinct layers that are often combined, e.g., "JSON-RPC over stdio" in protocols such as MCP.
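
For example, an MCP request sent over stdio is a JSON-RPC 2.0 message written to the server process's stdin (a simplified sketch; a real MCP client also performs an initialization handshake first):

```python
import json

def make_jsonrpc_request(method, params, request_id=1):
    """Build the JSON-RPC 2.0 envelope that then travels over the stdio transport."""
    return json.dumps(
        {"jsonrpc": "2.0", "id": request_id, "method": method, "params": params}
    )

# A tool invocation as it would be written to the server's stdin:
line = make_jsonrpc_request(
    "tools/call",
    {"name": "playwright_navigate", "arguments": {"url": "https://example.com"}},
)
```

The transport only moves bytes; the JSON-RPC envelope (`jsonrpc`, `id`, `method`, `params`) is what gives the exchange its request/response structure.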

Tool description

    @self.mcp.tool()
    async def playwright_navigate(url: str, timeout=30000, wait_until="load"):
        """Navigate to a URL."""

The docstring provides a description, which may be used in a mechanism similar to function calling in LLMs.

Output:

    Tool(name='playwright_navigate', description='Navigate to a URL.',
         inputSchema={'properties': {'url': {'title': 'Url', 'type': 'string'},
                      'timeout': {'default': 30000, 'title': 'timeout', 'type': 'string'} ...
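
FastMCP derives the inputSchema from the function signature. The idea can be sketched with the standard library alone (an approximation, not FastMCP's actual code; untyped parameters fall back to "string", mirroring the output above):

```python
import inspect

def build_input_schema(func):
    """Approximate a JSON schema from a Python signature, as a tool framework might."""
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    props, required = {}, []
    for name, param in inspect.signature(func).parameters.items():
        ann = param.annotation if param.annotation is not inspect.Parameter.empty else str
        prop = {"title": name, "type": type_map.get(ann, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)          # no default -> caller must supply it
        else:
            prop["default"] = param.default
        props[name] = prop
    return {"properties": props, "required": required}

async def playwright_navigate(url: str, timeout=30000, wait_until="load"):
    """Navigate to a URL."""

schema = build_input_schema(playwright_navigate)
```

Only `url` ends up in `required`, because `timeout` and `wait_until` carry defaults.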

uv

pip install uv
  • uv run: Run a script.
  • uv venv: Create a new virtual environment. By default, '.venv'.
  • uv add --script: Add a dependency to a script.
  • uv remove --script: Remove a dependency from a script.
  • uv sync: Sync (install) the project's dependencies with the environment.

Tip

  • taskkill command for python.exe:
    taskkill /IM python.exe /F
  • VS Code Python Debugger: debugging with launch.json starts the debugger using the configuration from .vscode/launch.json.
<!-- ### Sample query Navigate to website http://eaapp.somee.com and click the login link. In the login page, enter the username and password as "admin" and "password" respectively and perform login. Then click the Employee List page and click "Create New" button and enter realistic employee details to create for Name, Salary, DurationWorked, Select dropdown for Grade as CLevel and Email. -->

GitHub Badge

Glama performs regular codebase and documentation scans to:

  • Confirm that the MCP server is working as expected.
  • Confirm that there are no obvious security issues with dependencies of the server.
  • Extract server characteristics such as tools, resources, prompts, and required parameters.

Our directory badge helps users quickly assess whether the MCP server is safe, what its capabilities are, and how to install it.

Copy the following code to your README.md file:

Alternative MCP servers

  • security: - | license: - | quality: -
    A Model Context Protocol (MCP) server for managing Microsoft 365 environments through Claude or other AI assistants. This server provides comprehensive management capabilities for users, groups, teams, SharePoint, and device management in Microsoft 365.
  • security: - | license: A (MIT) | quality: -
    A simple MCP server for interacting with OpenAI assistants. This server allows other tools (like Claude Desktop) to create and interact with OpenAI assistants through the Model Context Protocol.
  • security: A | license: A (MIT) | quality: A
    Model Context Protocol (MCP) server for Atlassian Cloud products (Confluence and Jira). This integration is designed specifically for Atlassian Cloud instances and does not support Atlassian Server or Data Center deployments.