Azure OpenAI
A minimal server/client application implementation utilizing the Model Context Protocol (MCP) and Azure OpenAI.
Prompts
Interactive templates invoked by user choice
Name | Description |
---|---|
No prompts |
Resources
Contextual data attached and managed by the client
Name | Description |
---|---|
No resources |
Tools
Functions exposed to the LLM to take actions
Name | Description |
---|---|
No tools |
Server Configuration
Describes the environment variables required to run the server.
Name | Required | Description | Default |
---|---|---|---|
AZURE_OPEN_AI_API_KEY | Yes | The API key for Azure OpenAI | |
AZURE_OPEN_AI_ENDPOINT | Yes | The endpoint URL for Azure OpenAI | |
AZURE_OPEN_AI_API_VERSION | Yes | The API version for Azure OpenAI | |
AZURE_OPEN_AI_DEPLOYMENT_MODEL | Yes | The deployment model for Azure OpenAI | |
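A minimal sketch of loading and validating these settings at startup, so a missing variable fails fast instead of surfacing later as an API error (the helper name is illustrative, not part of the project):

```python
import os

# Variable names as listed in the table above.
REQUIRED_VARS = [
    "AZURE_OPEN_AI_API_KEY",
    "AZURE_OPEN_AI_ENDPOINT",
    "AZURE_OPE" "N_AI_API_VERSION".replace("OPE" "N", "OPEN"),  # see note below
]
# (Written plainly:)
REQUIRED_VARS = [
    "AZURE_OPEN_AI_API_KEY",
    "AZURE_OPEN_AI_ENDPOINT",
    "AZURE_OPEN_AI_API_VERSION",
    "AZURE_OPEN_AI_DEPLOYMENT_MODEL",
]

def load_settings(env=os.environ):
    """Collect the required Azure OpenAI settings, failing fast on any missing value."""
    missing = [name for name in REQUIRED_VARS if not env.get(name)]
    if missing:
        raise RuntimeError(f"Missing required environment variables: {', '.join(missing)}")
    return {name: env[name] for name in REQUIRED_VARS}
```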
MCP Server & Client implementation for using Azure OpenAI
- The MCP server is built with FastMCP.
- The client code was generated with ChatGPT and calls the MCP server.
- The server exposes a function built with Playwright (the web testing and automation framework from Microsoft).
- The MCP server's tool listing is converted to the OpenAI function-calling format.
- The bridge that performs this conversion adapts the implementation from the MCP-LLM Bridge. To keep the connection stable, the server object is passed directly into the bridge.
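The conversion can be sketched as follows. The input shape (`name`, `description`, `inputSchema`) follows MCP's tool listing, and the output matches OpenAI's function-calling `tools` format; the helper name is illustrative, not the bridge's actual API:

```python
def mcp_tool_to_openai(tool: dict) -> dict:
    """Map one MCP tool definition to the OpenAI function-calling format."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP exposes a JSON Schema under `inputSchema`;
            # OpenAI expects the same schema under `parameters`.
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }
```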
Model Context Protocol (MCP)
MCP (Model Context Protocol) is an open protocol that enables secure, controlled interactions between AI applications and local or remote resources.
Related Projects
- FastMCP: The fast, Pythonic way to build MCP servers.
- Chat MCP: MCP client
- MCP-LLM Bridge: MCP implementation that enables communication between MCP servers and OpenAI-compatible LLMs
MCP Playwright
Configuration
As of the development phase in December 2024, the Python project should be initialized with 'uv'. Other dependency management tools, such as 'pip' and 'poetry', are not yet fully supported by the CLI.
- Rename `.env.template` to `.env`, then fill in the values in `.env` for Azure OpenAI:

```
AZURE_OPEN_AI_ENDPOINT=
AZURE_OPEN_AI_API_KEY=
AZURE_OPEN_AI_DEPLOYMENT_MODEL=
AZURE_OPEN_AI_API_VERSION=
```

- Execute `chatgui.py`
  - The sample screen shows the client launching a browser to navigate to the URL.
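For illustration, a small stdlib-only sketch of parsing such `KEY=VALUE` lines (the helper name is hypothetical; the project may instead use a library such as python-dotenv):

```python
def parse_env_file(text: str) -> dict:
    """Parse simple KEY=VALUE lines as found in a .env file,
    skipping blank lines and # comments."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # Strip optional surrounding quotes from the value.
        values[key.strip()] = value.strip().strip('"').strip("'")
    return values
```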
w.r.t. 'stdio'
`stdio` is a transport layer (raw data flow), while JSON-RPC is an application protocol (structured communication). They are distinct but often combined, as in "JSON-RPC over stdio".
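To make the distinction concrete, here is a sketch of a JSON-RPC 2.0 request as it might be written to an MCP server's stdin (`tools/list` is an MCP method; the helper function is illustrative):

```python
import json

def make_jsonrpc_request(method: str, params: dict, request_id: int) -> str:
    """Serialize a JSON-RPC 2.0 request. Over stdio, this string would be
    written to the server process's stdin; the structured envelope is
    JSON-RPC, while stdio merely carries the bytes."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    })

# Example: asking an MCP server for its tool listing.
line = make_jsonrpc_request("tools/list", {}, 1)
```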
Tool description
uv
Tip
- taskkill command for python.exe
- VS Code Python Debugger: debugging with launch.json starts the debugger using the configuration from `.vscode/launch.json`.
GitHub Badge
Glama performs regular codebase and documentation scans to:
- Confirm that the MCP server is working as expected.
- Confirm that there are no obvious security issues with dependencies of the server.
- Extract server characteristics such as tools, resources, prompts, and required parameters.
Our directory badge helps users quickly assess that the MCP server is safe, see its capabilities, and find instructions for installing it.
Copy the following code to your README.md file: