This server integrates with OpenAI's API to provide LLM capabilities through MCP, allowing tools such as weather information retrieval to be called from the client interface.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Customized MCP Server what's the weather like in Tokyo today?"
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Customized MCP Project
This project leverages the mcp library with CLI support and integrates with OpenAI's API.
Requirements
Make sure to install the required dependencies before running the project:
Related MCP server: Weather MCP Server
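Assuming a standard pip-based setup, the pinned versions listed under Dependencies below can be installed directly (the exact install method isn't specified in this README):

```shell
pip install "openai==1.70.0" "mcp[cli]==1.6.0"
```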
Usage
Configure your OpenAI API key as an environment variable:
```shell
export OPENAI_API_KEY="your-api-key"
```

Start the MCP server:

```shell
python server.py
```

Use the client to interact with the server:

```shell
python client.py
```

Alternatively, use the orchestrator to query the LLM and tools:

```shell
python main.py
```
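The repository's actual `main.py` isn't reproduced here, but the orchestration pattern it describes (the LLM decides whether to call a tool, and the orchestrator executes it) can be sketched with a stubbed LLM and tool. All names and behavior below are illustrative assumptions, not the project's real code:

```python
# Hypothetical sketch of an LLM/tool orchestration loop.

def get_weather(city: str) -> str:
    """Stub tool; a real server would query a weather API."""
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def fake_llm(prompt: str) -> dict:
    """Stand-in for an OpenAI call: decides whether a tool is needed."""
    if "weather" in prompt.lower():
        return {"tool": "get_weather", "args": {"city": "Tokyo"}}
    return {"answer": prompt}

def orchestrate(prompt: str) -> str:
    decision = fake_llm(prompt)
    if "tool" in decision:
        # Execute the requested tool with the arguments the LLM chose.
        return TOOLS[decision["tool"]](**decision["args"])
    return decision["answer"]

print(orchestrate("What's the weather like in Tokyo today?"))
```

In the real project, `fake_llm` would be replaced by a call to OpenAI's chat API with the server's tools exposed as tool definitions.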
Example
Querying the Weather Tool
Run the client and call the get_weather tool:
Example interaction:
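The interaction transcript isn't shown here, but the shape of the tool being called can be sketched as a stub. This is a hypothetical implementation; the project's real `get_weather` presumably queries a live weather API:

```python
def get_weather(city: str) -> str:
    """Hypothetical stub of the server's get_weather tool.

    Returns canned data so the call shape is visible; a real
    implementation would fetch current conditions from an API.
    """
    canned = {"Tokyo": "Sunny, 22°C", "London": "Rainy, 14°C"}
    return canned.get(city, f"No data for {city}")

print(get_weather("Tokyo"))  # prints: Sunny, 22°C
```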
Dependencies
```
openai==1.70.0
mcp[cli]==1.6.0
```
License
This project is licensed under the MIT License.