Customized MCP Server

Integrations

  • Integrates with OpenAI's API to provide LLM capabilities through the MCP server, so tools such as weather lookup can be called from the client interface.

Customized MCP Project

This project leverages the mcp library with CLI support and integrates with OpenAI's API.

Requirements

Make sure to install the required dependencies before running the project:

pip install -r requirements.txt

Usage

  1. Configure your OpenAI API key as an environment variable:
    export OPENAI_API_KEY="your-api-key"
  2. Start the MCP server:
    python server.py
  3. Use the client to interact with the server:
    python client.py
  4. Alternatively, use the orchestrator to query the LLM and tools:
    python main.py
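The orchestrator's tool-routing step (step 4) can be sketched in plain Python. This is an illustrative stand-in, not the project's actual code: the `TOOLS` registry and `call_tool` names are hypothetical, and the handler returns canned data instead of a real forecast. The registry format mirrors the tool schema shown in the example interaction below.

```python
def get_weather(city: str) -> str:
    """Stand-in weather lookup; a real server would query a weather API."""
    canned = {"Beijing": "Sunny in Beijing"}
    return canned.get(city, f"No forecast available for {city}")

# Hypothetical registry mapping tool names to their schema and handler.
TOOLS = {
    "get_weather": {
        "description": "Get weather for a city",
        "parameters": {"city": {"type": "string", "description": "Name of the city"}},
        "handler": get_weather,
    }
}

def call_tool(name: str, arguments: dict) -> str:
    """Dispatch a tool call by name after checking required arguments."""
    tool = TOOLS.get(name)
    if tool is None:
        raise KeyError(f"Unknown tool: {name}")
    missing = set(tool["parameters"]) - set(arguments)
    if missing:
        raise ValueError(f"Missing arguments: {missing}")
    return tool["handler"](**arguments)

print(call_tool("get_weather", {"city": "Beijing"}))
```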

Example

Querying the Weather Tool

Run the client and call the get_weather tool:

python client.py

Example interaction:

You: List tools
Assistant: {
  "tools": [
    {
      "name": "get_weather",
      "description": "Get weather for a city",
      "parameters": {
        "city": {"type": "string", "description": "Name of the city"}
      }
    }
  ]
}
You: Call get_weather with {"city": "Beijing"}
Assistant: 北京的天气是晴天 (The weather in Beijing is sunny)
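Because the tool listing is plain JSON, a client can check a call's arguments against it before sending. Here is a minimal standard-library sketch: the schema string is copied from the interaction above, while `validate_call` is a hypothetical helper, not part of the mcp library.

```python
import json

# Tool schema as returned by the "List tools" request above.
TOOLS_JSON = '''
{
  "tools": [
    {
      "name": "get_weather",
      "description": "Get weather for a city",
      "parameters": {
        "city": {"type": "string", "description": "Name of the city"}
      }
    }
  ]
}
'''

def validate_call(tools: dict, name: str, arguments: dict) -> bool:
    """Return True if `name` is a listed tool and `arguments` match its parameters."""
    for tool in tools["tools"]:
        if tool["name"] == name:
            return set(arguments) == set(tool["parameters"])
    return False

tools = json.loads(TOOLS_JSON)
print(validate_call(tools, "get_weather", {"city": "Beijing"}))  # True
print(validate_call(tools, "get_weather", {}))                   # False
```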

Dependencies

  • openai==1.70.0
  • mcp[cli]==1.6.0
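The `pip install -r requirements.txt` step in Requirements assumes a requirements.txt pinning exactly these versions:

```
openai==1.70.0
mcp[cli]==1.6.0
```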

License

This project is licensed under the MIT License.
