Customized MCP Project

by MorvanZhou

This project leverages the mcp library with CLI support and integrates with OpenAI's API.

Requirements

Make sure to install the required dependencies before running the project:

pip install -r requirements.txt

Usage

  1. Configure your OpenAI API key as an environment variable:

    export OPENAI_API_KEY="your-api-key"
  2. Start the MCP server:

    python server.py
  3. Use the client to interact with the server:

    python client.py
  4. Alternatively, use the orchestrator to query the LLM and tools:

    python main.py

Example

Querying the Weather Tool

Run the client and call the get_weather tool:

python client.py

Example interaction:

You: List tools
Assistant: {
  "tools": [
    {
      "name": "get_weather",
      "description": "Get weather for a city",
      "parameters": {
        "city": {
          "type": "string",
          "description": "Name of the city"
        }
      }
    }
  ]
}
You: Call get_weather with {"city": "Beijing"}
Assistant: The weather in Beijing is sunny
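Under the hood, MCP carries these interactions as JSON-RPC 2.0 messages. A hypothetical illustration of the request the client sends when you call the tool above (the method and field names follow the MCP tools/call convention; the project's actual client code may differ):

```python
import json

# JSON-RPC 2.0 request an MCP client sends to invoke get_weather.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"city": "Beijing"},
    },
}
print(json.dumps(request, indent=2))
```

The server replies with a matching-id response whose result holds the tool's output.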

Dependencies

  • openai==1.70.0

  • mcp[cli]==1.6.0

License

This project is licensed under the MIT License.
