# Customized MCP Project

This project leverages the `mcp` library with CLI support and integrates with OpenAI's API. The MCP server exposes tools, such as weather information retrieval, that can be called through the client interface or queried via the LLM.
## Requirements

Make sure to install the required dependencies before running the project.
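A minimal install command, assuming the pinned versions listed under Dependencies and a plain `pip` setup (the project may instead ship a `requirements.txt`):

```bash
pip install "openai==1.70.0" "mcp[cli]==1.6.0"
```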
## Usage

- Configure your OpenAI API key as an environment variable (see the command sketch after this list).
- Start the MCP server.
- Use the client to interact with the server.
- Alternatively, use the orchestrator to query the LLM and tools.
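The commands below sketch these steps. The script names `server.py`, `client.py`, and `orchestrator.py` are assumptions for illustration; substitute the entry points that actually ship with this repository.

```bash
# Set the OpenAI API key for the current shell session
export OPENAI_API_KEY="sk-..."

# Start the MCP server (hypothetical entry point)
python server.py

# In another terminal, run the client against the server (hypothetical entry point)
python client.py

# Or let the orchestrator route a natural-language query through the LLM and tools (hypothetical entry point)
python orchestrator.py "What's the weather in Paris?"
```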
## Example

### Querying the Weather Tool

Run the client and call the `get_weather` tool:
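A hedged example of invoking the tool; the client script name and its command-line flags are assumptions, since the actual interface is not documented above:

```bash
# Hypothetical invocation: ask the client to call the get_weather tool for a city
python client.py --tool get_weather --args '{"city": "London"}'
```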
Example interaction:
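The transcript below is illustrative only; the exact prompts and tool output depend on how the server and its weather backend are implemented:

```
> What's the weather in London?
[tool call]   get_weather(city="London")
[tool result] Cloudy, 14°C
The weather in London is currently cloudy, around 14°C.
```

For reference, a tool like `get_weather` can be registered on the server side with the `mcp` library's FastMCP helper. This is a minimal sketch under that assumption, with a placeholder weather lookup, not a copy of this project's actual implementation:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

@mcp.tool()
def get_weather(city: str) -> str:
    """Return a short weather description for the given city."""
    # Placeholder lookup; a real implementation would call a weather API here.
    return f"Cloudy, 14°C in {city}"

if __name__ == "__main__":
    # Serve the tool over stdio so MCP clients can call it.
    mcp.run()
```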
## Dependencies

- `openai==1.70.0`
- `mcp[cli]==1.6.0`
## License

This project is licensed under the MIT License.