MCP Multi-Service Agent
A collection of Model Context Protocol (MCP) servers providing various AI-powered services, designed to work with Hugging Face's tiny-agents framework.
Features
Core Services
Weather Service: Get weather information for any location
Web Search: Search the web for information (with special support for Hugging Face inference providers)
Sentiment Analysis: Analyze text sentiment with polarity and subjectivity scores
Integration Features
AI Agent Integration: Works seamlessly with tiny-agents for conversational AI
MCP Inspector Support: Debug and inspect server capabilities
Gradio Web Interface: Interactive web UI for sentiment analysis
Prerequisites
Python 3.10 or higher
uv package manager
Node.js (for MCP inspector)
Hugging Face account (for tiny-agents)
Installation
Clone or download this project:

```bash
git clone https://github.com/Deon62/mcp.git
cd mcps
```

Install Python dependencies:

```bash
uv pip install mcp[cli] requests
```

Install tiny-agents (if not already installed):

```bash
pip install tiny-agents
```

Install sentiment analysis dependencies:

```bash
cd mcp-sentiment
python -m venv venv
venv\Scripts\activate      # On Windows
# source venv/bin/activate # On Linux/Mac
pip install -r requirements.txt
```
Quick Start
1. Run the MCP Server
Start the MCP server in one terminal:
```bash
uv run --with mcp mcp run server.py
```

The server will start and wait for connections.
2. Run the AI Agent
In another terminal, start the agent:
```bash
tiny-agents run agent.json
```

You should see:

```
Agent loaded with 3 tools:
• get_weather
• web_search
• get_hf_inference_providers
```
3. Run Sentiment Analysis Web App
For the sentiment analysis service, run the Gradio web interface:
```bash
cd mcp-sentiment
venv\Scripts\activate      # On Windows
# source venv/bin/activate # On Linux/Mac
python app.py
```

The web interface will be available at http://localhost:7860.
4. Chat with the Agent
Once the agent is running, you can interact with it:
» Hello! Can you help me find information about Hugging Face inference providers?

Available Tools
1. Weather Service
» What's the weather like in New York?

2. Web Search

» Search for "Hugging Face inference providers"

3. HF Inference Providers

» Get me the list of Hugging Face inference providers

4. Sentiment Analysis

» Analyze the sentiment of this text: "I love this new product!"

Web Interface Features:
Polarity Score: -1 (negative) to +1 (positive)
Subjectivity Score: 0 (objective) to 1 (subjective)
Assessment: Positive, Negative, or Neutral classification
Real-time Analysis: Instant sentiment analysis as you type
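The assessment label follows from the polarity score. As a rough sketch of that mapping (the function name and the 0.1 neutral band are assumptions for illustration, not taken from app.py):

```python
def classify_sentiment(polarity: float, threshold: float = 0.1) -> str:
    """Map a TextBlob-style polarity score (-1..+1) to a label.

    The 0.1 neutral band is an assumed cutoff; app.py may use
    different thresholds.
    """
    if polarity > threshold:
        return "positive"
    if polarity < -threshold:
        return "negative"
    return "neutral"


print(classify_sentiment(0.8))   # positive
print(classify_sentiment(-0.5))  # negative
print(classify_sentiment(0.0))   # neutral
```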
Configuration
Agent Configuration (agent.json)
```json
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "command": "uv",
      "args": ["run", "--with", "mcp", "mcp", "run", "server.py"]
    }
  ]
}
```

Server Configuration (server.py)
The server provides three main tools:
`get_weather(location)` - Returns weather information
`web_search(query)` - Performs web searches
`get_hf_inference_providers()` - Returns a comprehensive list of HF inference providers
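Before registration with `@mcp.tool()`, the three tools can be pictured as plain Python functions with these shapes. This is a sketch with placeholder return values, not the real lookups in server.py:

```python
def get_weather(location: str) -> str:
    # Placeholder: the real server.py would query a weather source here.
    return f"Weather for {location}: 22°C, clear skies (sample data)"


def web_search(query: str) -> str:
    # Placeholder: the real server.py would call a search backend here.
    return f"Top results for '{query}' (sample data)"


def get_hf_inference_providers() -> list[str]:
    # A few of the providers the server reports; see the full
    # "Hugging Face Inference Providers" section of this README.
    return ["Inference Endpoints", "Together AI", "Replicate", "Modal"]
```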
MCP Inspector Setup
The MCP Inspector allows you to debug and test your MCP server directly.
1. Install MCP Inspector
```bash
npm install -g @modelcontextprotocol/inspector
```

2. Run the Inspector

```bash
mcp-inspector
```

3. Connect to Your Server

In the inspector:
Click "Add Server"
Choose "stdio" transport
Set command: `uv`
Set args: `["run", "--with", "mcp", "mcp", "run", "server.py"]`
Click "Connect"
4. Test Tools
Once connected, you can:
View available tools in the sidebar
Test each tool with different parameters
See the JSON-RPC communication
Debug any issues
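Over the stdio transport, the inspector (and the agent) exchange JSON-RPC 2.0 messages with the server, one JSON object per line. A `tools/call` request for `get_weather` looks roughly like this (the `id` value is arbitrary):

```python
import json

# A JSON-RPC 2.0 request a client sends to invoke a tool over stdio.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"location": "Tokyo"},
    },
}

# Messages are serialized one per line on the server's stdin.
line = json.dumps(request)
print(line)
```

This is the traffic the inspector's JSON-RPC view displays; comparing it against the tool names listed in the sidebar is a quick way to spot mismatched configurations.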
Example Usage
Weather Queries
» What's the weather in Tokyo?
» Get weather for London
» How's the weather in San Francisco?

Web Search Queries
» Search for "latest AI developments"
» Find information about "MCP protocol"
» Look up "Hugging Face inference providers"

Specific HF Provider Queries
» Show me all Hugging Face inference providers
» What inference providers does HF support?
» List the available HF deployment options

Sentiment Analysis Examples
» Analyze the sentiment of: "This product is amazing and I love it!"
» What's the sentiment of: "I'm not sure about this decision"
» Check the sentiment of: "The weather is okay today"

Example Output:

```json
{
  "polarity": 0.8,
  "subjectivity": 0.9,
  "assessment": "positive"
}
```

Troubleshooting
Common Issues
"ModuleNotFoundError: No module named 'mcp'"
```bash
uv pip install mcp[cli]
```

"KeyError: 'command'"

Check your agent.json configuration
Ensure the server configuration is correct
"Connection closed" errors
Make sure the MCP server is running
Check that all dependencies are installed
Agent shows "0 tools"
Verify the server is running
Check the agent.json configuration
Ensure the server command is correct
Sentiment Analysis Import Errors
Ensure virtual environment is activated
Install NLTK data:

```bash
python -c "import nltk; nltk.download('punkt'); nltk.download('brown')"
```

Check TextBlob installation:

```bash
python -c "from textblob import TextBlob; print('OK')"
```
Gradio Interface Issues
Update Gradio:

```bash
pip install --upgrade gradio
```

Check for port conflicts (default: 7860)
Verify MCP server parameter compatibility
Debug Steps
Test the server directly:

```bash
python server.py
```

Check the MCP server with the inspector:

```bash
mcp-inspector
```

Verify dependencies:

```bash
uv pip list | grep mcp
```

Test sentiment analysis:

```bash
cd mcp-sentiment
venv\Scripts\activate
python -c "from textblob import TextBlob; print(TextBlob('Hello world').sentiment)"
```
Project Structure
```
mcps/
├── server.py            # Main MCP server implementation
├── agent.json           # Agent configuration
├── requirements.txt     # Python dependencies
├── uv.lock              # Dependency lock file
├── app.py               # Main application entry point
├── mcp-sentiment/       # Sentiment analysis service
│   ├── app.py           # Gradio web interface for sentiment analysis
│   ├── requirements.txt # Sentiment analysis dependencies
│   └── venv/            # Virtual environment for sentiment analysis
└── README.md            # This file
```

Development
Adding New Tools
To add a new tool to the server:
```python
@mcp.tool()
def your_new_tool(param: str) -> str:
    """Description of what this tool does"""
    return f"Result for {param}"
```

Modifying Agent Configuration
Edit agent.json to:
Change the AI model
Add more MCP servers
Modify server configurations
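For example, a second stdio server can be added alongside the existing one. A sketch of the resulting agent.json (the path `other_server.py` is a hypothetical placeholder):

```json
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "command": "uv",
      "args": ["run", "--with", "mcp", "mcp", "run", "server.py"]
    },
    {
      "type": "stdio",
      "command": "uv",
      "args": ["run", "--with", "mcp", "mcp", "run", "other_server.py"]
    }
  ]
}
```

After restarting, `tiny-agents run agent.json` should report the combined tool count from both servers.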
Hugging Face Inference Providers
The server includes comprehensive information about HF inference providers:
Amazon SageMaker - Serverless inference with custom Inferentia2 chips
Novita AI - Integrated serverless inference directly on model pages
Together AI - Serverless inference with competitive pricing
Nscale - Official HF provider with high-performance GPU clusters
Inference Endpoints - Dedicated, fully managed infrastructure
Google Cloud - Vertex AI and other deployment options
Microsoft Azure - Azure Machine Learning services
Replicate - Easy-to-use model deployment platform
Banana - Serverless GPU inference platform
Modal - Serverless compute platform
RunPod - GPU cloud computing
Lambda Labs - GPU cloud infrastructure
Contributing
Fork the repository
Create a feature branch
Make your changes
Test with both the agent and inspector
Submit a pull request
License
[Add your license information here]
Support
For issues and questions:
Check the troubleshooting section
Use the MCP inspector to debug
Open an issue on GitHub
Check the MCP documentation
**Happy coding!**