MCP Multi-Service Agent

A collection of Model Context Protocol (MCP) servers providing various AI-powered services, designed to work with Hugging Face's tiny-agents framework.

Features

Core Services

  • Weather Service: Get weather information for any location

  • Web Search: Search the web for information (with special support for Hugging Face inference providers)

  • Sentiment Analysis: Analyze text sentiment with polarity and subjectivity scores

Integration Features

  • AI Agent Integration: Works seamlessly with tiny-agents for conversational AI

  • MCP Inspector Support: Debug and inspect server capabilities

  • Gradio Web Interface: Interactive web UI for sentiment analysis

Prerequisites

  • Python 3.10 or higher

  • uv package manager

  • Node.js (for MCP inspector)

  • Hugging Face account (for tiny-agents)

Installation

  1. Clone or download this project

    git clone https://github.com/Deon62/mcp.git
    cd mcp
  2. Install Python dependencies

    uv pip install "mcp[cli]" requests
  3. Install tiny-agents (if not already installed)

    pip install tiny-agents
  4. Install sentiment analysis dependencies

    cd mcp-sentiment
    python -m venv venv
    venv\Scripts\activate  # On Windows
    # source venv/bin/activate  # On Linux/Mac
    pip install -r requirements.txt

Quick Start

1. Run the MCP Server

Start the MCP server in one terminal:

uv run --with mcp mcp run server.py

The server will start and wait for connections.

2. Run the AI Agent

In another terminal, start the agent:

tiny-agents run agent.json

You should see:

Agent loaded with 3 tools:
 • get_weather
 • web_search
 • get_hf_inference_providers
»

3. Run Sentiment Analysis Web App

For the sentiment analysis service, run the Gradio web interface:

cd mcp-sentiment
venv\Scripts\activate  # On Windows
# source venv/bin/activate  # On Linux/Mac
python app.py

The web interface will be available at http://localhost:7860

4. Chat with the Agent

Once the agent is running, you can interact with it:

» Hello! Can you help me find information about Hugging Face inference providers?

Available Tools

1. Weather Service

» What's the weather like in New York?

2. Web Search

» Search for "Hugging Face inference providers"

3. HF Inference Providers

» Get me the list of Hugging Face inference providers

4. Sentiment Analysis

» Analyze the sentiment of this text: "I love this new product!"

Web Interface Features:

  • Polarity Score: -1 (negative) to +1 (positive)

  • Subjectivity Score: 0 (objective) to 1 (subjective)

  • Assessment: Positive, Negative, or Neutral classification

  • Real-time Analysis: Instant sentiment analysis as you type
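The score ranges above can be turned into the Positive/Negative/Neutral assessment with a small helper. A minimal sketch, assuming a symmetric neutral band of ±0.1 around zero (an illustrative choice, not the app's documented cutoff):

```python
def classify_sentiment(polarity: float) -> str:
    """Map a polarity score in [-1.0, +1.0] to an assessment label.

    The +/-0.1 neutral band is an assumption for illustration.
    """
    if polarity > 0.1:
        return "Positive"
    if polarity < -0.1:
        return "Negative"
    return "Neutral"

print(classify_sentiment(0.8))   # Positive
print(classify_sentiment(-0.5))  # Negative
print(classify_sentiment(0.05))  # Neutral
```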

Configuration

Agent Configuration (agent.json)

{
    "model": "Qwen/Qwen2.5-72B-Instruct",
    "provider": "nebius",
    "servers": [
        {
            "type": "stdio",
            "command": "uv",
            "args": ["run", "--with", "mcp", "mcp", "run", "server.py"]
        }
    ]
}
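A quick way to catch the "KeyError: 'command'" failure described under Troubleshooting is to validate agent.json before launching the agent. This standalone check is a sketch: the required-key list is an assumption inferred from the sample config above, not a published schema.

```python
import json

# Keys assumed required for a stdio server entry, based on the sample config.
REQUIRED_STDIO_KEYS = {"type", "command", "args"}

def check_agent_config(text: str) -> list[str]:
    """Return a list of problems found in an agent.json document."""
    problems = []
    config = json.loads(text)
    for i, server in enumerate(config.get("servers", [])):
        missing = REQUIRED_STDIO_KEYS - server.keys()
        if missing:
            problems.append(f"servers[{i}] missing keys: {sorted(missing)}")
    return problems

# A broken config: the server entry has no "command" or "args".
sample = '{"model": "Qwen/Qwen2.5-72B-Instruct", "servers": [{"type": "stdio"}]}'
print(check_agent_config(sample))
```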

Server Configuration (server.py)

The server provides three main tools:

  • get_weather(location) - Returns weather information

  • web_search(query) - Performs web searches

  • get_hf_inference_providers() - Returns comprehensive list of HF inference providers
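As a rough sketch of those signatures, the tools can be pictured as plain Python functions; the bodies here are illustrative stubs (the real server would call weather and search backends), and in server.py each would carry an @mcp.tool() decorator:

```python
def get_weather(location: str) -> str:
    """Stub: the real tool queries a weather backend."""
    return f"Weather for {location}: (fetched from a weather API)"

def web_search(query: str) -> str:
    """Stub: the real tool performs an actual web search."""
    return f"Search results for: {query}"

def get_hf_inference_providers() -> list[str]:
    """Stub: the real tool returns the full provider list documented in this README."""
    return ["Amazon SageMaker", "Together AI", "Inference Endpoints"]

print(get_weather("Tokyo"))
```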

MCP Inspector Setup

The MCP Inspector allows you to debug and test your MCP server directly.

1. Install MCP Inspector

npm install -g @modelcontextprotocol/inspector

2. Run the Inspector

mcp-inspector

3. Connect to Your Server

In the inspector:

  1. Click "Add Server"

  2. Choose "stdio" transport

  3. Set command: uv

  4. Set args: ["run", "--with", "mcp", "mcp", "run", "server.py"]

  5. Click "Connect"

4. Test Tools

Once connected, you can:

  • View available tools in the sidebar

  • Test each tool with different parameters

  • See the JSON-RPC communication

  • Debug any issues
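MCP uses JSON-RPC 2.0 over the stdio transport, so the traffic the inspector displays looks roughly like the request below. This is a sketch: the id and the exact argument payload are illustrative, though `tools/call` is the standard MCP method for invoking a tool.

```python
import json

# An illustrative tools/call request, as the inspector would display it.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"location": "New York"},
    },
}
print(json.dumps(request, indent=2))
```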

Example Usage

Weather Queries

» What's the weather in Tokyo?
» Get weather for London
» How's the weather in San Francisco?

Web Search Queries

» Search for "latest AI developments"
» Find information about "MCP protocol"
» Look up "Hugging Face inference providers"

Specific HF Provider Queries

» Show me all Hugging Face inference providers
» What inference providers does HF support?
» List the available HF deployment options

Sentiment Analysis Examples

» Analyze the sentiment of: "This product is amazing and I love it!"
» What's the sentiment of: "I'm not sure about this decision"
» Check the sentiment of: "The weather is okay today"

Example Output:

{
  "polarity": 0.8,
  "subjectivity": 0.9,
  "assessment": "positive"
}

Troubleshooting

Common Issues

  1. "ModuleNotFoundError: No module named 'mcp'"

    uv pip install "mcp[cli]"
  2. "KeyError: 'command'"

    • Check your agent.json configuration

    • Ensure the server configuration is correct

  3. "Connection closed" errors

    • Make sure the MCP server is running

    • Check that all dependencies are installed

  4. Agent shows "0 tools"

    • Verify the server is running

    • Check the agent.json configuration

    • Ensure the server command is correct

  5. Sentiment Analysis Import Errors

    • Ensure virtual environment is activated

    • Install NLTK data: python -c "import nltk; nltk.download('punkt'); nltk.download('brown')"

    • Check TextBlob installation: python -c "from textblob import TextBlob; print('OK')"

  6. Gradio Interface Issues

    • Update Gradio: pip install --upgrade gradio

    • Check for port conflicts (default: 7860)

    • Verify MCP server parameter compatibility

Debug Steps

  1. Test the server directly:

    python server.py
  2. Check MCP server with inspector:

    mcp-inspector
  3. Verify dependencies:

    uv pip list | grep mcp
  4. Test sentiment analysis:

    cd mcp-sentiment
    venv\Scripts\activate
    python -c "from textblob import TextBlob; print(TextBlob('Hello world').sentiment)"

Project Structure

mcp/
├── server.py              # Main MCP server implementation
├── agent.json             # Agent configuration
├── requirements.txt       # Python dependencies
├── uv.lock               # Dependency lock file
├── app.py                # Main application entry point
├── mcp-sentiment/        # Sentiment analysis service
│   ├── app.py            # Gradio web interface for sentiment analysis
│   ├── requirements.txt  # Sentiment analysis dependencies
│   └── venv/             # Virtual environment for sentiment analysis
└── README.md             # This file

Development

Adding New Tools

To add a new tool to the server:

# In server.py, where the FastMCP instance `mcp` is already defined:
@mcp.tool()
def your_new_tool(param: str) -> str:
    """Description of what this tool does (the agent sees this docstring)"""
    return f"Result for {param}"
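Tool functions stay plain Python functions underneath the decorator, so they can be unit-tested without starting the server. For example, the same function body tested directly:

```python
# The example tool's body, without the MCP decorator, called directly.
def your_new_tool(param: str) -> str:
    """Description of what this tool does"""
    return f"Result for {param}"

print(your_new_tool("Tokyo"))  # → Result for Tokyo
```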

Modifying Agent Configuration

Edit agent.json to:

  • Change the AI model

  • Add more MCP servers

  • Modify server configurations

Hugging Face Inference Providers

The server includes comprehensive information about HF inference providers:

  1. Amazon SageMaker - Serverless inference with custom Inferentia2 chips

  2. Novita AI - Integrated serverless inference directly on model pages

  3. Together AI - Serverless inference with competitive pricing

  4. Nscale - Official HF provider with high-performance GPU clusters

  5. Inference Endpoints - Dedicated, fully managed infrastructure

  6. Google Cloud - Vertex AI and other deployment options

  7. Microsoft Azure - Azure Machine Learning services

  8. Replicate - Easy-to-use model deployment platform

  9. Banana - Serverless GPU inference platform

  10. Modal - Serverless compute platform

  11. RunPod - GPU cloud computing

  12. Lambda Labs - GPU cloud infrastructure

Contributing

  1. Fork the repository

  2. Create a feature branch

  3. Make your changes

  4. Test with both the agent and inspector

  5. Submit a pull request

License

[Add your license information here]

Support

For issues and questions:

  1. Check the troubleshooting section

  2. Use the MCP inspector to debug

  3. Open an issue on GitHub

  4. Check the MCP documentation


**Happy coding!**
