Fetches real-time news articles from RapidAPI, providing the data source for the agent's AI-powered sentiment analysis and summarization tools.

1. Click "Install Server".
2. Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
3. In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@News Analysis Agent MCP Server Analyze the sentiment of recent news about climate change".

That's it! The server will respond to your query, and you can continue using it as needed. Here is a step-by-step guide with screenshots.
# mcp-news-analysis-agent

A comprehensive Model Context Protocol (MCP) implementation for intelligent news analysis, featuring advanced LLM-powered sentiment analysis, AI summarization, and natural language query processing using Mistral AI.
## LLM-Powered Architecture
This project leverages Mistral AI for advanced natural language processing capabilities:
- **Sentiment Analysis:** Uses Mistral's LLM for nuanced sentiment understanding with confidence scoring
- **Text Summarization:** AI-powered content summarization with customizable length and style
- **Intent Detection:** Smart query interpretation for natural language interaction
- **Structured Outputs:** JSON-formatted responses with detailed reasoning and metadata
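As an illustration, a detailed sentiment response might look like the following. The field names are a sketch based on the response description later in this README, not the exact schema:

```json
{
  "sentiment": "mixed",
  "confidence": 0.80,
  "reasoning": "The text expresses both excitement and concern.",
  "emotions": ["excitement", "concern"]
}
```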
## Features
- **News Fetching:** Retrieve real-time news articles from RapidAPI
- **Sentiment Analysis:** Advanced sentiment analysis using Mistral AI with confidence scoring and detailed reasoning
- **Text Summarization:** AI-powered summarization using Mistral AI
- **Intelligent Agent:** Natural language query processing with enhanced intent detection
- **MCP Architecture:** Fully compliant with Model Context Protocol standards
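MCP messages are JSON-RPC 2.0 under the hood. A minimal sketch of what a `tools/call` request for the news tool might look like on the wire (the argument names here are illustrative):

```python
import json

def build_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

msg = build_tool_call(1, "fetch_news", {"topic": "technology", "limit": 5})
print(msg)
```

The MCP SDK builds and routes these messages for you; the sketch only shows the envelope the protocol standardizes on.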
## Project Structure

```
MCPDemo/
├── server/
│   └── mcp_server.py       # MCP server implementation
├── client/
│   └── mcp_client.py       # Enhanced MCP client with intelligent intent detection and quote parsing
├── tools/
│   ├── news_tool.py        # News fetching from RapidAPI
│   ├── sentiment_tool.py   # Sentiment analysis tools
│   ├── summary_tool.py     # Text summarization tools
│   └── __init__.py
├── config/
│   ├── settings.py         # Configuration management
│   ├── .env                # Environment variables
│   └── __init__.py
├── requirements.txt        # Python dependencies
└── README.md               # This file
```

## Setup Instructions
### 1. Environment Setup

First, create a Python virtual environment:

```shell
# Navigate to the project directory
cd "C:\Users\mayssen\Desktop\mcp project\MCPDemo"

# Create a virtual environment
python -m venv .venv

# Activate the virtual environment
.\.venv\Scripts\Activate     # Windows
# source .venv/bin/activate  # macOS/Linux

# Install dependencies
pip install -r requirements.txt
```

### 2. API Keys Configuration
Edit the config/.env file and add your API keys:

```
# News API key from RapidAPI (already provided)
RAPIDAPI_KEY=6d35e9aa82msh4c8550ffb3e08b4p15bf78jsna3f5a47eeb4d
RAPIDAPI_HOST=real-time-news-data.p.rapidapi.com

# Get your Mistral AI API key from https://console.mistral.ai/
MISTRAL_API_KEY=your_mistral_api_key_here

# Optional: Adjust other settings as needed
MAX_NEWS_ARTICLES=10
MAX_SUMMARY_LENGTH=500
LOG_LEVEL=INFO
```

### 3. Install Additional Dependencies
Some packages might need special installation:

```shell
# Ensure the Mistral AI client is properly installed
pip install mistralai

# If you encounter any import errors, install packages individually:
pip install httpx langchain-mistralai fastmcp
```

## Usage
### Running the MCP Server

```shell
# Make sure you're in the project directory with the virtual environment activated
python server/mcp_server.py
```

### Running the Interactive Agent

In a separate terminal:

```shell
# Activate the same virtual environment
.\.venv\Scripts\Activate

# Run the client
python client/mcp_client.py
```

### Example Queries
Once the agent is running, try these natural language queries:

- "Get latest news about technology"
- "Analyze sentiment of recent news about climate change"
- "Summarize news about the economy"
- "Show me top 5 news from UK"
- "How do people feel about the latest political news?"
- "Get French news about sports and analyze sentiment"
- "This new AI technology is amazing but also quite expensive" (direct text analysis)

## Available Tools
### 1. News Tool

- **Function:** `fetch_news`
- **Purpose:** Fetches news articles from RapidAPI
- **Parameters:** `topic`, `country`, `language`, `limit`
- **Example:** Retrieve tech news from the US in English
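A sketch of how the request for this tool could be assembled. The `X-RapidAPI-Key`/`X-RapidAPI-Host` headers are the standard RapidAPI convention, but the endpoint path and query parameter names below are assumptions — check the real-time-news-data listing on RapidAPI for the actual contract:

```python
import os

def build_news_request(topic: str, country: str = "US",
                       language: str = "en", limit: int = 10):
    """Assemble URL, headers, and params for a RapidAPI news call."""
    host = os.environ.get("RAPIDAPI_HOST", "real-time-news-data.p.rapidapi.com")
    url = f"https://{host}/search"  # assumed endpoint path
    headers = {
        "X-RapidAPI-Key": os.environ.get("RAPIDAPI_KEY", ""),
        "X-RapidAPI-Host": host,
    }
    # Parameter names are illustrative, not the verified API schema
    params = {"query": topic, "country": country, "lang": language, "limit": limit}
    return url, headers, params

url, headers, params = build_news_request("technology", limit=5)
# The actual call would then be e.g. httpx.get(url, headers=headers, params=params)
```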
### 2. Sentiment Analysis Tool

- **Function:** `analyze_sentiment`
- **Purpose:** Analyzes sentiment using the Mistral AI LLM with confidence scoring and detailed reasoning
- **Parameters:** `text`, `analysis_type` (simple/detailed)
- **Features:**
  - Structured JSON responses with confidence scores
  - Detailed reasoning and emotion detection
  - Support for complex, nuanced sentiment analysis
  - Direct analysis of quoted text
- **Example:** Determine sentiment with confidence: "Mixed sentiment (0.80 confidence) - expresses both excitement and concern"
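Since the LLM is asked to reply in JSON, the tool has to parse its output defensively — models sometimes wrap the JSON in prose or code fences. A minimal parsing sketch, with field names assumed from the structured-response description in this README:

```python
import json

def parse_sentiment_response(raw: str) -> dict:
    """Extract the JSON object from an LLM reply, tolerating surrounding prose."""
    start, end = raw.find("{"), raw.rfind("}")
    if start == -1 or end == -1:
        # No JSON found: fall back to a neutral placeholder
        return {"sentiment": "unknown", "confidence": 0.0, "reasoning": raw.strip()}
    data = json.loads(raw[start:end + 1])
    # Clamp confidence into the documented 0.0-1.0 range
    data["confidence"] = max(0.0, min(1.0, float(data.get("confidence", 0.0))))
    return data

reply = 'Here is the analysis:\n{"sentiment": "mixed", "confidence": 0.8, "reasoning": "excitement and concern"}'
print(parse_sentiment_response(reply)["sentiment"])  # mixed
```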
### 3. Summary Tool (Requires Mistral AI)

- **Function:** `summarize_text`
- **Purpose:** Summarizes text using Mistral AI
- **Parameters:** `text`, `max_length`, `summary_type`
- **Example:** Create brief summaries of long articles
### 4. Combined Workflows

- **Function:** `analyze_news_sentiment`
- **Purpose:** Fetches news and analyzes sentiment in one step
- **Parameters:** `topic`, `country`, `language`, `limit`
- **Example:** Get tech news and determine public sentiment
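The combined workflow is essentially fetch-then-score. A self-contained orchestration sketch with stub coroutines standing in for the real RapidAPI and Mistral calls (the stubs and return shapes are illustrative, not the project's actual implementation):

```python
import asyncio

# Stubs standing in for the real tool calls
async def fetch_news(topic: str, limit: int = 3) -> list[dict]:
    return [{"title": f"{topic} headline {i}", "body": "..."} for i in range(limit)]

async def analyze_sentiment(text: str) -> dict:
    return {"sentiment": "neutral", "confidence": 0.5}

async def analyze_news_sentiment(topic: str, limit: int = 3) -> list[dict]:
    """Fetch articles, then score each one's sentiment concurrently."""
    articles = await fetch_news(topic, limit)
    scores = await asyncio.gather(*(analyze_sentiment(a["title"]) for a in articles))
    return [{"title": a["title"], **s} for a, s in zip(articles, scores)]

scored = asyncio.run(analyze_news_sentiment("technology", limit=2))
print(len(scored))  # 2
```

`asyncio.gather` lets the per-article sentiment calls run concurrently rather than one at a time, which matters once each call is a real network round-trip.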
## Integration with Claude Desktop

To use this server with Claude Desktop, add this to your claude_desktop_config.json:

```json
{
  "mcpServers": {
    "news-analysis": {
      "command": "python",
      "args": ["C:/Users/mayssen/Desktop/mcp project/MCPDemo/server/mcp_server.py"],
      "env": {
        "PYTHONPATH": "C:/Users/mayssen/Desktop/mcp project/MCPDemo"
      }
    }
  }
}
```

## Advanced Features
### Intelligent Text Detection

The client automatically detects quoted text in user queries and analyzes it directly:

- **Input:** "This product is amazing but expensive"
- **Result:** Direct sentiment analysis of the quoted text
### Structured LLM Responses

All LLM operations return structured JSON with:

- **Classification:** Primary sentiment/summary category
- **Confidence:** Numerical confidence score (0.0-1.0)
- **Reasoning:** Detailed explanation of the analysis
- **Emotions:** Additional emotional context (for detailed analysis)
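Because LLM output is not guaranteed to honor the schema, it is worth validating each parsed response against these fields before trusting it. A plain-Python sketch (the field names follow the list above; a real implementation might use pydantic instead):

```python
def validate_llm_response(data: dict) -> list[str]:
    """Return a list of schema problems; an empty list means the response is valid."""
    problems = []
    if not isinstance(data.get("classification"), str):
        problems.append("classification must be a string")
    conf = data.get("confidence")
    if not isinstance(conf, (int, float)) or not 0.0 <= conf <= 1.0:
        problems.append("confidence must be a number in [0.0, 1.0]")
    if not isinstance(data.get("reasoning"), str):
        problems.append("reasoning must be a string")
    # "emotions" is optional (detailed analysis only), so it is not required here
    return problems

good = {"classification": "positive", "confidence": 0.9, "reasoning": "upbeat tone"}
print(validate_llm_response(good))  # []
```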
## Troubleshooting

### Common Issues

- **Import Errors:** Make sure all dependencies are installed and the virtual environment is activated
- **Mistral API Key Errors:** Verify your Mistral AI API key is correctly set in `.env`
- **RapidAPI Errors:** Check whether the provided RapidAPI key is still valid
- **MCP Connection Issues:** Ensure both server and client are using the same transport method
- **LLM Response Issues:** Verify Mistral AI API connectivity and sufficient API credits
### Checking Logs

The system uses Python logging. Check the console output for detailed error messages. You can adjust the log level in .env:

```
LOG_LEVEL=DEBUG  # For more detailed logs
```

### Testing Individual Components
Test each tool separately:

```python
# Test LLM-powered sentiment analysis
import asyncio

from tools.sentiment_tool import SentimentTool

async def test_sentiment():
    tool = SentimentTool()
    result = await tool.analyze_sentiment(
        "This new AI technology is amazing but also quite expensive",
        "detailed"
    )
    print(result)

asyncio.run(test_sentiment())
```

## Dependencies
### Core MCP Dependencies

- `mcp>=1.2.0` - Model Context Protocol implementation
- `httpx>=0.25.0` - HTTP client for API requests
- `python-dotenv>=1.0.0` - Environment variable management

### AI/ML Dependencies

- `mistralai>=1.0.0` - Mistral AI client for both summarization and sentiment analysis
- `langchain-mistralai>=0.1.0` - LangChain integration for enhanced LLM capabilities
- `fastmcp>=2.11.0` - FastMCP framework for efficient MCP implementation

### Utility Dependencies

- `requests>=2.31.0` - HTTP requests
- `pydantic>=2.0.0` - Data validation
- `rich>=13.0.0` - Pretty terminal output
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Test thoroughly
5. Submit a pull request
## License

This project is open source. Feel free to modify and distribute according to your needs.
## Support

If you encounter issues:

1. Check the troubleshooting section
2. Review logs for error messages
3. Verify API keys and configuration
4. Test individual components
For additional help, review the Model Context Protocol documentation at https://modelcontextprotocol.io.