Orchestrates autonomous multi-agent workflows by integrating MCP tools for market research and data analysis. Provides a news intelligence tool that retrieves global news data using the Google Search API via Serper.

Quick start on a hosted MCP platform:

1. Click on "Install Server".
2. Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
3. In the chat, type `@` followed by the MCP server name and your instructions, e.g., `@MCP + CrewAI Agentic Integration Research latest AI trends and save a summary to my local notes`.

That's it! The server will respond to your query, and you can continue using it as needed.
🤖 MCP + CrewAI Agentic Integration 🚀
A powerful demonstration of Model Context Protocol (MCP) integrated with CrewAI orchestrations, featuring full observability through AgentOps and high-speed inference via Groq.
🌟 Overview
This project bridges the gap between context-aware tools and autonomous agents. It provides a custom MCP server for real-time external data (Weather, News, Notes) while leveraging CrewAI to orchestrate multi-agent workflows.
🏗️ Architecture
MCP Layer: A FastMCP server exposing tools for real-time data retrieval.
Agentic Layer: CrewAI agents specialized in Market Analysis and Research.
Inference Layer: Ultra-fast LLMs (Llama 3.1) hosted on Groq.
Observability Layer: AgentOps for tracing, cost management, and debugging.
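To make the MCP layer concrete, here is a stdlib-only sketch of the core idea behind an MCP tool server: a named registry of functions that clients invoke by name. The real project uses FastMCP (which also handles the JSON-RPC transport); the tool name and return value below are illustrative, not the project's actual ones.

```python
from typing import Callable

# Stand-in for a FastMCP server's tool registry.
tools: dict[str, Callable] = {}

def tool(fn: Callable) -> Callable:
    """Stand-in for FastMCP's tool decorator: register fn under its name."""
    tools[fn.__name__] = fn
    return fn

@tool
def get_weather(city: str) -> str:
    # The real tool calls WeatherAPI; here we return a fixed string.
    return f"weather for {city}"

# An MCP client dispatches calls by tool name:
print(tools["get_weather"]("Paris"))  # → weather for Paris
```

In the actual server, FastMCP generates the tool schema from the function signature and docstring, so agents can discover and call tools without any manual wiring.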
✨ Key Features
🛠️ Custom MCP Server Tools
☀️ Weather Engine: Real-time meteorology data via WeatherAPI.
📰 News Intelligence: Global news retrieval via Serper (Google Search API).
📝 Contextual Notes: Locally persistent note management for long-term memory.
📄 Auto-Summary: Intelligent summarization of collected context.
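The contextual-notes tool above can be sketched with plain JSON-on-disk persistence. The file name (`notes.json`) and the list-of-strings schema are assumptions for illustration; the project's actual storage format may differ.

```python
import json
from pathlib import Path

NOTES_FILE = Path("notes.json")  # assumed file name and location

def read_notes() -> list[str]:
    """Return all saved notes, or an empty list if none exist yet."""
    if NOTES_FILE.exists():
        return json.loads(NOTES_FILE.read_text())
    return []

def save_note(text: str) -> None:
    """Append a note and persist the whole list back to disk."""
    notes = read_notes()
    notes.append(text)
    NOTES_FILE.write_text(json.dumps(notes, indent=2))
```

Because the notes live on disk rather than in the conversation, they survive across runs and give the agents a simple form of long-term memory.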
👥 Intelligence Crew
🔍 Market Researcher: Scours data to identify emerging trends.
📊 Data Analyst: Synthesizes research into actionable market insights.
🔄 Sequential Workflow: Fully orchestrated execution path for reliable results.
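The sequential workflow can be illustrated without CrewAI itself: the analyst only runs after the researcher finishes, and receives the researcher's full output as context. The function names and canned findings below are stand-ins, not CrewAI APIs.

```python
def market_researcher(topic: str) -> list[str]:
    """First agent: gather raw findings (the real agent queries the news tool)."""
    return [f"{topic} adoption is accelerating", f"{topic} tooling is maturing"]

def data_analyst(findings: list[str]) -> str:
    """Second agent: synthesize findings into a single insight."""
    return "Insights: " + "; ".join(findings)

def run_sequential(topic: str) -> str:
    # Sequential process: each task's output feeds the next task's input.
    return data_analyst(market_researcher(topic))

print(run_sequential("AI agents"))
```

In the real project, CrewAI manages this hand-off: a crew configured with a sequential process executes tasks in order and passes each task's output forward.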
🛠️ Tech Stack
Framework: CrewAI
Server: FastMCP
LLM Engine: Groq (Llama 3.1 8B/70B)
Tracing: AgentOps
Package Manager: uv
🚀 Getting Started
1. Prerequisites
Ensure you have the following installed:
uv (recommended) or a standalone Python 3.13+ installation
A valid Groq API Key
A valid AgentOps API Key
A Serper API Key (for News)
2. Installation
Clone the repository and sync dependencies:
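For example, with uv (the repository URL and directory name below are placeholders, not the actual values):

```shell
git clone <repo-url>            # replace <repo-url> with this repository's URL
cd mcp-crewai-integration       # directory name is an assumption
uv sync                         # resolve and install dependencies
```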
3. Configuration
Create a .env file in the root directory:
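Based on the services listed in the prerequisites, the file should contain keys along these lines (the exact variable names are assumptions; check the repository for the names it actually reads):

```env
GROQ_API_KEY=your_groq_api_key
AGENTOPS_API_KEY=your_agentops_api_key
SERPER_API_KEY=your_serper_api_key
WEATHER_API_KEY=your_weatherapi_key
```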
4. Running the Project
🌐 Start the MCP Server
🟢 Run the CrewAI Integration
🧪 Run Diagnostics
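With uv, the three steps above might look like this (the script names are assumptions; check the repository for the actual entry points):

```shell
uv run server.py        # start the MCP server
uv run main.py          # run the CrewAI crew against the server
uv run diagnostics.py   # run diagnostics
```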
📊 Observability with AgentOps
This project is fully instrumented. Every run generates a unique replay URL, allowing you to:
Watch Agent Self-Correction: See exactly how agents reason through tasks.
Trace LLM Calls: Monitor every prompt and completion.
Analyze Latency: Visualize the execution timeline of your crew.
Check your dashboard at: app.agentops.ai
📁 Project Structure
🤝 Contributing
Contributions are what make the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
Fork the Project
Create your Feature Branch (git checkout -b feature/AmazingFeature)
Commit your Changes (git commit -m 'Add some AmazingFeature')
Push to the Branch (git push origin feature/AmazingFeature)
Open a Pull Request
🛡️ License
Distributed under the MIT License. See LICENSE for more information.
Developed with ❤️ for the AI Community.