# OpenBanking MCP Server
An MCP (Model Context Protocol) server that integrates with local Ollama LLMs for OpenBanking applications.
## Features
- Multiple LLM Agents: Support for 4 different specialized agents
- Ollama Integration: Local LLM model support
- Tool System: Comprehensive toolset for various operations
- OpenBanking Focus: Specialized tools for banking and financial data
## Architecture
This MCP server follows a layered architecture:
- MCP Host communicates with MCP Server
- 4 Specialized Agents (Agent 1-4) with LLM/Model capabilities
- Tool system for specific operations
- Banking Services/API integration
## Agents

- Agent 1 - Market Analyst: Analyzes market data and volatile market conditions
- Agent 2 - Portfolio Manager: Manages portfolios and finds strategies
- Agent 3 - Risk Analyst: Performs user-specific risk analysis
- Agent 4 - Explainability Agent: Provides LLM-driven explainability and SWOT analysis
## Project Structure
## Prerequisites
- Python 3.8+
- Ollama installed and running locally
- A conda environment named `openbanking-backend`
## Setup
1. Create Conda Environment (if it does not exist)
2. Install Dependencies
3. Install and Start Ollama: download and install Ollama from https://ollama.ai
4. Configure the Server: edit `config/config.yaml` to customize:
   - Ollama connection settings
   - Agent configurations
   - Model assignments
   - Tool settings
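Steps 1-3 might look like this in a shell. The Python version, requirements file, and model names below are assumptions (the summary mentions Llama 3.2 and Gemma3); adjust to what the repo actually ships:

```shell
# Assumed environment name from the prerequisites
ENV_NAME=openbanking-backend

# 1. Create the conda environment (if it does not exist)
conda create -n "$ENV_NAME" python=3.10 -y

# 2. Install dependencies inside it (requirements.txt path assumed)
conda activate "$ENV_NAME"
pip install -r requirements.txt

# 3. Pull the models the agents are configured to use (names assumed)
ollama pull llama3.2
ollama pull gemma3
```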
## Running the Server

- Option 1: Using PowerShell Script (recommended for Windows)
- Option 2: Using Python Startup Script
- Option 3: Direct Execution
- Option 4: Development Mode
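The concrete commands for the four options are not shown here; the entry-point names below are hypothetical placeholders, so check the repo for the real script names:

```shell
# Option 1: PowerShell script (Windows) -- name hypothetical
#   .\start_server.ps1

# Option 2: Python startup script -- name hypothetical
#   python start_server.py

# Option 3: direct execution of the server module -- path hypothetical
#   python src/server.py

# Option 4: development mode (e.g. auto-reload), if the repo provides it
#   python src/server.py --dev
```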
## Tools Available

### Portfolio Analysis

- `analyze_portfolio`: Comprehensive portfolio analysis
- `portfolio_optimization`: Portfolio allocation optimization

### Market Analysis

- `market_analysis`: Current market conditions analysis
- `volatility_analysis`: Market volatility assessment

### Risk Assessment

- `assess_risk`: User-specific risk assessment
- `risk_simulation`: Scenario-based risk simulation

### Strategy Recommendations

- `recommend_strategy`: Investment strategy recommendations

### Analysis & Explainability

- `swot_analysis`: SWOT analysis for any subject
- `explain_concept`: Explain financial concepts simply
- `reverse_simulation`: Reverse engineering analysis
## Usage Examples

- Run Demo
- Test the Server
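MCP clients invoke the tools listed above via JSON-RPC 2.0 `tools/call` requests. A minimal sketch of such a request for `analyze_portfolio` follows; the argument names are hypothetical, since the real schema comes from the server's `tools/list` response:

```python
import json


def make_tools_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })


# Hypothetical arguments -- consult the server's tool schema for real ones.
request = make_tools_call(1, "analyze_portfolio", {"holdings": {"AAPL": 10, "BND": 25}})
print(request)
```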
## Configuration

The server uses `config/config.yaml` for configuration. Key sections cover the Ollama connection, agent and model assignments, and tool settings.
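The exact keys depend on the repo, but a config covering those sections might be shaped like this (all key names and values here are illustrative, not the file's actual schema):

```yaml
ollama:
  host: http://localhost:11434
  timeout: 120

agents:
  market_analyst:
    model: llama3.2
  portfolio_manager:
    model: llama3.2
  risk_analyst:
    model: gemma3
  explainability_agent:
    model: gemma3

tools:
  enabled: true
```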
## Troubleshooting

### Common Issues
- Ollama Connection Failed
  - Ensure Ollama is installed and running (`ollama serve`)
  - Check that the default port 11434 is available
  - Verify the models are pulled (`ollama list`)
- Conda Environment Issues
  - Make sure the `openbanking-backend` environment exists
  - Activate the environment before running the server
  - Install dependencies in the correct environment
- Import Errors
  - Ensure all dependencies are installed
  - Check the Python path and working directory
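For the connection issue, Ollama answers `GET /api/tags` on its default port, so a small standard-library probe can confirm the server is reachable before starting the MCP server:

```python
import json
import urllib.error
import urllib.request


def ollama_is_up(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers GET /api/tags at base_url."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            # A healthy server returns a JSON object with a "models" list.
            return "models" in json.load(resp)
    except (urllib.error.URLError, OSError, ValueError):
        return False


if __name__ == "__main__":
    print("Ollama reachable:", ollama_is_up())
```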
### Logs
Check the console output for detailed error messages and debugging information.
## Development

### Adding New Tools

1. Create the tool class in `src/tools/`
2. Register the tool in `tool_registry.py`
3. Update the configuration if needed
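The repo's actual tool base class is not shown here, so the skeleton below assumes a minimal convention: a tool exposes a `name`, a `description`, and an `execute` method that takes and returns a dict. Adapt it to whatever interface `src/tools/` really uses:

```python
# Hypothetical tool skeleton for src/tools/ -- adapt to the repo's base class.
class SentimentTool:
    name = "market_sentiment"
    description = "Classify market sentiment for a given ticker (illustrative)."

    def execute(self, arguments: dict) -> dict:
        ticker = arguments.get("ticker", "UNKNOWN")
        # Real logic would delegate to an Ollama-backed agent; stubbed here.
        return {"ticker": ticker, "sentiment": "neutral"}


# Registration would then happen in tool_registry.py, e.g.:
#   registry.register(SentimentTool())
tool = SentimentTool()
print(tool.execute({"ticker": "AAPL"}))
```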
### Adding New Agents

1. Add the agent configuration in `config/config.yaml`
2. Implement custom agent logic if needed
3. Test with the demo scripts
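Assuming `config/config.yaml` holds one entry per agent, a fifth agent might be declared like this (all keys and values illustrative):

```yaml
agents:
  compliance_agent:
    model: llama3.2
    description: Checks recommendations against compliance rules
    enabled: true
```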
## License
MIT License