OpenBanking MCP Server

by NBWolfer

An MCP (Model Context Protocol) server that integrates with local Ollama LLMs for OpenBanking applications.

Features

  • Multiple LLM Agents: Support for four specialized agents
  • Ollama Integration: Local LLM model support
  • Tool System: Comprehensive toolset for various operations
  • OpenBanking Focus: Specialized tools for banking and financial data

Architecture

This MCP server follows the architecture shown in the diagram:

  • MCP Host communicates with MCP Server
  • 4 Specialized Agents (Agent 1-4) with LLM/Model capabilities
  • Tool system for specific operations
  • Banking Services/API integration

Agents

  1. Agent 1 - Market Analyst: Market data analysis and detection of volatile market conditions
  2. Agent 2 - Portfolio Manager: Portfolio management and strategy discovery
  3. Agent 3 - Risk Analyst: User-specific risk analysis
  4. Agent 4 - Explainability Agent: Explainability and SWOT analysis

Project Structure

mcpOpenbankingMCPServer/
├── src/
│   ├── __init__.py
│   ├── main.py                  # Main server entry point
│   ├── agents/
│   │   ├── __init__.py
│   │   └── agent_manager.py     # Agent management system
│   ├── tools/
│   │   ├── __init__.py
│   │   ├── tool_registry.py     # Tool registration and management
│   │   ├── portfolio_tools.py   # Portfolio analysis tools
│   │   ├── market_tools.py      # Market data tools
│   │   ├── risk_tools.py        # Risk assessment tools
│   │   ├── strategy_tools.py    # Strategy recommendation tools
│   │   └── analysis_tools.py    # SWOT and explanation tools
│   ├── config/
│   │   ├── __init__.py
│   │   └── config.py            # Configuration management
│   └── utils/
│       ├── __init__.py
│       └── utils.py             # Utility functions
├── config/
│   └── config.yaml              # Server configuration
├── requirements.txt             # Python dependencies
├── startup.py                   # Python startup script
├── start_server.ps1             # PowerShell startup script
├── start_server.bat             # Batch file for Windows
├── demo.py                      # Demo examples
├── test_server.py               # Test script
└── README.md                    # This file

Prerequisites

  1. Python 3.8+ with conda environment
  2. Ollama installed and running locally
  3. Conda environment named openbanking-backend

Setup

1. Create the Conda Environment (if it does not already exist)

conda create -n openbanking-backend python=3.11
conda activate openbanking-backend

2. Install Dependencies

pip install -r requirements.txt

3. Install and Start Ollama

Download and install Ollama from https://ollama.ai

# Pull a model (example with Llama 3.2)
ollama pull llama3.2:latest

# Start Ollama server
ollama serve

4. Configure the Server

Edit config/config.yaml to customize:

  • Ollama connection settings
  • Agent configurations
  • Model assignments
  • Tool settings
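
After editing, it can help to confirm the file still parses and contains the sections you expect. A minimal sketch using PyYAML (assuming PyYAML is among the project's dependencies; the key names mirror the Configuration section below):

import yaml

# Load the server configuration and print the Ollama connection settings
with open("config/config.yaml", "r", encoding="utf-8") as f:
    config = yaml.safe_load(f)

print(config["ollama"]["host"], config["ollama"]["port"])
print([agent["name"] for agent in config.get("agents", [])])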

Running the Server

Option 1: Using PowerShell Script (Windows)

.\start_server.ps1

Option 2: Using Python Startup Script

python startup.py

Option 3: Direct Execution

conda activate openbanking-backend
python src/main.py

Option 4: Development Mode

python src/main.py --dev

Tools Available

Portfolio Analysis

  • analyze_portfolio: Comprehensive portfolio analysis
  • portfolio_optimization: Portfolio allocation optimization

Market Analysis

  • market_analysis: Current market conditions analysis
  • volatility_analysis: Market volatility assessment

Risk Assessment

  • assess_risk: User-specific risk assessment
  • risk_simulation: Scenario-based risk simulation

Strategy Recommendations

  • recommend_strategy: Investment strategy recommendations

Analysis & Explainability

  • swot_analysis: SWOT analysis for any subject
  • explain_concept: Explain financial concepts simply
  • reverse_simulation: Reverse engineering analysis
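
How these tools are invoked depends on your MCP client. As a rough sketch using the official MCP Python SDK over stdio (the tool name comes from the list above, but the customer_id argument is only an illustration; check the real input schema via list_tools):

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server as a subprocess and talk to it over stdio
    server = StdioServerParameters(command="python", args=["src/main.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the registered tools and their input schemas
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call one of the portfolio tools; the argument name is illustrative only
            result = await session.call_tool(
                "analyze_portfolio", arguments={"customer_id": "demo-user"}
            )
            print(result.content)


asyncio.run(main())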

Usage Examples

Run Demo

python demo.py

Test the Server

python test_server.py

Configuration

The server uses config/config.yaml for configuration. Key sections:

# Ollama connection
ollama:
  host: "localhost"
  port: 11434
  timeout: 30

# Agents configuration
agents:
  - name: "market_analyst"
    model: "gemma3:4b"
    role: "Market Data Analyst"
    # ... more config

# Tools configuration
tools:
  - name: "portfolio_analysis"
    enabled: true

Troubleshooting

Common Issues

  1. Ollama Connection Failed
    • Ensure Ollama is installed and running (ollama serve)
    • Check that the default port 11434 is available
    • Verify models are pulled (ollama list); a quick connectivity check is sketched after this list
  2. Conda Environment Issues
    • Make sure openbanking-backend environment exists
    • Activate the environment before running
    • Install dependencies in the correct environment
  3. Import Errors
    • Ensure all dependencies are installed
    • Check Python path and working directory
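
For the Ollama issue in particular, a quick way to confirm the server is reachable and see which models are pulled is Ollama's /api/tags endpoint. A sketch using requests (adjust host and port if you changed them in config/config.yaml):

import requests

# Ollama's REST API lists locally available models at /api/tags
resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()

models = [m["name"] for m in resp.json().get("models", [])]
print("Ollama is reachable; pulled models:", models)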

Logs

Check the console output for detailed error messages and debugging information.

Development

Adding New Tools

  1. Create a tool class in src/tools/
  2. Register the tool in tool_registry.py (see the sketch after this list)
  3. Update the configuration if needed
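
The exact registration interface in tool_registry.py is not shown here, so treat the following as a rough sketch only: the tool function is hypothetical, and the commented register() call is an assumption about how the registry might be wired, not the project's actual API.

# src/tools/custom_tools.py (hypothetical module)
def currency_exposure(portfolio: dict) -> dict:
    """Summarize a portfolio's exposure per currency (illustrative logic only)."""
    exposure: dict[str, float] = {}
    for position in portfolio.get("positions", []):
        currency = position.get("currency", "USD")
        exposure[currency] = exposure.get(currency, 0.0) + position.get("value", 0.0)
    return {"currency_exposure": exposure}

# In src/tools/tool_registry.py, registration might look something like this (assumed API):
# registry.register(
#     name="currency_exposure",
#     description="Per-currency exposure of a portfolio",
#     handler=currency_exposure,
# )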

Adding New Agents

  1. Add the agent configuration in config/config.yaml (see the sketch after this list)
  2. Implement custom agent logic if needed
  3. Test with demo scripts
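
Since agents are defined in config/config.yaml, adding one is usually just another entry in the agents list. If you prefer to script the change, a sketch with PyYAML (the field names mirror the Configuration section above; the macro_analyst agent is hypothetical):

import yaml

with open("config/config.yaml", "r", encoding="utf-8") as f:
    config = yaml.safe_load(f)

# Append a new agent entry; field names mirror the existing agents
config.setdefault("agents", []).append({
    "name": "macro_analyst",        # hypothetical agent
    "model": "llama3.2:latest",
    "role": "Macro Economics Analyst",
})

with open("config/config.yaml", "w", encoding="utf-8") as f:
    yaml.safe_dump(config, f, sort_keys=False)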

License

MIT License
