
Cisco Catalyst Center MCP Server

by martynrees

MCP Server for Cisco Catalyst Center APIs

This project implements a Model Context Protocol (MCP) server that wraps the Cisco Catalyst Center APIs defined in a Swagger JSON file. The MCP server allows you to use these APIs as tools within AI agents built with LangChain or any other framework supporting the Model Context Protocol.

Features

  • Automatically parses Swagger/OpenAPI JSON to extract API endpoints as tools
  • Implements the Model Context Protocol for tool discovery and execution
  • Provides a FastAPI server to expose the tools via MCP
  • Includes authentication with the Cisco Catalyst Center API
  • Example client to demonstrate using the MCP server with LangChain

Prerequisites

  • Python 3.9 or higher
  • Cisco Catalyst Center with API access
  • OpenAI API key

Setup

  1. Clone the repository and navigate to the project directory:
cd /mcp
  2. Install the required dependencies:
pip install -r requirements.txt
  3. Configure the environment variables:

Edit the .env file with your configuration details:

OPENAI_API_KEY=your_openai_api_key
CISCO_API_BASE_URL=https://your-cisco-catalyst-center-url
CISCO_API_USERNAME=your_username
CISCO_API_PASSWORD=your_password
HOST=0.0.0.0
PORT=8000
SWAGGER_JSON_PATH=../intent_api_2_3_7_9.json
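
For reference, config.py can read these values at startup with python-dotenv. The snippet below is a minimal sketch assuming the variable names shown above; the attribute names and defaults in the actual config.py may differ.

import os

from dotenv import load_dotenv

# Load the .env file from the current working directory.
# (Sketch only: the project's config.py may organise this differently.)
load_dotenv()

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
CISCO_API_BASE_URL = os.getenv("CISCO_API_BASE_URL")
CISCO_API_USERNAME = os.getenv("CISCO_API_USERNAME")
CISCO_API_PASSWORD = os.getenv("CISCO_API_PASSWORD")
HOST = os.getenv("HOST", "0.0.0.0")
PORT = int(os.getenv("PORT", "8000"))
SWAGGER_JSON_PATH = os.getenv("SWAGGER_JSON_PATH", "../intent_api_2_3_7_9.json")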

Running the MCP Server

Option 1: Running Directly

Start the MCP server directly with:

python server.py

Option 2: Running with Docker (Recommended)

The server can also be run as a Docker container, which provides a consistent environment and simplifies deployment.

  1. Make sure Docker and Docker Compose are installed on your system.
  2. Run the Docker container:
./run_docker.sh

This script will build the Docker image and start a container. The server will be available at http://localhost:8000.

API Endpoints

  • GET /: Health check endpoint
  • POST /mcp/tools: Get available tools in MCP format
  • POST /mcp/execute_tool/{tool_name}: Execute a specific tool with parameters
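
As a quick illustration, the endpoints can be exercised from Python. This is a minimal sketch that assumes the server is reachable at http://localhost:8000 and that a tool named getDeviceList was extracted from the Swagger file; both the tool name and the request body shape are assumptions, so check server.py for the exact contract.

import requests

BASE_URL = "http://localhost:8000"

# Health check
print(requests.get(f"{BASE_URL}/").status_code)

# Tool discovery: returns the tool schemas in MCP format
tools = requests.post(f"{BASE_URL}/mcp/tools").json()
print(tools)

# Tool execution: the tool name and parameter names below are hypothetical
# and depend on the endpoints defined in your Swagger JSON.
result = requests.post(
    f"{BASE_URL}/mcp/execute_tool/getDeviceList",
    json={"parameters": {"limit": 5}},
)
print(result.json())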

Using with LangChain

The project includes an example client (example_client.py) that demonstrates how to use the MCP server with LangChain.

Run the example client with:

python example_client.py

The client will fetch the available tools from the MCP server and create a LangChain agent that can use these tools.
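
If you prefer to wire things up yourself instead of using example_client.py, the sketch below shows one way to wrap the MCP tools for LangChain. The endpoint paths match this server, but the schema field names ("name", "description") and the execute_tool request body are assumptions; adjust them to whatever server.py actually returns.

import requests
from langchain_core.tools import StructuredTool
from langchain_openai import ChatOpenAI

MCP_URL = "http://localhost:8000"

def make_langchain_tool(schema: dict) -> StructuredTool:
    # Wrap one MCP tool schema as a LangChain tool. A fuller client would also
    # translate schema["parameters"] into an args_schema for argument validation.
    name = schema["name"]

    def _call(**kwargs):
        response = requests.post(
            f"{MCP_URL}/mcp/execute_tool/{name}",
            json={"parameters": kwargs},
        )
        return response.json()

    return StructuredTool.from_function(
        func=_call,
        name=name,
        description=schema.get("description", ""),
    )

schemas = requests.post(f"{MCP_URL}/mcp/tools").json()
tools = [make_langchain_tool(s) for s in schemas]

# Bind the tools to an OpenAI chat model (requires OPENAI_API_KEY in the environment).
llm = ChatOpenAI(model="gpt-4o-mini")
llm_with_tools = llm.bind_tools(tools)
print(llm_with_tools.invoke("How many network devices is Catalyst Center managing?"))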

Project Structure

  • server.py: Main FastAPI server implementing the MCP
  • swagger_parser.py: Parser for Swagger/OpenAPI JSON files
  • api_client.py: Client for making requests to Cisco APIs
  • tool_handler.py: Handler for MCP tools that wrap API endpoints
  • custom_tools.py: Implementation of custom tools that combine multiple API calls
  • config.py: Configuration module
  • example_client.py: Example client using LangChain
  • Dockerfile: Configuration for building the Docker image
  • docker-compose.yml: Docker Compose configuration for services
  • run_docker.sh: Script to build and run the Docker container

MCP Integration

This server implements the Model Context Protocol (MCP) for AI agents. The MCP defines a standardized way for AI models to discover and use tools.

The key components of MCP integration are:

  1. Tool Discovery: The /mcp/tools endpoint returns tool schemas in the MCP format.
  2. Tool Execution: The /mcp/execute_tool/{tool_name} endpoint executes tools and returns results.
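
For orientation, a tool schema returned by /mcp/tools typically carries a name, a description, and a JSON-Schema style description of its parameters. The example below is purely illustrative: the exact field names depend on how swagger_parser.py and server.py build the MCP payload.

# Illustrative only; inspect the actual /mcp/tools response for the real shape.
example_tool_schema = {
    "name": "getDeviceList",  # hypothetical tool derived from a Swagger operationId
    "description": "Returns the network devices managed by Catalyst Center.",
    "parameters": {
        "type": "object",
        "properties": {
            "limit": {"type": "integer", "description": "Maximum number of devices to return"},
            "offset": {"type": "integer", "description": "Pagination offset"},
        },
        "required": [],
    },
}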

Customization

You can customize the MCP server by:

  1. Modifying the SwaggerParser to handle different Swagger/OpenAPI formats (see the sketch after this list)
  2. Extending the ApiClient to add authentication methods or error handling
  3. Adding new endpoints to the FastAPI server
  4. Creating custom tools in the custom_tools.py file that combine multiple API calls for advanced functionality
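
As a starting point for the first item, the sketch below shows the general shape of turning OpenAPI path operations into tool definitions. It is not the project's SwaggerParser, just a minimal illustration of the approach, and the output keys are assumptions.

import json

def extract_tools(swagger_path: str) -> list[dict]:
    # Walk the OpenAPI "paths" section and build one tool definition per
    # operation. The output keys are illustrative, not the project's format.
    with open(swagger_path) as f:
        spec = json.load(f)

    tools = []
    for path, operations in spec.get("paths", {}).items():
        for method, op in operations.items():
            if method.lower() not in {"get", "post", "put", "delete", "patch"}:
                continue
            tools.append({
                "name": op.get("operationId") or f"{method}_{path}",
                "description": op.get("summary") or op.get("description", ""),
                "method": method.upper(),
                "path": path,
                "parameters": op.get("parameters", []),
            })
    return tools

# Example: tools = extract_tools("../intent_api_2_3_7_9.json")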

Custom Tools

The server includes a custom_tools.py module that demonstrates how to create custom tools that combine multiple API calls. These tools can provide higher-level functionality that is not directly exposed by the API.

Currently implemented custom tools:

  • network_health_report: Generates a comprehensive network health report by combining data from multiple API endpoints
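
To give a feel for how such a tool can be structured, here is a minimal sketch of a report that aggregates two API calls. The helper api_client.get() and the Catalyst Center endpoint paths are illustrative assumptions, not the actual implementation in custom_tools.py.

def network_health_report(api_client) -> dict:
    # Combine several API calls into one higher-level result.
    # api_client.get() is a hypothetical helper that issues an authenticated
    # GET against the configured CISCO_API_BASE_URL; the paths below may
    # differ from what custom_tools.py actually calls.
    devices = api_client.get("/dna/intent/api/v1/network-device")
    health = api_client.get("/dna/intent/api/v1/network-health")

    return {
        "device_count": len(devices.get("response", [])),
        "overall_health": health.get("response", []),
        "summary": "Aggregated view built from multiple Catalyst Center API calls",
    }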

Troubleshooting

  • If authentication fails, check your Cisco API credentials in the .env file
  • If tool execution fails, check the logs for more detailed error messages
  • If parsing fails, ensure your Swagger JSON is valid and follows the OpenAPI 3.0.x format

Docker-specific troubleshooting

  • If the Docker container fails to start, check Docker logs: docker-compose logs -f
  • If the container can't access the Swagger file, ensure the context path in docker-compose.yml is correct
  • To rebuild the Docker image completely: docker-compose down && docker-compose build --no-cache && docker-compose up -d
  • To enter the running container for debugging: docker exec -it mcp-server bash

License

MIT
