Autotask MCP Server

A Model Context Protocol (MCP) server for Kaseya's Autotask PSA that enables natural language querying of your Autotask data through AI assistants like Claude.

Features

  • Query customer information (contracts, revenue)
    • Active contract analysis
    • Revenue by contract type
    • Contract status distribution
  • Analyze support agent activities and appointments
    • Recent appointment schedules
    • Time entry analysis
    • Resource utilization metrics
    • Work type distribution
  • Track project status and progress
    • Project status overview
    • Task completion metrics
    • Priority distribution
  • Support ticket analysis
    • Ticket status distribution
    • Resolution time metrics
    • Queue performance
  • Natural language query interface with advanced capabilities:
    • Date range parsing (e.g., "this week", "last month")
    • Status filtering (e.g., "open tickets", "closed tickets")
    • Queue and category filtering
    • Priority filtering
    • Sorting and limiting results
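
The date-range parsing listed above can be pictured with a short sketch. This is illustrative only, not the server's actual implementation; parseDateRange is a hypothetical helper name:

    // Illustrative only: how a phrase like "this week" or "last month" can be
    // mapped to a start/end date pair. The server's real logic may differ.
    function parseDateRange(phrase, now = new Date()) {
      const start = new Date(now);
      const end = new Date(now);

      switch (phrase.trim().toLowerCase()) {
        case 'this week':
          start.setDate(now.getDate() - now.getDay()); // back up to Sunday
          break;
        case 'last month':
          start.setMonth(now.getMonth() - 1, 1); // first day of the previous month
          end.setMonth(now.getMonth(), 0);       // day 0 = last day of the previous month
          break;
        default:
          return null; // unrecognized phrase: caller falls back to a default range
      }
      return { start, end };
    }

    console.log(parseDateRange('this week'));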

Prerequisites

  • Node.js (v16 or higher)
  • Autotask API credentials (username, secret, integration code)
  • Access to Autotask REST API endpoints

Setup

  1. Clone the repository
    git clone https://github.com/yourusername/autotask-mcp.git
    cd autotask-mcp
  2. Install dependencies:
    npm install
  3. Copy .env.example to .env and fill in your Autotask API credentials:
    cp .env.example .env
    Edit the .env file with your credentials:
    AUTOTASK_USER=your_autotask_username
    AUTOTASK_SECRET=your_autotask_secret
    AUTOTASK_INTEGRATION_CODE=your_integration_code
  4. Configure the API endpoint in src/mcp-server.js if your Autotask instance uses a different endpoint:
    autotask = new AutotaskRestApi(
      process.env.AUTOTASK_USER,
      process.env.AUTOTASK_SECRET,
      process.env.AUTOTASK_INTEGRATION_CODE,
      'https://webservices14.autotask.net/atservicesrest' // Change if needed
    );
  5. Start the server:
    node src/mcp-server.js
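
Before connecting an assistant, you can verify that the credentials load correctly. The snippet below is a minimal sketch: it assumes the project reads .env via the dotenv package, and check-env.js is a hypothetical helper, not a file shipped with the repository.

    // check-env.js (hypothetical helper): confirms the required Autotask
    // credentials are present before starting the MCP server.
    require('dotenv').config();

    const required = ['AUTOTASK_USER', 'AUTOTASK_SECRET', 'AUTOTASK_INTEGRATION_CODE'];
    const missing = required.filter((name) => !process.env[name]);

    if (missing.length > 0) {
      console.error(`Missing environment variables: ${missing.join(', ')}`);
      process.exit(1);
    }
    console.log('All Autotask credentials are set.');

Run it with node check-env.js before starting the server.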

Connecting to AI Assistants

This MCP server is designed to work with AI assistants that support the Model Context Protocol. To connect:

  1. Start the server as described above
  2. In your AI assistant interface, configure the connection to point to your running MCP server
  3. Once connected, you can start querying your Autotask data using natural language
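
The exact mechanism depends on your client and on which transport this server exposes (stdio or a network endpoint). As one concrete illustration, a Claude Desktop claude_desktop_config.json entry for a locally run stdio server looks like the following; the server name, paths, and env values are placeholders, and you should adjust or omit this if your build exposes an HTTP endpoint instead:

    {
      "mcpServers": {
        "autotask": {
          "command": "node",
          "args": ["/path/to/autotask-mcp/src/mcp-server.js"],
          "env": {
            "AUTOTASK_USER": "your_autotask_username",
            "AUTOTASK_SECRET": "your_autotask_secret",
            "AUTOTASK_INTEGRATION_CODE": "your_integration_code"
          }
        }
      }
    }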

Example Queries

Contract and Revenue Analysis

  • "What is our total contract revenue?"
  • "Show me active contracts by type"
  • "What's the distribution of contract status?"

Support Agent Analysis

  • "Show me all support agent appointments for this week"
  • "What's the time spent by each resource?"
  • "Show me work type distribution for time entries"

Project Analysis

  • "What's the status of our active projects?"
  • "Show me task completion metrics"
  • "What's the priority distribution of projects?"

Ticket Analysis

  • "Show me ticket status distribution"
  • "What's our average ticket resolution time?"
  • "Show me tickets by queue"
  • "Show me high priority tickets created this week"
  • "Show me the oldest 10 tickets in the support queue"

Time Entry Analysis

  • "Show me time entries for the last week"
  • "Get hours logged by customer/contract in the last 7 days"
  • "Show me billable time entries for ticket 12345"
  • "Show me time entries by John for yesterday"

Troubleshooting

Common Issues

  1. Connection Errors
    • Verify your Autotask API credentials in the .env file
    • Check that your API endpoint is correct
    • Ensure your IP address is whitelisted in Autotask API settings
  2. Permission Issues
    • The API user must have appropriate permissions in Autotask
    • Check that the integration code has access to the entities being queried
  3. Response Format Issues
    • The server includes detailed logging to help diagnose API response format issues
    • Check the console output for debugging information

Debugging

The server outputs detailed logs to the console. To capture these logs to a file:

node src/mcp-server.js 2> server.log

Deployment Options

Running as a Service

To run the MCP server as a background service, you can use tools like:

  • PM2 (recommended for Linux/macOS/Windows):
    npm install -g pm2
    pm2 start src/mcp-server.js --name "autotask-mcp"
    pm2 save
  • systemd (Linux): Create a service file at /etc/systemd/system/autotask-mcp.service
    [Unit]
    Description=Autotask MCP Server
    After=network.target

    [Service]
    Type=simple
    User=yourusername
    WorkingDirectory=/path/to/autotask-mcp
    ExecStart=/usr/bin/node /path/to/autotask-mcp/src/mcp-server.js
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target
    Then enable and start the service:
    sudo systemctl enable autotask-mcp
    sudo systemctl start autotask-mcp

Docker Deployment

A Dockerfile is included for containerized deployment:

docker build -t autotask-mcp .
docker run -d -p 3000:3000 --env-file .env --name autotask-mcp autotask-mcp
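
If you need to adapt or recreate the included Dockerfile, a minimal one for a Node.js service typically looks like the following. This is an illustration, not the file that ships with the repository; the exposed port and entry point are assumptions based on the commands above.

    # Illustrative Dockerfile for a Node.js MCP server (not the repository's actual file)
    FROM node:18-alpine
    WORKDIR /app

    # Install dependencies first so Docker can cache this layer
    COPY package*.json ./
    RUN npm ci --omit=dev

    # Copy the application source
    COPY . .

    # The docker run example above maps port 3000; adjust if your build differs
    EXPOSE 3000
    CMD ["node", "src/mcp-server.js"]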

Security Considerations

  • Store API credentials securely and never commit them to version control
  • Consider using environment variables or a secrets manager in production
  • Implement proper access controls if exposing the server beyond your local network
  • Review and limit the permissions of the Autotask API user to only what's necessary

Contributing

We welcome contributions to the Autotask MCP Server! Here's how you can contribute:

  1. Fork the repository
    • Click the "Fork" button at the top right of the repository page
  2. Clone your fork
    git clone https://github.com/your-username/autotask-mcp.git
    cd autotask-mcp
  3. Create a branch for your changes
    git checkout -b feature/your-feature-name
  4. Make your changes
    • Implement your feature or bug fix
    • Add or update tests as necessary
    • Run tests and linting to ensure your changes pass
      npm test
      npm run lint
  5. Commit your changes
    git commit -m "Add your descriptive commit message"
  6. Push to your fork
    git push origin feature/your-feature-name
  7. Create a Pull Request
    • Go to the original repository
    • Click "New Pull Request"
    • Select "compare across forks"
    • Select your fork and branch
    • Fill out the PR template with details about your changes

All pull requests will be reviewed by the maintainers. The CI pipeline will automatically run tests and linting checks on your PR.

For more detailed information about contributing, please see CONTRIBUTING.md.

License

This project is licensed under the MIT License - see the LICENSE file for details.
