Why this server?
This server enables AI assistants to interact with n8n workflows through natural language, supporting actions like listing, creating, updating, executing and monitoring workflows.
Why this server?
A Model Context Protocol server that allows AI assistants to interact with Prefect's workflow automation platform through natural language, enabling users to manage flows, deployments, tasks, and other Prefect resources via conversational commands.
Why this server?
A server that enables interaction with Prefect workflow automation tools through the Model Context Protocol, offering enhanced reliability by running via uvx and seamless integration with Cursor IDE.
Why this server?
GenAIScript is a JavaScript runtime dedicated to building reliable, automatable LLM scripts. Every GenAIScript script can automatically be exposed as an MCP server.
Why this server?
AI-driven task management application that operates via MCP, enabling autonomous creation, organization, and execution of tasks with support for subtasks, priorities, and progress tracking.
Why this server?
A server that enables seamless integration between local Ollama LLM instances and MCP-compatible applications, providing advanced task decomposition, evaluation, and workflow management capabilities.
Why this server?
A comprehensive system for managing AI-assisted agile development workflows with a modern, resource-based API using FastMCP.
Why this server?
Enables AI agents to control web browsers via a standardized interface for operations like launching, interacting with, and closing browsers.
Why this server?
Allows AI models to run JavaScript/TypeScript code through Model Context Protocol tool calls, supporting both one-time script execution and stateful REPL sessions with npm package integration.
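For context, the interaction pattern behind an entry like this is a standard MCP tools/call request from a client to the server. Below is a minimal sketch using the official TypeScript SDK, assuming a hypothetical run_script tool that accepts a code string; the actual command, tool names, and argument schemas vary by server.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the MCP server as a child process over stdio
  // (command and args are placeholders for whichever server you run).
  const transport = new StdioClientTransport({
    command: "node",
    args: ["server.js"],
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Call a hypothetical script-execution tool; real servers document
  // their own tool names and input schemas via tools/list.
  const result = await client.callTool({
    name: "run_script",
    arguments: { code: "console.log(1 + 1);" },
  });

  // Tool results come back as a content array (text, images, etc.).
  console.log(result.content);

  await client.close();
}

main().catch(console.error);
```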
Why this server?
A server that enables AI assistants like Claude to safely run Python code and access websites, processing data into formats the AI can readily understand and returning helpful error messages.