Why this server?
Provides a foundation for managing and configuring large language model interactions from providers like Anthropic and OpenAI, essential for integrating AI into engineering workflows.
Why this server?
Integrates OpenAPI-described REST APIs into MCP workflows, allowing dynamic exposure of API endpoints as MCP tools, which can be useful for interacting with various engineering tools.
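The core idea — mapping each OpenAPI operation to one MCP tool — can be sketched roughly as follows. The spec dict uses the standard OpenAPI 3 layout, but the `ToolSpec` shape and function name are illustrative assumptions, not this server's actual API:

```python
# Hypothetical sketch: deriving MCP-style tool descriptors from an
# OpenAPI document. ToolSpec and tools_from_openapi are assumed names.
from dataclasses import dataclass

@dataclass
class ToolSpec:
    name: str          # derived from the operationId
    description: str   # taken from the operation summary
    method: str        # HTTP verb to call
    path: str          # URL template of the endpoint

def tools_from_openapi(spec: dict) -> list[ToolSpec]:
    """Expose each OpenAPI operation as one callable tool."""
    tools = []
    for path, operations in spec.get("paths", {}).items():
        for method, op in operations.items():
            tools.append(ToolSpec(
                name=op.get("operationId", f"{method}_{path}"),
                description=op.get("summary", ""),
                method=method.upper(),
                path=path,
            ))
    return tools

spec = {
    "paths": {
        "/users/{id}": {
            "get": {"operationId": "getUser", "summary": "Fetch a user"},
        }
    }
}
print([t.name for t in tools_from_openapi(spec)])  # → ['getUser']
```

An LLM can then pick a tool by name and the server fills in the HTTP request from the stored method and path template.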
Why this server?
Enables LLMs to generate and execute Azure CLI commands, facilitating management and automation of Azure resources, which is valuable for infrastructure engineering tasks.
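One half of that pattern — turning a structured request from the LLM into a vetted `az` invocation rather than executing free-form shell text — might look like this sketch. The allowlist, function name, and parameter handling are assumptions for illustration:

```python
# Illustrative command-building step for an Azure CLI bridge: the LLM
# proposes a structured action, the server renders a vetted `az` call.
# ALLOWED and build_az_command are assumed names, not this server's API.
import shlex

ALLOWED = {("vm", "list"), ("group", "create")}  # vetted az subcommands

def build_az_command(group: str, action: str, **params: str) -> list[str]:
    if (group, action) not in ALLOWED:
        raise ValueError(f"subcommand not allowed: az {group} {action}")
    cmd = ["az", group, action, "--output", "json"]
    for key, value in params.items():
        cmd += [f"--{key.replace('_', '-')}", value]
    return cmd

print(shlex.join(build_az_command("group", "create",
                                  name="demo-rg", location="westus")))
```

Keeping the result as an argument list (and only joining it for display) avoids shell-injection issues when the command is eventually executed.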
Why this server?
Bridges LLMs with Language Server Protocol interfaces, allowing access to code intelligence features for improved code suggestions, completions, and diagnostics.
Why this server?
Provides RAG capabilities for semantic document search, allowing users to add, search, list, and delete documentation with metadata support, making it quick to retrieve the documentation relevant to an engineering task.
Why this server?
Allows AI models to safely run Python code and access websites, processing the results into a form the model can use and returning helpful error messages when something fails.
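A rough sketch of the "run Python safely and report useful errors" idea is to execute untrusted code in a separate interpreter process with a timeout, returning either stdout or the traceback. The function name and result shape are assumptions, and a production server would need OS-level isolation beyond a plain subprocess:

```python
# Hedged sketch: run untrusted Python in a child interpreter with a
# timeout. run_python and its return dict are illustrative assumptions.
import subprocess
import sys

def run_python(code: str, timeout: float = 5.0) -> dict:
    try:
        proc = subprocess.run(
            [sys.executable, "-I", "-c", code],  # -I: isolated mode
            capture_output=True, text=True, timeout=timeout,
        )
    except subprocess.TimeoutExpired:
        return {"ok": False, "error": "timed out"}
    if proc.returncode != 0:
        # Surface the traceback so the model can correct its own code.
        return {"ok": False, "error": proc.stderr.strip()}
    return {"ok": True, "output": proc.stdout}

print(run_python("print(2 + 2)"))  # {'ok': True, 'output': '4\n'}
print(run_python("1/0")["ok"])     # False; error holds the traceback
```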
Why this server?
An MCP server that provides tools for reading, writing, and editing files on the local filesystem.
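The three core operations — read, write, and edit-by-replacement — can be sketched as plain functions confined to a sandbox root so the model cannot reach outside it. The root path and function names are illustrative assumptions, not this server's actual tool names:

```python
# Sketch of filesystem tools restricted to a sandbox directory.
# ROOT, read_file, write_file, and edit_file are assumed names.
from pathlib import Path

ROOT = Path("/tmp/mcp-fs-demo")  # assumed sandbox root

def _resolve(rel: str) -> Path:
    """Reject any path that escapes the sandbox (e.g. via '..')."""
    p = (ROOT / rel).resolve()
    if not p.is_relative_to(ROOT.resolve()):
        raise PermissionError(f"path escapes sandbox: {rel}")
    return p

def write_file(rel: str, content: str) -> None:
    p = _resolve(rel)
    p.parent.mkdir(parents=True, exist_ok=True)
    p.write_text(content)

def read_file(rel: str) -> str:
    return _resolve(rel).read_text()

def edit_file(rel: str, old: str, new: str) -> None:
    """Replace the first occurrence of `old` with `new` in the file."""
    p = _resolve(rel)
    p.write_text(p.read_text().replace(old, new, 1))

write_file("notes.txt", "hello world")
edit_file("notes.txt", "world", "MCP")
print(read_file("notes.txt"))  # → hello MCP
```

Real filesystem servers typically expose richer edits (line ranges, diffs), but the path-confinement check is the part that matters for safety.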
Why this server?
Enables LLMs to interact with Plane.so for streamlined project management workflows, allowing management of projects and issues.
Why this server?
Analyzes codebases using Repomix and LLMs to provide structured code reviews with specific issues and recommendations, supporting multiple LLM providers.
Why this server?
A Model Context Protocol server implementation that enables LLMs to interact with a NebulaGraph database for graph exploration, supporting schema understanding, queries, and graph algorithms.