Search for: Resources and guidance for coding, developing, and training AI models

  • Why this server?

    This server invokes AI models from providers such as Anthropic, OpenAI, and Groq, letting users manage and configure large language model interactions.

    Security: -, License: A, Quality: - · Python · MIT License
  • Why this server?

    A Model Context Protocol server that allows LLMs to interact with Python environments, execute code, and manage files within a specified working directory, useful for developing and training models.

    Security: A, License: F, Quality: A · Python · Linux, Apple
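    A minimal sketch of how a code-execution tool scoped to a working directory could look, written with the FastMCP helper from the official MCP Python SDK. The tool names and behaviour here are assumptions for illustration, not the listed server's actual API.

    ```python
    # Sketch: an MCP server exposing a sandboxed Python runner and a file lister.
    import subprocess
    from pathlib import Path

    from mcp.server.fastmcp import FastMCP

    WORKDIR = Path("workspace").resolve()
    WORKDIR.mkdir(exist_ok=True)

    mcp = FastMCP("python-sandbox")

    @mcp.tool()
    def run_python(code: str) -> str:
        """Run a Python snippet inside the working directory and return its output."""
        result = subprocess.run(
            ["python", "-c", code],
            cwd=WORKDIR, capture_output=True, text=True, timeout=30,
        )
        return result.stdout if result.returncode == 0 else result.stderr

    @mcp.tool()
    def list_files() -> list[str]:
        """List the files currently in the working directory."""
        return sorted(p.name for p in WORKDIR.iterdir())

    if __name__ == "__main__":
        mcp.run()
    ```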
  • Why this server?

    A foundation for creating custom Model Context Protocol servers that can integrate with AI systems.

    A foundation for creating custom Model Context Protocol servers that can integrate with AI systems, providing a simple BMI calculator tool as an example implementation.
    Security: -, License: F, Quality: - · TypeScript
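    The listed template is TypeScript; for consistency with the other sketches on this page, here is the same "single example tool" idea in Python with the official MCP SDK. The server name is a placeholder; BMI is weight in kilograms divided by height in metres squared.

    ```python
    # Sketch: a bare-bones MCP server exposing one example tool.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("bmi-example")

    @mcp.tool()
    def calculate_bmi(weight_kg: float, height_m: float) -> float:
        """Return the body mass index for the given weight and height."""
        return weight_kg / (height_m ** 2)

    if __name__ == "__main__":
        mcp.run()
    ```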
  • Why this server?

    Allows LLMs to generate and execute Azure CLI commands, enabling management of Azure resources, which can be relevant in cloud-based AI development and training.

    An MCP server that wraps the Azure CLI. As LLMs are very good at generating Azure CLI commands, this server allows your LLM to list resources, update/create/delete them, fix errors (by looking at the logs), fix security issues...
    Security: -, License: A, Quality: - · MIT License · Linux, Apple
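    A sketch of the "wrap a CLI" pattern this entry describes: the model supplies an Azure CLI command, the server runs it and returns the output, including errors, so the model can read the logs and retry. The tool name and argument handling are assumptions, not the listed server's actual interface.

    ```python
    # Sketch: an MCP tool that executes 'az ...' commands via subprocess.
    import shlex
    import subprocess

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("azure-cli")

    @mcp.tool()
    def run_az(command: str) -> str:
        """Run an Azure CLI command, e.g. 'az group list --output table'."""
        args = shlex.split(command)
        if not args or args[0] != "az":
            return "Only 'az ...' commands are allowed."
        result = subprocess.run(args, capture_output=True, text=True, timeout=120)
        # Return stderr on failure so the model can diagnose and fix the command.
        return result.stdout if result.returncode == 0 else result.stderr

    if __name__ == "__main__":
        mcp.run()
    ```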
  • Why this server?

    FastMCP is a comprehensive MCP server that exposes data and functionality to LLM applications in a secure, standardized way, offering resource, tool, and prompt management for efficient LLM interactions.

    Security: -, License: A, Quality: - · Python · MIT License
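    A sketch of the three primitives this entry mentions (resources, tools, and prompts), using the FastMCP-style decorator API from the official MCP Python SDK; the listed project's own surface may differ.

    ```python
    # Sketch: one resource, one tool, and one prompt on a single MCP server.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("demo")

    @mcp.resource("config://app")
    def app_config() -> str:
        """A read-only resource the client can load as context."""
        return "model=gpt-4o\ntemperature=0.2"

    @mcp.tool()
    def add(a: int, b: int) -> int:
        """A callable tool the model can invoke with arguments."""
        return a + b

    @mcp.prompt()
    def review(code: str) -> str:
        """A reusable prompt template parameterised by the caller."""
        return f"Please review this code and list concrete issues:\n\n{code}"

    if __name__ == "__main__":
        mcp.run()
    ```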
  • Why this server?

    A server that enables seamless integration between local Ollama LLM instances and MCP-compatible applications, providing advanced task decomposition, evaluation, and workflow management capabilities, facilitating the use of local models.

    Security: -, License: F, Quality: - · Python · Apple
  • Why this server?

    MCP Server provides a simpler API to interact with the Model Context Protocol by allowing users to define custom tools and services to streamline workflows and processes.

    Security: -, License: A, Quality: - · TypeScript · MIT License
  • Why this server?

    A lightweight MCP server that provides a unified interface to various LLM providers including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama.

    Security: -, License: F, Quality: - · Python
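    A sketch of the unified-provider idea: one MCP tool that routes a prompt to whichever backend is requested. Only two providers are shown and the model names are placeholders; the listed server covers more backends (Gemini, Groq, DeepSeek, Ollama).

    ```python
    # Sketch: a single 'complete' tool dispatching to OpenAI or Anthropic clients.
    from anthropic import Anthropic
    from openai import OpenAI
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("llm-gateway")
    openai_client = OpenAI()        # reads OPENAI_API_KEY from the environment
    anthropic_client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment

    @mcp.tool()
    def complete(provider: str, prompt: str) -> str:
        """Send a prompt to the chosen provider ('openai' or 'anthropic')."""
        if provider == "openai":
            resp = openai_client.chat.completions.create(
                model="gpt-4o-mini",
                messages=[{"role": "user", "content": prompt}],
            )
            return resp.choices[0].message.content
        if provider == "anthropic":
            msg = anthropic_client.messages.create(
                model="claude-3-5-sonnet-latest",
                max_tokens=1024,
                messages=[{"role": "user", "content": prompt}],
            )
            return msg.content[0].text
        return f"Unknown provider: {provider}"

    if __name__ == "__main__":
        mcp.run()
    ```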
  • Why this server?

    Analyzes codebases using Repomix and LLMs to provide structured code reviews with specific issues and recommendations, supporting multiple LLM providers including OpenAI, Anthropic, and Gemini.

    Security: -, License: F, Quality: - · JavaScript
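    A sketch of the pipeline this entry describes: pack the codebase with Repomix, then ask an LLM for a structured review. The Repomix flags, the model name, and the tool shape are assumptions; the listed server may invoke both quite differently.

    ```python
    # Sketch: pack a repository with Repomix, then request an LLM code review.
    import subprocess
    from pathlib import Path

    from openai import OpenAI
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("code-review")
    client = OpenAI()

    @mcp.tool()
    def review_repo(path: str) -> str:
        """Pack the repository at `path` with Repomix and return an LLM review."""
        subprocess.run(
            ["npx", "repomix", "--output", "repomix-output.txt", path],
            check=True, capture_output=True, text=True,
        )
        packed = Path("repomix-output.txt").read_text(encoding="utf-8")
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{
                "role": "user",
                "content": "Review this codebase. List specific issues and "
                           "recommendations:\n\n" + packed[:100_000],
            }],
        )
        return resp.choices[0].message.content

    if __name__ == "__main__":
        mcp.run()
    ```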
  • Why this server?

    A system that manages context for language model interactions, allowing the model to remember previous interactions across multiple independent sessions using the Gemini API, which is helpful for iterative model development.

    Security: -, License: F, Quality: - · Python · Linux, Apple