Decentralized Git with on-chain governance, bounties, and DAOs. Tools for repos, issues, PRs, labels, releases, bounties, and DAO proposals.
Auto-wallet on first use, trust tiers, and approval mode for human-in-the-loop.
MCP server for discovering and installing AI agent skills from Loaditout. Search 20,000+ security-graded MCP servers, check A/B/C/F grades, get install configs, and browse trending skills and curated packs.
Enterprise AI governance layer for spend tracking, runtime guardrails, policy enforcement, and budget limits. Connects Claude, ChatGPT, and any MCP client to ThinkNEO's control plane for
SOC2/GDPR/HIPAA compliance monitoring and real-time provider health.
Real-time and historical oil, gas, and commodity prices. 40+ energy commodities including Brent Crude, WTI, Natural Gas, LBMA Gold/Silver, EU Carbon, and refined products. Get current prices, compare commodities, view market overviews, and access historical data — all through natural language. Used by energy traders, fintech companies, and researchers worldwide.
Provides AI agents with instant, structured access to electronic component datasheets, pinouts, and electrical specifications without requiring PDF uploads. It enables seamless part searching, design validation, and side-by-side component comparisons across major hardware providers.
Arknights PRTS Wiki MCP Server — queries the PRTS Wiki API for full-text search and article extraction, and serves auto-synced operator archives, voice lines, and story/event
scripts from game data. Designed for fan-creation AI agents that need accurate lore, character profiles, and in-character dialogue references.
Merx - TRON Resource Exchange
Merx is the first TRON resource exchange that aggregates energy and bandwidth providers into a single MCP server.
GitHub: https://github.com/Hovsteder/merx-mcp
Hosted SSE: https://merx.exchange/mcp/sse
npm: merx-mcp
Website: https://merx.exchange
Stats
* 52 tools, 30 prompts, 21 resources
* 7+ energy providers (CatFee, ITRX, PowerSun, Fe
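For illustration, the raw MCP request such a server receives over its hosted SSE endpoint is a JSON-RPC 2.0 `tools/call` message; the tool name `get_energy_price` and its arguments below are hypothetical, not taken from Merx's documented tool list.

```python
import json

def make_tools_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 message for the MCP tools/call method."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical tool name and arguments, for illustration only.
req = make_tools_call("get_energy_price", {"amount": 65000, "duration": "1h"})
payload = json.dumps(req)
```

Any MCP client that speaks the streamable/SSE transport would wrap a message of this shape; only the tool names and argument schemas differ per server.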
Real-time news with bias scoring, live market data, and AI-powered options pricing. 9 tools across news intelligence, media bias analysis, stock/crypto data, and meme search.
The most feature-complete MCP server for Obsidian vaults. 23 tools and 3 resources for search, read, write, tags, link analysis, graph traversal, and canvas support.
A local MCP server that lets AI agents bypass bot detection, geo-restrictions, and JavaScript rendering challenges when scraping the web, backed by ScraperAPI's services.
A Model Context Protocol (MCP) server that gives your AI assistant the power to convert Markdown into 14 professional document formats — PDF, DOCX, HTML, LaTeX, CSV, JSON, XML, XLSX, RTF, PNG, and more. Stop copy-pasting. Let the AI do the exporting.
PHP static analysis MCP server with 11 tools for querying 60+ code quality metrics, detecting problems (God Class, dependency cycles, SOLID violations), analyzing dependencies, identifying refactoring priorities, and mapping test coverage — all from live analysis data.
Real-time LTL freight fuel surcharge rates for 9 US carriers and US state ABC liquor license compliance lookups (CA, TX, NY, FL). Every response includes a verifiability block with extraction confidence and source URL so agents can assess data quality before acting.
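The verifiability block described above suggests a simple gating pattern on the agent side; the field names `verifiability`, `confidence`, and `source_url` here are assumptions about the response schema, not the server's documented format.

```python
MIN_CONFIDENCE = 0.8  # agent-chosen threshold, not part of the server

def is_actionable(response: dict) -> bool:
    """Accept a rate only if its verifiability block clears the confidence
    threshold and names a source URL the agent could audit."""
    block = response.get("verifiability", {})
    return (block.get("confidence", 0.0) >= MIN_CONFIDENCE
            and bool(block.get("source_url")))

ok = is_actionable({"rate": 0.312,
                    "verifiability": {"confidence": 0.93,
                                      "source_url": "https://example.com/fsc"}})
bad = is_actionable({"rate": 0.312,
                     "verifiability": {"confidence": 0.41, "source_url": ""}})
```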
The interface protocol for AI agents. 8 kernel primitives + 16 stdlib operations to operate any interface. Forge once, run forever — zero AI at runtime. 81 skills across 41 sites.
Manage jobs using MCP: jobs, candidates, resumes, and salaries, all within one set of MCP tools.
It can solve problems like:
You have 50 resumes to screen. Your AI assistant can reason about candidates, but it can't:
Read PDFs/DOCX — The AI can't open binary files
Extract structured data — Copy-pasting loses formatting, metrics, and context
Compare at scale — No consistent scoring across candidates
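The "consistent scoring" gap above can be sketched as a fixed rubric applied identically to every parsed resume; the fields and weights below are invented for illustration, not the server's actual scoring model.

```python
WEIGHTS = {"years_experience": 0.4, "skill_match": 0.6}  # illustrative rubric

def score_candidate(parsed: dict) -> float:
    """Apply the same rubric to every candidate so scores are comparable."""
    years = min(parsed.get("years_experience", 0) / 10.0, 1.0)  # cap at 10 yrs
    skills = parsed.get("skill_match", 0.0)  # fraction of required skills met
    return round(WEIGHTS["years_experience"] * years
                 + WEIGHTS["skill_match"] * skills, 3)

ranked = sorted(
    [{"name": "A", "years_experience": 8, "skill_match": 0.5},
     {"name": "B", "years_experience": 3, "skill_match": 0.9}],
    key=score_candidate, reverse=True,
)
```

The point is not the particular weights but that every candidate passes through the same function, which is what ad-hoc copy-pasting into a chat cannot guarantee.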
A Model Context Protocol (MCP) server that provides comprehensive WHOIS lookup capabilities using the IP2WHOIS API. This server allows AI agents to query domain registration details, including expiry dates, registrar information, and registrant data.
Crawl any website into clean Markdown, search through pages, read full content, and extract structured data using OpenAI, Claude, Gemini, or Grok — with auto-citation and resume support.
A local Python MCP server that exposes the Codemagic CI/CD REST API as Claude-callable tools. Trigger builds, manage apps, download artifacts, and clear caches — all from Claude Code or Claude Desktop without leaving the chat.
Provides prediction market intelligence, research, and strategy signals for platforms like Kalshi, Polymarket, and Robinhood. It enables AI assistants to perform market screening, arbitrage detection, and deep causal analysis to support informed trading decisions.
Connect AI assistants to Manticore Search. Execute SQL queries, list tables, get schemas, and fetch documentation. Perfect for building RAG applications and search-powered AI agents.
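A sketch of the kind of SQL such a server would forward to Manticore, which exposes full-text search through a MATCH() clause; the table name `articles` and the naive quote escaping are illustrative assumptions.

```python
def match_query(table: str, text: str, limit: int = 5) -> str:
    """Compose a Manticore full-text query using its MATCH() clause.
    Single quotes in the search text are escaped for the SQL literal."""
    safe = text.replace("'", "\\'")
    return f"SELECT id, title FROM {table} WHERE MATCH('{safe}') LIMIT {limit}"

sql = match_query("articles", "vector search")
```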
Code graph context engine that parses codebases with tree-sitter (170+ languages), builds structural dependency graphs, and provides 24 MCP tools for code intelligence. One prepare_context call gives your AI agent the right files for any task. Includes focus, blast radius, hotspots, dead code detection, and hybrid search.
AI memory that works like yours. Neural pathway architecture gives your AI persistent recall, contextual awareness, and cross-session continuity the way human memory actually works. 14 MCP tools. Works with Claude Code, Cursor, Windsurf, and any MCP client. $50/mo unlimited.
Query Langfuse traces, schema and datasets, scores and metrics, debug exceptions, analyze sessions, and manage prompts. Full observability toolkit for LLM applications.
Stop paying for your agent to rediscover what other agents already figured out. Prior is a shared knowledge base where agents exchange proven solutions — one search can save 10 minutes of trial-and-error and thousands of tokens. Your Sonnet gets access to solutions that Opus spent 20 tool calls discovering. Search is free with feedback, and contributing earns credits.
A fast, secure, and LLM-friendly Model Context Protocol (MCP) server that scrapes job listings from major platforms (LinkedIn, Indeed, Google) and converts them into structured Markdown format.
LoopSense is an open-source MCP server that closes the feedback loop for AI coding agents — giving them real-time visibility into CI results, deployments, test outcomes, and file system changes.
Provides access to ODEI's constitutional knowledge graph, AI safety guardrails, and EVM smart contract auditing tools. It enables users to query structured domain nodes, validate agent actions, and perform security audits directly through an MCP client.
Provides tools to search and execute Code Ocean capsules and pipelines while managing platform data assets. It enables users to interact with Code Ocean's computational resources and scientific workflows directly through natural language interfaces.
Enables users to control FreeCAD through natural language for creating, editing, and managing 3D objects and documents. It supports executing Python code, capturing screenshots of the workspace, and importing parts from the FreeCAD library.
Enables access to TikTok data, including trending users, hashtags, post analytics, user profiles, and watermark-free download links for specific countries. Supports searching by username, user ID, or post links.
Enables AI assistants to interact with Sauce Labs testing platform through natural language, providing access to device cloud management, test job analysis, build monitoring, and testing infrastructure insights. Supports both Virtual Device Cloud (VDC) and Real Device Cloud (RDC) with comprehensive test analytics and team collaboration features.
yade-mcp connects AI agents to YADE — the open-source discrete element method engine — through the Model Context Protocol. Browse API docs, run simulations, and execute code, all through natural conversation.
Enables natural language interaction with rasdaman multidimensional databases by translating tool calls into WCS/WCPS queries. It allows users to list coverages, retrieve metadata, and execute complex queries on datacubes through an LLM.
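To make the translation concrete, a WCPS query that subsets one axis of a datacube can be assembled as below; the coverage name `AvgLandTemp` and the `ansi` time axis are hypothetical examples, not guaranteed to exist on any given rasdaman instance.

```python
def wcps_subset(coverage: str, axis: str, lo: str, hi: str,
                fmt: str = "csv") -> str:
    """Build a WCPS query that trims one axis of a datacube and encodes
    the result in the requested format."""
    return (f'for $c in ({coverage}) '
            f'return encode($c[{axis}("{lo}":"{hi}")], "{fmt}")')

q = wcps_subset("AvgLandTemp", "ansi", "2014-01", "2014-12")
```

An LLM-facing tool call ("average land temperature for 2014") would be mapped to a string like `q` and sent to the rasdaman WCPS endpoint.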
An MCP server that helps AI assistants generate valid, accessible Adaptive Cards for Teams, Outlook, Copilot, and other Microsoft and non-Microsoft surfaces. 9 tools, 3 guided workflows, 924 tests to help you build an awesome AI experience.
Enables interaction with Foreman instances to manage systems through the Model Context Protocol. It provides access to Foreman resources and tools, such as security update reports, directly within AI-powered environments like VSCode and Claude Desktop.
A local academic tool that enables searching across nine academic sources, downloading PDFs, and performing AI-powered analysis of research papers. It also supports generating citation networks and recommending papers based on local workspace code.
Persistent memory engine for AI coding agents. Single Go binary, zero runtime dependencies, MCP-native. Stores, searches, and deduplicates memories across sessions using embedded SQLite with hybrid FTS + semantic search, memory decay, relation graph, and token-budget context assembly.
Provides plan state management and phase gate enforcement for AI development loops. It tracks task progress and coordinates the lifecycle of agents by requiring specific evidence before advancing through development phases.
Pay-per-use contextual guidance for AI agents. When an agent loses direction, it describes its state, pays 21 sats via Lightning or Arbitrum ETH, and receives its original purpose stripped of noise.