HyperStore MCP
Plug 6,500+ AI apps into any LLM via the Model Context Protocol.
HyperStore is a curated directory of 6,500+ AI applications, developed by HyperGPT. This MCP server exposes the HyperStore catalog to any LLM client — Claude, ChatGPT, Cursor, Windsurf, Cline, Zed, Gemini, and anything else that speaks MCP.
Ask your LLM:
"Find me a free AI tool that summarises PDFs."
"Compare ChatGPT, Claude, and Gemini side-by-side."
"Show me the top 5 image-generation apps with an API."
The LLM calls HyperStore MCP behind the scenes and answers with up-to-date, curated results.
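Under the hood, the client issues a JSON-RPC `tools/call` request over MCP. Here is a minimal sketch of such a payload; the tool name `search_apps` and its argument names are illustrative assumptions, not necessarily the server's actual names:

```python
import json

def build_tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# NOTE: "search_apps" and the "query" argument are hypothetical examples.
msg = build_tool_call("search_apps", {"query": "free PDF summarizer"})
```

Your LLM client builds and sends messages like this for you; you never have to write them by hand.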
What you get
8 tools:
- Full-text keyword search
- Embedding-based semantic search
- Full app detail (features, screenshots, pricing)
- Paginated apps with filters (category, pricing)
- Browse all 30+ categories
- Apps within a category
- A-Z directory listing
- Trending + top categories overview
3 resources:
- `hyperstore://app/{slug}` — markdown rendering of any app
- `hyperstore://category/{slug}` — top apps in a category
- `hyperstore://catalog` — full category index
3 prompts:
- `find_tool_for_task` — guided discovery for a task
- `compare_apps` — side-by-side app comparison
- `discover_category` — explore a topic
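The `hyperstore://` resource URIs follow standard URI syntax, so a client can pick them apart with stdlib tools. A quick sketch (the slug `chatgpt` is just a hypothetical example):

```python
from urllib.parse import urlparse

# Parse a HyperStore resource URI into its parts.
# NOTE: the slug "chatgpt" is a hypothetical example.
uri = "hyperstore://app/chatgpt"
parsed = urlparse(uri)

scheme = parsed.scheme          # "hyperstore"
resource_type = parsed.netloc   # "app"
slug = parsed.path.lstrip("/")  # "chatgpt"
```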
Install
Option A — uvx (zero install, recommended)
Requires uv. One command and you're done:
```
uvx hyperstore-mcp
```

Option B — pipx
```
pipx install hyperstore-mcp
hyperstore-mcp
```

Option C — Docker (for remote hosting)
```
docker run --rm -p 8080:8080 ghcr.io/deficlow/hyperstore-mcp
# Now MCP Streamable HTTP at http://localhost:8080/mcp
```

Option D — Hosted endpoint (no install)
Use our managed Streamable HTTP server:
```
https://mcp.store.hypergpt.ai/mcp
```

Connect from your LLM client
Claude Desktop
Edit ~/Library/Application Support/Claude/claude_desktop_config.json
(macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):
```json
{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
```

Restart Claude → tools appear in the 🛠 menu.
Claude Code
```
claude mcp add hyperstore -- uvx hyperstore-mcp
```

Cursor
.cursor/mcp.json (project) or ~/.cursor/mcp.json (global):
```json
{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
```

Windsurf
~/.codeium/windsurf/mcp_config.json:
```json
{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
```

Cline (VS Code)
settings.json:
```json
{
  "cline.mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
```

Zed
~/.config/zed/settings.json:
```json
{
  "context_servers": {
    "hyperstore": {
      "command": {
        "path": "uvx",
        "args": ["hyperstore-mcp"]
      }
    }
  }
}
```

Gemini CLI
~/.gemini/settings.json:
```json
{
  "mcpServers": {
    "hyperstore": {
      "command": "uvx",
      "args": ["hyperstore-mcp"]
    }
  }
}
```

ChatGPT (Pro / Team / Enterprise)
Settings → Connectors → Add custom connector:
Name: HyperStore
MCP Server URL: `https://mcp.store.hypergpt.ai/mcp`
Authentication: None
OpenAI Responses API
```python
from openai import OpenAI

client = OpenAI()
response = client.responses.create(
    model="gpt-4.1",
    tools=[{
        "type": "mcp",
        "server_label": "hyperstore",
        "server_url": "https://mcp.store.hypergpt.ai/mcp",
        "require_approval": "never",
    }],
    input="Find me 3 free AI tools for writing unit tests.",
)
print(response.output_text)
```

Anthropic Messages API
```python
from anthropic import Anthropic

client = Anthropic()
response = client.messages.create(
    model="claude-opus-4-7",
    max_tokens=1024,
    mcp_servers=[{
        "type": "url",
        "url": "https://mcp.store.hypergpt.ai/mcp",
        "name": "hyperstore",
    }],
    messages=[{"role": "user", "content": "Top 5 AI image generators?"}],
)
```

See examples/ for ready-to-paste configs for every supported client.
Run as a remote server
```
# Streamable HTTP (modern, ChatGPT/OpenAI/Anthropic)
hyperstore-mcp --transport http --host 0.0.0.0 --port 8080

# Legacy SSE (older MCP clients)
hyperstore-mcp --transport sse --port 8080
```

The hosted endpoint at https://mcp.store.hypergpt.ai runs the Docker image behind a CDN — no auth, rate-limited per IP.
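A Streamable HTTP client talks JSON-RPC over POST. Here is a hedged sketch of the opening `initialize` request; the header names follow the MCP Streamable HTTP transport, but the exact `protocolVersion` string is an assumption you should check against the spec revision your client targets:

```python
import json

MCP_URL = "https://mcp.store.hypergpt.ai/mcp"

# The MCP Streamable HTTP transport requires the client to accept both
# plain JSON and SSE responses.
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}

# JSON-RPC "initialize" request that opens an MCP session.
# NOTE: the protocolVersion value below is an assumption.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

body = json.dumps(initialize)
```

POST `body` with those headers to `MCP_URL` using any HTTP client; the server replies with its capabilities and a session id for subsequent requests.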
Configuration
All settings come from environment variables (see .env.example):
- Upstream API base URL
- HTTP timeout in seconds
- User-Agent string
- Bind host (`http`/`sse` transports only)
- Bind port (`http`/`sse` transports only)
- Logging level
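Typed settings are read straight from the environment; the pattern looks like this (the variable names below are placeholders, the real names are listed in `.env.example`):

```python
import os

# NOTE: "HS_TIMEOUT" and "HS_LOG_LEVEL" are placeholder names; see
# .env.example in the repository for the real variable names.
timeout = float(os.environ.get("HS_TIMEOUT", "10"))
log_level = os.environ.get("HS_LOG_LEVEL", "INFO").upper()
```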
Development
```
git clone https://github.com/deficlow/HyperStore-MCP
cd HyperStore-MCP
uv sync --all-extras
uv run pytest
uv run hyperstore-mcp   # stdio mode for local testing
```

Inspect the running server with the official MCP Inspector:

```
npx @modelcontextprotocol/inspector uvx hyperstore-mcp
```

How it works
HyperStore MCP is a thin async wrapper around the HyperStore public REST API. It is read-only — no credentials, no writes, no PII. The same data that powers the website powers the MCP server. Updates land in your LLM the moment they land on the site.
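The wrapper pattern described above is easy to picture: each MCP tool translates into one read-only GET against the public API. A sketch, assuming a hypothetical `/search` path and `q`/`limit` parameter names (neither is the server's documented API surface):

```python
import asyncio
import json
import urllib.parse
import urllib.request

API_BASE = "https://store.hypergpt.ai/api"

def build_search_url(query: str, limit: int = 10) -> str:
    """Build an upstream search URL.
    NOTE: the /search path and the q/limit parameter names are
    illustrative assumptions, not the real API."""
    params = urllib.parse.urlencode({"q": query, "limit": limit})
    return f"{API_BASE}/search?{params}"

async def search(query: str) -> dict:
    """Read-only fetch: no credentials, no writes, mirroring the server."""
    url = build_search_url(query)
    raw = await asyncio.to_thread(
        lambda: urllib.request.urlopen(url, timeout=10).read()
    )
    return json.loads(raw)
```

`asyncio.run(search("pdf summarizer"))` would issue the request; keeping the URL builder pure makes the wrapper trivial to unit-test without the network.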
```
LLM client ──MCP──▶ hyperstore-mcp ──HTTPS──▶ store.hypergpt.ai/api
```

License
MIT © HyperGPT