# askr

MCP Q&A Assistant

Async Q&A capabilities for AI clients via the MCP protocol, powered by any OpenAI-compatible API.

🤖 100% Vibe Coding — Built entirely through AI-assisted development.

⭐ If you like this project, please give it a star!
## Quick Start

```shell
# First time: launch the management panel to configure your provider
npx @sweatent/askr -m

# Start the MCP stdio server
npx @sweatent/askr
```

## Features
- `question` — Ask a single question with streaming support
- `agentq` — Ask multiple independent questions in parallel (configurable concurrency)
- `list` — View recent session statuses
- `check` — Retrieve session results with blocking wait, full content, and chain-of-thought support
## MCP Client Integration

### Claude Desktop

Add to `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "askr": {
      "command": "npx",
      "args": ["@sweatent/askr"]
    }
  }
}
```

### Claude Code

Add via CLI or global config:

```shell
claude mcp add askr -- npx @sweatent/askr
```

Or manually edit `.claude/settings.json`:

```json
{
  "mcpServers": {
    "askr": {
      "command": "npx",
      "args": ["@sweatent/askr"]
    }
  }
}
```

### Cursor / Other MCP Clients

Configure a stdio-type MCP server with the command `npx @sweatent/askr` per your client's documentation.
## Management Panel

```shell
npx @sweatent/askr -m
# or
npx @sweatent/askr --manage
```

Menu options:
- **Manage Provider** — Configure Base URL, API Key, and model (auto-fetch model list or manual input)
- **Manage Questions** — View/close active sessions
- **View Logs** — Browse session history with question, chain-of-thought, and answer (live refresh for running sessions)
- **More Settings** — Max concurrency, timeout, fold characters, system prompt, language
Supports Chinese and English interfaces.
## Configuration

The config file is stored in the system data directory:

| Platform | Path |
| --- | --- |
| Windows | |
| macOS | |
| Linux | |
```json
{
  "language": "en",
  "provider": {
    "baseUrl": "https://api.example.com",
    "apiKey": "sk-xxx",
    "model": "gpt-4",
    "stream": true
  },
  "settings": {
    "maxConcurrent": 5,
    "timeout": 150,
    "foldChars": 1000,
    "systemPrompt": "You are a search assistant..."
  }
}
```

### Base URL
- Standard: `https://api.example.com` — automatically appends `/v1/chat/completions`
- Custom endpoint: `https://api.example.com/custom/path#` — a trailing `#` means the URL is used as-is (the `#` is stripped)
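That resolution rule can be sketched as follows (the function name is illustrative, not askr's actual API):

```typescript
// Illustrative sketch of the Base URL rule above: a trailing "#" means
// "use this URL exactly as written"; otherwise the standard OpenAI
// chat-completions path is appended.
function resolveEndpoint(baseUrl: string): string {
  if (baseUrl.endsWith("#")) {
    return baseUrl.slice(0, -1); // strip the marker, use the URL as-is
  }
  return baseUrl.replace(/\/+$/, "") + "/v1/chat/completions";
}
```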
## Tool Reference

### question

```json
{ "content": "What is REST and its core principles?" }
```

Ask a single question. Answers exceeding `foldChars` are truncated; use `check` for the full content. Returns a session ID on timeout.
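The truncation behavior might look like this sketch (a hypothetical helper; askr's real message format may differ):

```typescript
// Hypothetical sketch of foldChars truncation: answers longer than the
// configured limit are cut off, and the caller is pointed at check for
// the full content.
function foldAnswer(answer: string, foldChars: number, sessionId: string): string {
  if (answer.length <= foldChars) return answer;
  return (
    answer.slice(0, foldChars) +
    `\n[truncated: use check with id "${sessionId}" and showFull: true]`
  );
}
```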
### agentq

```json
{ "questions": ["What is Docker?", "What is Kubernetes?"] }
```

Parallel questions, up to `maxConcurrent` at a time. Each question runs in an isolated context, returning its own result or session ID.
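Capping parallelism at `maxConcurrent` is commonly done with a small worker pool; this is an illustrative sketch, not askr's actual code:

```typescript
// Illustrative worker-pool sketch: run tasks in parallel, at most `limit`
// at a time, while preserving result order.
async function runLimited<T>(tasks: Array<() => Promise<T>>, limit: number): Promise<T[]> {
  const results: T[] = new Array(tasks.length);
  let next = 0;
  async function worker(): Promise<void> {
    while (next < tasks.length) {
      const i = next++; // claim the next task index
      results[i] = await tasks[i]();
    }
  }
  // Spawn at most `limit` workers; each pulls tasks until none remain.
  const workers = Array.from({ length: Math.min(limit, tasks.length) }, worker);
  await Promise.all(workers);
  return results;
}
```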
### list

```json
{ "count": 5 }
```

Returns the most recent N session summaries (ID, timestamp, status, question preview).
### check

```json
{ "id": "a3f8k2", "showFull": true, "showThinking": true }
```

Retrieve a session result. Blocks if the session is still running. Parameters:

- `showFull` — Return the full answer without truncation
- `showThinking` — Include chain-of-thought content
## Highlights

- Streaming — SSE streaming with real-time session file updates
- Auto fallback — Automatically falls back to non-streaming if streaming fails
- Chain-of-thought separation — Detects `<think>` tags, `reasoning_content`, and `thinking` fields
- Hot reload — Config changes from the management panel take effect immediately, no restart needed
- Session persistence — All Q&A records saved locally with full history browsing
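The chain-of-thought separation can be illustrated for the `<think>`-tag case with a small sketch (a hypothetical helper, not askr's actual code; `reasoning_content` and `thinking` arrive as separate response fields and need no text parsing):

```typescript
// Hypothetical sketch: split inline <think>…</think> reasoning out of a reply.
function extractThinking(text: string): { thinking: string; answer: string } {
  const match = text.match(/<think>([\s\S]*?)<\/think>/);
  if (!match) return { thinking: "", answer: text };
  return {
    thinking: match[1].trim(),
    answer: text.replace(match[0], "").trim(),
  };
}
```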
## Requirements

- Node.js >= 18
## Recommended Prompt

Add the following to your project's `CLAUDE.md` or other AI client system prompt for best results:

```markdown
## askr Usage Guidelines

You can use askr MCP tools to search and ask an external AI to assist with your tasks.

### When to Use

- **Pre-implementation research**: Before using unfamiliar or uncertain public APIs / library functions, query their signatures, parameters, usage, and caveats to avoid writing incorrect code from memory
- **Unfamiliar concepts**: Look up terms, protocols, or design patterns you don't fully understand
- **Latest information**: Query recently released versions, changelogs, news, or announcements that may not be covered in your training data
- **Troubleshooting**: When facing hard-to-solve errors or unexpected behavior, search for solutions including GitHub Issues, Stack Overflow discussions, etc.
- **Solution comparison**: When deciding between multiple technical approaches, query their trade-offs and community recommendations
- **Configuration & compatibility**: Look up platform-specific configurations or known compatibility issues for particular environments or versions

### Principles

1. Prefer using askr to search first; only rely on your own knowledge when you are fully confident it is accurate and up-to-date
2. Be specific in your questions, include necessary context (language, framework, version, etc.), and avoid vague queries
3. Combine related sub-questions into one question; use agentq for unrelated parallel questions
4. Use check to retrieve timed-out results; re-ask if check times out twice
5. Use check(id, showFull: true) when you need the full untruncated answer
```

## Acknowledgements
Thanks to the LinuxDo community for the support!
## License

MIT