# accessibility-mcp
Integrates with Ollama to provide natural-language semantic search and RAG (Retrieval-Augmented Generation) capabilities, enabling users to query WAI-ARIA Authoring Practices Guide (APG) patterns and examples using embeddings.
A Model Context Protocol server (built with FastMCP) that exposes WAI-ARIA Authoring Practices Guide (APG) patterns: narrative requirements, keyboard/ARIA guidance as Markdown, official example source (HTML, CSS, JS) from `w3c/aria-practices`, and optional RAG via Ollama + LangChain.js (`apg_semantic_search`, after `npm run rag:index`).
This server covers the APG (widget patterns), not the full WCAG spec. For WCAG success criteria text, use the W3C's WCAG materials separately; APG is the right source for patterns such as Carousel and for the patterns index.
## Dataset
- `npm run ingest` — shallow-clones `w3c/aria-practices` into `.cache/` and writes:
  - `data/manifest.json` — compact index (ids, titles, example slugs, bundle paths)
  - `data/patterns/<id>.md` — pattern doc as Markdown (from `*-pattern.html`)
  - `data/bundles/<id>/<example>.json` — referenced HTML/CSS/JS per demo (binary assets listed but omitted)
- `npm run rag:index` — (after ingest + `.env`) calls Ollama embeddings and writes `data/rag/chunks.json`: chunked pattern docs plus one combined text blob per example (HTML/CSS/JS). Re-run after ingest or when you change `OLLAMA_EMBEDDING_MODEL`.

### Vendored data

`data/` is intended to be committed (or published in the npm package) so the server works without a network. The RAG file can be large; commit it or regenerate it per machine.
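The chunking step that `npm run rag:index` performs can be pictured with a small sketch. The function name, chunk size, and overlap below are illustrative assumptions, not the script's real parameters:

```typescript
// Hypothetical sketch of chunking a pattern's Markdown into overlapping
// windows before embedding. Sizes and names are illustrative only.
function chunkText(text: string, size = 1000, overlap = 200): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // last window reached the end
    start += size - overlap; // step forward, keeping some shared context
  }
  return chunks;
}
```

Each chunk would then be embedded and stored alongside its pattern id in `data/rag/chunks.json`, ready for similarity search at query time.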
## Usage
```shell
npm install
npm run ingest       # refresh from GitHub (re-run when you want newer APG)
npm run rag:index    # build embeddings (needs Ollama + OLLAMA_EMBEDDING_MODEL)
npm run build
npm start            # stdio MCP server
```

### npx (after publish to npm)

```shell
npx -y accessibility-mcp
```

After the package is on npm, most clients can use `command` + `args` with `npx -y accessibility-mcp` instead of a local node path.
## Environment (.env)
At startup the server loads `.env` from the package root (same folder as `package.json`). Copy `.env.example` → `.env` and adjust.
| Variable | Purpose |
| --- | --- |
| `OLLAMA_BASE_URL` | Ollama HTTP API root, e.g. `http://localhost:11434` |
| `OLLAMA_CHAT_MODEL` | Model id for the LangChain.js chat helpers |
| `OLLAMA_EMBEDDING_MODEL` | Model id for embeddings (`npm run rag:index`, `apg_semantic_search`) |
| `APG_MCP_DATA_DIR` | Optional. Directory that contains the generated dataset (`manifest.json`, `patterns/`, `bundles/`) |
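Loading a `.env` file amounts to simple `KEY=value` parsing. A minimal sketch for illustration (the package's real `loadEnv()` helper may use a library such as dotenv and behave differently):

```typescript
// Illustrative-only .env parser: skips blanks and comments, splits on the
// first "=", and strips one layer of surrounding quotes from values.
function parseDotenv(contents: string): Record<string, string> {
  const env: Record<string, string> = {};
  for (const line of contents.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith("#")) continue; // blank or comment
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue; // not a KEY=value line
    const key = trimmed.slice(0, eq).trim();
    const value = trimmed.slice(eq + 1).trim().replace(/^["']|["']$/g, "");
    env[key] = value;
  }
  return env;
}
```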
On `npm start`, the CLI calls Ollama `GET /api/tags`, then `POST /api/pull` (streaming) for any configured chat/embedding model that is not already present. Progress lines go to stderr only, so stdout stays valid for MCP stdio.
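The startup check above boils down to comparing configured model ids against the `GET /api/tags` response. A sketch, assuming the `{ models: [{ name }] }` response shape that Ollama's API documents (the helper itself is illustrative, not the server's actual code):

```typescript
// Decide which configured models still need pulling, given /api/tags output.
interface TagsResponse {
  models: { name: string }[];
}

function missingModels(configured: string[], tags: TagsResponse): string[] {
  // Ollama reports names with tags (e.g. "llama3:latest"); also accept the
  // bare name so "llama3" matches "llama3:latest".
  const present = new Set(
    tags.models.flatMap((m) => [m.name, m.name.replace(/:latest$/, "")])
  );
  return configured.filter((model) => !present.has(model));
}
```

Each missing model would then be pulled via `POST /api/pull`, with progress written to stderr so stdout remains a clean MCP stdio channel.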
Ollama + LangChain.js helpers (for RAG scripts or future MCP tools):

- `loadEnv()` — load `.env` explicitly (also runs via `getOllamaConfig()` / `resolveDataDir()`).
- `getOllamaConfig()` — parsed `{ baseUrl, chatModel, embeddingModel }`.
- `ensureOllamaModels()` — `GET /api/tags` + `POST /api/pull` for missing models (same as MCP startup).
- `createChatOllama()` / `createOllamaEmbeddings()` — `@langchain/ollama` instances using those settings.
```ts
import { createChatOllama, createOllamaEmbeddings } from "accessibility-mcp";
```

The `apg_semantic_search` tool calls Ollama at query time (embed query → cosine similarity vs `data/rag/chunks.json`). `npm run rag:index` builds that index with `createOllamaEmbeddings()`.
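The query-time scoring described above (cosine similarity between a query embedding and stored chunk embeddings) can be sketched as follows; the `Chunk` shape and `rankChunks` helper are illustrative assumptions about what `data/rag/chunks.json` holds, not the tool's real internals:

```typescript
// One stored chunk: a slice of a pattern doc plus its embedding vector.
interface Chunk {
  patternId: string;
  text: string;
  embedding: number[];
}

// Standard cosine similarity; the "|| 1" guards against zero-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Rank all chunks against the query embedding and keep the top matches.
function rankChunks(query: number[], chunks: Chunk[], topK = 5): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, topK);
}
```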
Sanity check (requires Ollama reachable at `OLLAMA_BASE_URL` with `OLLAMA_CHAT_MODEL` pulled):

```shell
cp .env.example .env   # then edit if needed
npm run ollama:smoke
```

## IDE and agent setup
MCP wiring differs by product: some use a top-level `mcpServers` object; VS Code uses `servers` inside `mcp.json`. Below, replace `/absolute/path/to/accessibility-mcp` with your clone path (or use npx once published).
### Shared snippets
**Stdio via local build** (`mcpServers` shape — Cursor, Claude Desktop, Claude Code, Gemini CLI):
```json
{
  "mcpServers": {
    "apg-patterns": {
      "command": "node",
      "args": ["/absolute/path/to/accessibility-mcp/dist/cli.js"]
    }
  }
}
```

**Stdio via npx** (after npm publish):
```json
{
  "mcpServers": {
    "apg-patterns": {
      "command": "npx",
      "args": ["-y", "accessibility-mcp"]
    }
  }
}
```

**Custom data directory** (any client that supports `env` on the server process):
```json
{
  "mcpServers": {
    "apg-patterns": {
      "command": "node",
      "args": ["/absolute/path/to/accessibility-mcp/dist/cli.js"],
      "env": {
        "APG_MCP_DATA_DIR": "/absolute/path/to/accessibility-mcp/data"
      }
    }
  }
}
```

### Visual Studio Code (GitHub Copilot agent / MCP)
VS Code stores MCP config in `mcp.json` using a `servers` object (not `mcpServers`). See *Add and manage MCP servers in VS Code* and the MCP configuration reference.
- Workspace: `.vscode/mcp.json`
- User: Command Palette → **MCP: Open User Configuration**
Example (local checkout):
```json
{
  "servers": {
    "apg-patterns": {
      "type": "stdio",
      "command": "node",
      "args": ["/absolute/path/to/accessibility-mcp/dist/cli.js"]
    }
  }
}
```

Example (npx, after publish):
```json
{
  "servers": {
    "apg-patterns": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "accessibility-mcp"]
    }
  }
}
```

You can also use **MCP: Add Server** in the Command Palette, or install from the Extensions view (`@mcp` gallery) if this server is listed there.
### Cursor
Cursor merges MCP config from:

- Project: `.cursor/mcp.json`
- Global: `~/.cursor/mcp.json` (project entries override global)
Use the `mcpServers` JSON shape from the shared snippets above. See *Model Context Protocol (MCP)* in the Cursor docs. Restart Cursor after changes if tools do not appear.
### Claude Desktop
Edit the Claude Desktop config file and merge under `mcpServers`:

| OS | Typical path |
| --- | --- |
| macOS | `~/Library/Application Support/Claude/claude_desktop_config.json` |
| Windows | `%APPDATA%\Claude\claude_desktop_config.json` |
| Linux | `~/.config/Claude/claude_desktop_config.json` |
Use the shared mcpServers snippet. Restart Claude Desktop after saving.
### Claude Code
Claude Code supports project `.mcp.json`, user `~/.claude/settings.json`, and other scopes; stdio servers use the same `mcpServers` structure. See *Connect Claude Code to tools via MCP* and *Claude Code settings*.
CLI (stdio server):

```shell
claude mcp add apg-patterns -- node /absolute/path/to/accessibility-mcp/dist/cli.js
```

(or, with npx after publish: `claude mcp add apg-patterns -- npx -y accessibility-mcp`)
### Gemini CLI
Configure `mcpServers` in the Gemini CLI settings. User vs project scope:

- User: `~/.gemini/settings.json`
- Project: `.gemini/settings.json` in the repo
Details: MCP servers with the Gemini CLI.
CLI (stdio; user scope — writes `~/.gemini/settings.json`):

```shell
gemini mcp add --scope user apg-patterns node /absolute/path/to/accessibility-mcp/dist/cli.js
```

Use `--scope project` to write `.gemini/settings.json` instead. Run `gemini mcp add --help` for flags (`-e` for env, `--trust`, etc.).
### OpenAI Codex (CLI and IDE extension)
Codex stores MCP servers in `config.toml` (default `~/.codex/config.toml`; project `.codex/config.toml` on trusted projects). The CLI and IDE extension share this file. See *Model Context Protocol – Codex*.
TOML example (stdio):

```toml
[mcp_servers.apg-patterns]
command = "node"
args = ["/absolute/path/to/accessibility-mcp/dist/cli.js"]
```

CLI:

```shell
codex mcp add apg-patterns -- node /absolute/path/to/accessibility-mcp/dist/cli.js
```

### Other editors
- Windsurf / JetBrains / etc.: if the product documents MCP stdio support, reuse the same `command`/`args` as above; the wrapper key name may differ, so check that product's MCP docs.
- VS Code discovery: with `chat.mcp.discovery.enabled`, VS Code can pick up MCP definitions from some other apps (e.g. Claude Desktop). See the VS Code MCP article.
## Tools
| Tool | Purpose |
| --- | --- |
|  | Source commit, generation time, patterns index URL |
|  | All pattern ids/titles |
|  | Markdown spec + example list |
|  | Example sources |
| `apg_semantic_search` | RAG: natural-language search (after `npm run rag:index`) |
## Resources

- `apg://manifest` — full manifest JSON
- `apg://pattern/{patternId}` — pattern Markdown
- `apg://example/{patternId}/{slug}` — example sources as Markdown
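Dispatching the resource URIs above amounts to matching a URI against those three templates. A sketch for illustration (the server's actual routing, via FastMCP resource templates, may differ):

```typescript
// Parse an apg:// URI into one of the three resource kinds, or null.
type ApgResource =
  | { kind: "manifest" }
  | { kind: "pattern"; patternId: string }
  | { kind: "example"; patternId: string; slug: string }
  | null;

function parseApgUri(uri: string): ApgResource {
  if (uri === "apg://manifest") return { kind: "manifest" };
  const pattern = uri.match(/^apg:\/\/pattern\/([^/]+)$/);
  if (pattern) return { kind: "pattern", patternId: pattern[1] };
  const example = uri.match(/^apg:\/\/example\/([^/]+)\/([^/]+)$/);
  if (example) return { kind: "example", patternId: example[1], slug: example[2] };
  return null; // not an APG resource URI
}
```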
## License
ISC (this package). APG content is W3C documentation; see W3C document license.