1. Click "Install Server".
2. Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
3. In the chat, type `@` followed by the MCP server name and your instructions, e.g., "@mcp-ToseaAI Turn the attached project-specs.pdf into a 10-slide presentation."
4. That's it! The server will respond to your query, and you can continue using it as needed.
# mcp-ToseaAI

Official MCP server for ToseaAI document-to-presentation workflows.
This server wraps the production ToseaAI HTTP contract at `/api/mcp/v1` and exposes a stable MCP tool surface for Claude Code, Cursor, Codex, and other MCP clients.

- API keys stay server-side and are never echoed back to the agent.
- Long-running operations use `presentation_id` plus polling, not raw SSE.
- Mutating tools support explicit idempotency keys where the backend supports them.
- File uploads stay local until the MCP server streams them to ToseaAI over HTTPS.
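The polling model above can be sketched as follows. This is a minimal illustration, not the package's actual code: `pollUntilDone`, `getStatus`, and the `JobStatus` shape are assumed names for this example.

```typescript
// Sketch: poll a presentation's status by presentation_id until it reaches
// a terminal state. `getStatus` stands in for a hypothetical status request.
type JobStatus = { state: "pending" | "completed" | "failed"; result?: unknown };

async function pollUntilDone(
  getStatus: (presentationId: string) => Promise<JobStatus>,
  presentationId: string,
  intervalMs = 2000,
  maxMs = 300_000,
): Promise<JobStatus> {
  const deadline = Date.now() + maxMs;
  while (true) {
    const status = await getStatus(presentationId);
    // Terminal states end the loop; anything else keeps polling.
    if (status.state === "completed" || status.state === "failed") return status;
    if (Date.now() >= deadline) throw new Error(`poll timeout for ${presentationId}`);
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```

Bounding the loop with a deadline (mirroring `TOSEA_MAX_POLL_MS` below) keeps a stuck backend job from hanging the MCP client indefinitely.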
## Why a separate repo

This repository should stay independent from the main application repository:

- Release cadence is different from the backend.
- Breaking changes to tool names and prompts must be versioned separately.
- Nested git repos or submodules add unnecessary operational friction for MCP users.
## Install

```shell
npm install
npm run build
```

Required environment variables:

```
TOSEA_API_KEY=sk_...
TOSEA_API_BASE_URL=https://tosea.ai
```

Optional:

- `TOSEA_TIMEOUT_MS`
- `TOSEA_MAX_RETRIES`
- `TOSEA_POLL_INTERVAL_MS`
- `TOSEA_MAX_POLL_MS`
- `TOSEA_LOG_LEVEL`
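A minimal sketch of how the optional variables might be read with fallbacks. The default values and the `intFromEnv` helper are assumptions for illustration, not the package's actual defaults; in the real server the record would be `process.env`.

```typescript
// Parse an integer env var, falling back when unset and failing loudly
// when the value is not a number.
function intFromEnv(
  env: Record<string, string | undefined>,
  key: string,
  fallback: number,
): number {
  const raw = env[key];
  if (raw === undefined) return fallback;
  const n = Number.parseInt(raw, 10);
  if (Number.isNaN(n)) throw new Error(`${key} must be an integer, got "${raw}"`);
  return n;
}

// Illustrative usage with a literal record standing in for process.env.
const config = {
  timeoutMs: intFromEnv({ TOSEA_TIMEOUT_MS: "45000" }, "TOSEA_TIMEOUT_MS", 30_000),
  maxRetries: intFromEnv({}, "TOSEA_MAX_RETRIES", 3),
};
```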
## Claude Code example

```json
{
  "mcpServers": {
    "tosea": {
      "command": "node",
      "args": ["C:/new/mcp-ToseaAI/dist/index.js"],
      "env": {
        "TOSEA_API_KEY": "sk_...",
        "TOSEA_API_BASE_URL": "https://tosea.ai"
      }
    }
  }
}
```

Client-specific examples live in `examples/README.md`.
## Cursor example

Use `examples/cursor.mcp.json` as the starting point for your local `mcp.json`.
## OpenAI Agents SDK example

OpenAI's Agents SDK supports stdio MCP servers, so this repo can be used directly as a local subprocess MCP without needing a hosted HTTP wrapper. See `examples/openai-agents-typescript.ts`.
If you later need OpenAI Responses API hosted remote MCP mode, add a separate Streamable HTTP transport wrapper instead of changing this stdio package in place.
## Tool summary

- `tosea_health`
- `tosea_get_permissions_summary`
- `tosea_get_quota_status`
- `tosea_list_presentations`
- `tosea_get_presentation_full_data`
- `tosea_parse_pdf`
- `tosea_generate_outline`
- `tosea_edit_outline_page`
- `tosea_render_slides`
- `tosea_edit_slide_page`
- `tosea_export_presentation`
- `tosea_pdf_to_presentation`
- `tosea_wait_for_job`
- `tosea_list_exports`
- `tosea_list_export_files`
- `tosea_redownload_export`
## Reliability model

- `GET` requests use bounded retries with backoff and jitter.
- Upload-creating endpoints do not auto-retry because the current backend does not accept idempotency keys on those routes.
- Outline edit, slide edit, and export support `idempotency_key`; reuse the same value only when retrying the same logical action.
- `tosea_wait_for_job` polls until `completed` or `failed`, then returns the final job payload as JSON.
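The bounded retry behavior can be sketched as exponential backoff with full jitter. This is an illustration of the technique, not the package's implementation: the function names, base delay, and cap are assumptions.

```typescript
// Full-jitter backoff: pick a uniform delay in [0, min(cap, base * 2^attempt)).
// Jitter spreads retries out so concurrent clients do not retry in lockstep.
function backoffDelayMs(
  attempt: number,
  baseMs = 250,
  capMs = 8000,
  random: () => number = Math.random,
): number {
  const ceiling = Math.min(capMs, baseMs * 2 ** attempt);
  return random() * ceiling;
}

// Run `fn` with up to `maxRetries` retries; rethrow the last error on exhaustion.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxRetries = 3,
  sleep: (ms: number) => Promise<void> = (ms) => new Promise((r) => setTimeout(r, ms)),
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < maxRetries) await sleep(backoffDelayMs(attempt));
    }
  }
  throw lastError;
}
```

Wrapping only `GET` calls in `withRetries`, as the bullet list describes, keeps retries safe: reads are idempotent, while uploads without idempotency keys are not.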
## Security notes

- API keys must start with `sk_`.
- The server redacts bearer secrets from surfaced errors.
- The MCP tool layer does not expose JWT-only account operations.
- Export history only exposes user-visible files returned by the backend.
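The key-prefix check and redaction can be sketched as below. The `sk_` prefix is stated in the notes above; the exact regexes and replacement text are assumptions for illustration.

```typescript
// Strip bearer tokens and sk_-prefixed keys from error text before it is
// surfaced to the agent.
function redactSecrets(message: string): string {
  return message
    .replace(/Bearer\s+\S+/g, "Bearer [REDACTED]")
    .replace(/sk_[A-Za-z0-9_-]+/g, "sk_[REDACTED]");
}

// Reject keys that do not carry the expected sk_ prefix (plus some payload).
function isValidApiKey(key: string): boolean {
  return key.startsWith("sk_") && key.length > 3;
}
```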
## Smoke test

This repository includes a non-billing smoke test that checks auth, health, permissions, and list access without creating presentations:

```shell
npm run smoke
```

Optional flags:

- `--feature-key outline_generate`
- `--expect-tier pro`
- `--list-limit 5`