qi140-mcp-multi
Multi-purpose MCP server for Atendimento AI (Docker)
Keeps the POST /webhook/atendimento-ai endpoint with the same payload as the current flow [2].
Prompt loaded from prompts/prompt.txt, equivalent to the one used in N8N [1].
Input/output logs stored in PostgreSQL.
OpenAPI/Swagger documentation at /docs.
Usage
Configure .env (or use the defaults): cp .env.example .env
Start with Docker: docker-compose up --build
Test: curl --location 'http://localhost:8080/webhook/atendimento-ai' \
  --header 'Content-Type: application/json' \
  --data '{ "body": { "texts": [ { "cn": { "id": "x" }, "patient_id": "123" } ] } }'
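The payload shape implied by the sample request above can be sketched as a TypeScript type. The field names come from the curl example; the full schema may include other fields, so treat this as an assumption:

```typescript
// Hypothetical payload type inferred from the sample request only.
interface AtendimentoRequest {
  body: {
    texts: Array<{
      cn: { id: string };
      patient_id: string;
    }>;
  };
}

// Small helper illustrating how the payload might be traversed.
function extractPatientIds(req: AtendimentoRequest): string[] {
  return req.body.texts.map((t) => t.patient_id);
}

// The sample body from the curl example above.
const sample: AtendimentoRequest = {
  body: { texts: [{ cn: { id: "x" }, patient_id: "123" }] },
};

const ids = extractPatientIds(sample); // ["123"]
```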
LLM providers
GENERIC: set LLM_BASE_URL, LLM_PORT, LLM_MODEL (any OpenAI-compatible API).
OPENAI: set PROVIDER=OPENAI and OPENAI_API_KEY (and LLM_MODEL).
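One way the two provider modes could resolve to a request configuration is sketched below. The env var names are the ones listed above; the resolution logic, defaults, and LlmConfig shape are assumptions, not the server's actual implementation:

```typescript
// Sketch of provider selection from env vars (PROVIDER, LLM_* , OPENAI_API_KEY).
// Defaults below are hypothetical placeholders.
interface LlmConfig {
  baseUrl: string;
  model: string;
  apiKey?: string;
}

function resolveLlmConfig(env: Record<string, string | undefined>): LlmConfig {
  if (env.PROVIDER === "OPENAI") {
    // OPENAI mode: fixed OpenAI endpoint, key from OPENAI_API_KEY.
    return {
      baseUrl: "https://api.openai.com/v1",
      model: env.LLM_MODEL ?? "gpt-4o-mini",
      apiKey: env.OPENAI_API_KEY,
    };
  }
  // GENERIC mode: any OpenAI-compatible endpoint built from LLM_BASE_URL/LLM_PORT.
  const port = env.LLM_PORT ? `:${env.LLM_PORT}` : "";
  return {
    baseUrl: `${env.LLM_BASE_URL ?? "http://localhost"}${port}/v1`,
    model: env.LLM_MODEL ?? "default",
  };
}

const openai = resolveLlmConfig({ PROVIDER: "OPENAI", OPENAI_API_KEY: "k", LLM_MODEL: "m" });
const generic = resolveLlmConfig({ LLM_BASE_URL: "http://llm", LLM_PORT: "9000", LLM_MODEL: "x" });
```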
Database
The pgdata volume persists the data.
Schema created by db/init/01_schema.sql.
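As an illustration only, an input/output log table of the kind described above might look like the following. The table and column names here are hypothetical; the authoritative schema is the one in db/init/01_schema.sql:

```sql
-- Hypothetical sketch; see db/init/01_schema.sql for the real schema.
CREATE TABLE IF NOT EXISTS ai_interaction_logs (
  id         BIGSERIAL PRIMARY KEY,
  request    JSONB NOT NULL,                      -- incoming webhook payload
  response   JSONB,                               -- LLM output
  created_at TIMESTAMPTZ NOT NULL DEFAULT now()   -- insertion time
);
```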
Swagger
GET /docs e /docs/openapi.json
MCP note
Optional startup (MCP_ENABLE=true).
The file src/mcp/server.ts contains a placeholder for registering tools/resources per the SDK.
Referências
Flow and prompt (ResumoClínico) [1]
Sample request and /webhook/atendimento-ai route [2]