cpp_local_deployment.en.md
# C++ Local Deployment

Linux: [C++ Local Deployment for General OCR Pipeline - Linux](./OCR.en.md)

Windows: [C++ Local Deployment for General OCR Pipeline - Windows](./OCR_windows.en.md)


## MCP directory API

We provide all the information about MCP servers via our MCP API.

```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/PaddlePaddle/PaddleOCR'
```
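The same lookup can be done from a script. The sketch below uses only Python's standard library and the endpoint URL shown in the curl example; it assumes the response body is JSON, and it does not assume any particular field names since the page does not document the response schema.

```python
# Minimal sketch: fetch the MCP directory entry for PaddlePaddle/PaddleOCR.
# Assumption: the endpoint returns a JSON body (not confirmed by this page).
import json
import urllib.request

URL = "https://glama.ai/api/mcp/v1/servers/PaddlePaddle/PaddleOCR"

with urllib.request.urlopen(URL) as resp:
    payload = json.loads(resp.read().decode("utf-8"))

# Print the raw response; field names vary by API version, so none are assumed here.
print(json.dumps(payload, indent=2))
```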

If you have feedback or need assistance with the MCP directory API, please join our Discord server.