English | [简体中文](README_ch.md)

# PP-OCR Deployment

- [PP-OCR Deployment](#pp-ocr-deployment)
  - [Paddle Deployment Introduction](#paddle-deployment-introduction)
  - [PP-OCR Deployment](#pp-ocr-deployment-1)

<a name="1"></a>
## Paddle Deployment Introduction

Paddle provides a variety of deployment schemes to meet the requirements of different scenarios. Please choose one according to your actual situation:

<div align="center">
    <img src="../doc/deployment_en.png" width="800">
</div>

<a name="2"></a>
## PP-OCR Deployment

PP-OCR supports multiple deployment schemes. Click a link below for the corresponding tutorial.

- [Python Inference](../doc/doc_en/inference_ppocr_en.md)
- [C++ Inference](./cpp_infer/readme.md)
- [Serving (Python/C++)](./pdserving/README.md)
- [Paddle-Lite (ARM CPU/OpenCL ARM GPU)](./lite/readme.md)
- [Paddle2ONNX](./paddle2onnx/readme.md)

If you need deployment tutorials for academic algorithm models other than PP-OCR, please go to the main page of the corresponding algorithm: [entrance](../doc/doc_en/algorithm_overview_en.md).
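As a quick orientation before picking a scheme, the sketch below shows the simplest route: the `paddleocr` pip package, which wraps the Python inference path linked above. The image path `example.jpg` is a placeholder; the `PaddleOCR(...)` arguments shown are common defaults, not a complete configuration.

```python
def format_line(text: str, score: float) -> str:
    """Render one recognized text line as 'text (score)'."""
    return f"{text} ({score:.2f})"

try:
    from paddleocr import PaddleOCR  # pip install paddleocr

    # Loads detection, angle-classification, and recognition models on first use.
    ocr = PaddleOCR(use_angle_cls=True, lang="en")
    result = ocr.ocr("example.jpg", cls=True)
    for box, (text, score) in result[0]:
        print(format_line(text, score))
except ImportError:
    print("paddleocr is not installed; run `pip install paddleocr` first")
```

For production workloads, prefer one of the dedicated schemes listed above (C++ inference, serving, Paddle-Lite, or ONNX export) over the Python wrapper.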
