This code is derived (with some modifications) from the PaddleOCR project: https://github.com/PaddlePaddle/PaddleOCR/tree/release/2.6/deploy/cpp_infer
How to install the paddle_inference SDK (by downloading the tar package from the official website)?
- Choose the right tar file according to your CUDA version and GPU.
- Put the tar file anywhere and unzip it.
- Run `build/install_paddle_inference.sh`; it will install paddle_inference automatically. First, modify the root path of paddle_inference on the first line of the script.
- This repo uses CUDA 11.1, cuDNN 8.0.5, and TensorRT 7.2.1.
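The install steps above can be sketched as the following shell session. The tar filename and install root are assumptions; substitute the package that matches your CUDA/cuDNN/TensorRT versions.

```shell
# Hypothetical filename and install root -- adjust to your download.
PADDLE_TAR=paddle_inference.tgz
INSTALL_ROOT=$HOME/paddle_inference

if [ -f "$PADDLE_TAR" ]; then
    # Unpack the SDK anywhere you like.
    mkdir -p "$INSTALL_ROOT"
    tar -xzf "$PADDLE_TAR" -C "$INSTALL_ROOT"
    # After pointing the first line of the script at $INSTALL_ROOT, run:
    sh build/install_paddle_inference.sh
else
    echo "Download the paddle_inference tar package from the official website first."
fi
```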
How to build?
- `cd ./build` and run `sh ./build.sh`.
- It will generate a `*.so` library named `libpaddle_ocr.so`.
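The build steps, end to end, assuming you start from the repository root:

```shell
BUILD_DIR=./build
if [ -d "$BUILD_DIR" ]; then
    cd "$BUILD_DIR"
    sh ./build.sh
    # The build should produce this shared library:
    ls -l libpaddle_ocr.so
else
    echo "run these commands from the repository root"
fi
```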
How to run?
- Make sure you have built paddle_ocr correctly.
- Change the model paths in `./main/main.cpp` to your specific values.
- Open `./main/main.cpp` and click the `Run` button in vscode, then choose the launch item at the top of the window (`paddle_ocr`).
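If you prefer the command line to the vscode `Run` button, a sketch of launching the built demo directly; the binary path below is an assumption, adjust it to your actual build layout.

```shell
# Hypothetical binary path -- the vscode launch item ultimately runs a built executable.
BIN=./build/main

if [ -x "$BIN" ]; then
    # Make sure the loader can find libpaddle_ocr.so.
    LD_LIBRARY_PATH=./build:$LD_LIBRARY_PATH "$BIN"
else
    echo "build the project first, then rerun"
fi
```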