
Object Detection YOLOv5 - Encrypting an ncnn Model: C++ Wrapper Library and Android Calling Example

flyfish

Preface

The source code download link is at the end of this article.
The idea is to package the model and the critical code entirely into a library, producing a static library (.a) or a shared library (.so); on Windows this would be a .lib or .dll.
The upper-layer application only uses the library plus a single header file, so to the application developer the model and the critical code are invisible, which achieves the goal of encryption.

For more encryption methods, see nihui's article on how to encrypt ncnn models.
This repo includes the C++ library qt_android_ncnn_lib and the Android example that calls it, qt_android_ncnn_example.
The code demonstrates how the model is encrypted and how Android calls the library.

Model version

YOLOv5 6.2

Library versions

The example is built for arm64-v8a and deployed on Android.
OpenCV 4.6.0 download link

ncnn 20220420 download link
The CPU-only static library build of ncnn is used here.

The code directory layout is as follows:

Build environment for the example program

An Android app developed with Qt C++
Qt 6.2.3
Qt Creator 6.0.2
android-ndk-r21e
The entire implementation involves no Java, only C++.
The code directory layout is as follows:

Preparation

Model conversion

After a YOLOv5 6.2 model has finished training:
1. First export it to ONNX format using this repo:
shaoshengsong/yolov5_62_export_ncnn (github.com)

2. Then convert the ONNX model to ncnn format following this document:

https://flyfish.blog.csdn.net/article/details/127669328

3. Finally, use ncnn's ncnn2mem tool to convert the ncnn model into the param.bin plus memory-header form:

$ ncnn2mem yolov5s_6.2.param yolov5s_6.2.bin yolov5s_6.2.id.h yolov5s_6.2.mem.h

The two generated files we mainly use are yolov5s_6.2.id.h and yolov5s_6.2.mem.h.
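
For reference, yolov5s_6.2.id.h maps every blob and layer name to an integer index inside a namespace, while yolov5s_6.2.mem.h carries the model itself as byte arrays. A rough sketch of the id header is shown below; the blob names match the extract calls later in this post, but the index values are placeholders, not the actual generated ones:

// Rough shape of the ncnn2mem-generated yolov5s_6.2.id.h (index values here are placeholders).
namespace yolov5s_6_2_param_id {
const int BLOB_images = 0;    // network input blob
const int BLOB_output = 133;  // stride-8 detection head
const int BLOB_353 = 154;     // stride-16 detection head
const int BLOB_367 = 175;     // stride-32 detection head
// ... plus LAYER_* constants for every layer in the param file
} // namespace yolov5s_6_2_param_id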

Building the library

Taking a static library as an example, add the following to the implementation file of the YOLOv5 class:

#include "yolov5s_6.2.id.h"
#include "yolov5s_6.2.mem.h"

These two files are not visible to the upper-layer application developer.

The configuration is as follows.

Main interface

class YOLOv5
{
public:
    int init(int bgr_rgb, float prob_threshold, float nms_threshold);
    int inference(const cv::Mat& bgr, std::vector<Object>& objects);
    cv::Mat draw_objects(const cv::Mat& bgr, const std::vector<Object>& objects);

private:
    ncnn::Net* model_ = nullptr;
    int bgr_rgb_ = 0;              // channel order of the input image: 0 = BGR, 1 = RGB
    float prob_threshold_ = 0.25f; // confidence threshold
    float nms_threshold_ = 0.45f;  // NMS IoU threshold
};
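
The Object type used by these methods is not defined anywhere in the post; in the standard ncnn YOLOv5 sample it is a small struct along the following lines, which the shipped YOLOv5.h would also need to declare (reproduced here as an assumption):

// Detection result type as used in the standard ncnn YOLOv5 example
// (assumed to match the library's YOLOv5.h).
struct Object
{
    cv::Rect_<float> rect;  // bounding box in original-image coordinates
    int label;              // class index
    float prob;             // confidence score
};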

Model initialization

int YOLOv5::init(int bgr_rgb, float prob_threshold, float nms_threshold)
{
    bgr_rgb_ = bgr_rgb;                 // channel order of the input image: 0 = BGR, 1 = RGB
    prob_threshold_ = prob_threshold;
    nms_threshold_ = nms_threshold;

    // reference: https://github.com/Tencent/ncnn/wiki/ncnn-load-model
    model_ = new ncnn::Net();

    model_->load_param(yolov5s_6_2_param_bin);
    model_->load_model(yolov5s_6_2_bin);

    return 0;
}

 

The two variables yolov5s_6_2_param_bin and yolov5s_6_2_bin are declared in yolov5s_6.2.mem.h as:

static const unsigned char yolov5s_6_2_bin[]
static const unsigned char yolov5s_6_2_param_bin[] 
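
One caveat: init() allocates the ncnn::Net with new, but the post does not show the matching cleanup. A minimal destructor sketch (assumed, not part of the original code):

// Assumed cleanup, not shown in the post: release the network created in init().
YOLOv5::~YOLOv5()
{
    delete model_;   // ncnn::Net's destructor releases the loaded network
    model_ = nullptr;
}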

Inference input and outputs

ex.input(yolov5s_6_2_param_id::BLOB_images, in_pad);
ex.extract(yolov5s_6_2_param_id::BLOB_output, out);
ex.extract(yolov5s_6_2_param_id::BLOB_353, out);
ex.extract(yolov5s_6_2_param_id::BLOB_367, out);
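
Here ex is an ncnn::Extractor created from the embedded network, and in the standard ncnn YOLOv5 pipeline each detection head is extracted into its own ncnn::Mat. A condensed sketch of how these calls sit inside inference() (the variable names besides the blob ids are assumptions):

// Condensed sketch based on the standard ncnn YOLOv5 example (not the library's exact code).
ncnn::Extractor ex = model_->create_extractor();
ex.input(yolov5s_6_2_param_id::BLOB_images, in_pad);    // letterboxed, normalized input image

ncnn::Mat out_s8, out_s16, out_s32;                     // one output per detection head
ex.extract(yolov5s_6_2_param_id::BLOB_output, out_s8);  // stride-8 head
ex.extract(yolov5s_6_2_param_id::BLOB_353, out_s16);    // stride-16 head
ex.extract(yolov5s_6_2_param_id::BLOB_367, out_s32);    // stride-32 head
// Each head is then decoded into candidate boxes and filtered with NMS
// using prob_threshold_ and nms_threshold_.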

CMakeLists.txt configuration

cmake_minimum_required(VERSION 3.14)
project(yolov5lib LANGUAGES CXX)


set(CMAKE_INCLUDE_CURRENT_DIR ON)
set(CMAKE_AUTOUIC ON)
set(CMAKE_AUTOMOC ON)
set(CMAKE_AUTORCC ON)
set(CMAKE_CXX_STANDARD 11)
set(CMAKE_CXX_STANDARD_REQUIRED ON)


include_directories(
${CMAKE_SOURCE_DIR}/opencv4.6.0/native/jni/include/opencv2
${CMAKE_SOURCE_DIR}/opencv4.6.0/native/jni/include
${CMAKE_SOURCE_DIR}/ncnn20220420/include
${CMAKE_SOURCE_DIR}/ncnn20220420/include/ncnn

)


set(STATIC_LIBS
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_calib3d.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_gapi.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_objdetect.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_core.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_highgui.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_photo.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_dnn.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_imgcodecs.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_stitching.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_features2d.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_imgproc.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_video.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_flann.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_ml.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/staticlibs/arm64-v8a/libopencv_videoio.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/libtbb.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/libIlmImf.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/liblibjpeg-turbo.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/liblibtiff.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/libtegra_hal.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/libade.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/liblibopenjp2.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/liblibwebp.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/libcpufeatures.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/liblibpng.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/libquirc.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/libittnotify.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/liblibprotobuf.a
        ${CMAKE_SOURCE_DIR}/opencv4.6.0/native/3rdparty/libs/arm64-v8a/libtbb.a
        ${CMAKE_SOURCE_DIR}/ncnn20220420/lib/libncnn.a

)


add_library(yolov5lib STATIC
  YOLOv5.cpp
  YOLOv5.h
  yolov5s_6.2.id.h
  yolov5s_6.2.mem.h
)
target_link_libraries(yolov5lib PRIVATE  ${STATIC_LIBS})
target_compile_definitions(yolov5lib  PRIVATE YOLOv5_LIBRARY)

 

The build output is the static library libyolov5lib.a.


For the remaining details, see the full source code (link at the end of this article).

Example

What the example code is given is just the YOLOv5.h header and the libyolov5lib.a library.
The model is no longer visible, because it is already embedded inside the libyolov5lib.a file.

yolov5_.init(1, 0.25, 0.45);                           // initialize: RGB input, prob threshold 0.25, NMS threshold 0.45
std::vector<Object> objects;
yolov5_.inference(orig_img, objects);                  // run inference
cv::Mat ret = yolov5_.draw_objects(orig_img, objects); // draw the results

The detection results are stored in objects, and ret is the image used for display.
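
Putting the application side together, a minimal standalone sketch is shown below; loading a test image with cv::imread/cv::imwrite is an assumption for illustration, since the real example feeds images through the Qt UI:

#include <vector>
#include <opencv2/opencv.hpp>
#include "YOLOv5.h"   // the only header shipped alongside libyolov5lib.a

int main()
{
    YOLOv5 yolov5;
    yolov5.init(0, 0.25f, 0.45f);               // 0 = BGR input (cv::imread order), prob 0.25, NMS 0.45

    cv::Mat orig_img = cv::imread("test.jpg");  // hypothetical test image
    if (orig_img.empty())
        return -1;

    std::vector<Object> objects;
    yolov5.inference(orig_img, objects);        // run detection with the embedded model
    cv::Mat ret = yolov5.draw_objects(orig_img, objects);
    cv::imwrite("result.jpg", ret);             // save the annotated image
    return 0;
}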


Source code for this article:
https://github.com/shaoshengsong/qt_android_ncnn_lib_encrypt_example


Reposted from: https://blog.csdn.net/flyfish1986/article/details/127982597