When building a mobile app with Flutter, if you want to add novel AI-powered features, ncnn is an on-device inference framework worth considering.
This article is a hands-on walkthrough of running model inference with ncnn in a Flutter app. It covers:
- ncnn hands-on: environment setup, model conversion, and testing
- Flutter project hands-on: trying out this article's demo_ncnn
- Flutter project implementation
  - create an FFI plugin with Dart bindings to the C interface
  - create the app and use the plugin for inference on Linux
  - adapt the app so it builds and runs on Android
demo_ncnn code: https://github.com/ikuokuo/start-flutter/tree/main/demo_ncnn
ncnn hands-on
Preparing the ncnn environment
Get the ncnn source code and build it. The steps below are for Ubuntu:
# the demo uses prebuilt libraries of this release; best to keep versions in sync
export YYYYMMDD=20230517
git clone -b $YYYYMMDD --depth 1 https://github.com/Tencent/ncnn.git
# Build for Linux
# https://github.com/Tencent/ncnn/wiki/how-to-build#build-for-linux
sudo apt install build-essential git cmake libprotobuf-dev protobuf-compiler libvulkan-dev vulkan-tools libopencv-dev
cd ncnn/
git submodule update --init
mkdir -p build; cd build
# cmake -LAH ..
cmake -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_INSTALL_PREFIX=$HOME/ncnn-$YYYYMMDD \
-DNCNN_VULKAN=ON \
-DNCNN_BUILD_EXAMPLES=ON \
-DNCNN_BUILD_TOOLS=ON \
..
make -j$(nproc); make install
Configure the ncnn environment:
# symlink, so the version is easy to swap later
sudo ln -sfT $HOME/ncnn-$YYYYMMDD /usr/local/ncnn
cat <<-EOF >> ~/.bashrc
# ncnn
export NCNN_HOME=/usr/local/ncnn
export PATH=\$NCNN_HOME/bin:\$PATH
EOF
# test the tools (open a new shell or run: source ~/.bashrc)
ncnnoptimize
Test the YOLOX inference example:
# download the YOLOX ncnn model and extract it into the working directory ncnn/build/examples
# see the comments in ncnn/examples/yolox.cpp for details
# https://github.com/Megvii-BaseDetection/YOLOX/releases/download/0.1.1rc0/yolox_s_ncnn.tar.gz
tar -xzvf yolox_s_ncnn.tar.gz
# download the YOLOX test image and copy it into the working directory ncnn/build/examples
# https://github.com/Megvii-BaseDetection/YOLOX/blob/main/assets/dog.jpg
# enter the working directory
cd ncnn/build/examples
# run the YOLOX ncnn example
./yolox dog.jpg
ncnn model conversion
The YOLOX inference above used a model that had already been converted. To run inference on a model of your own, you need to know how the conversion is done.
Here we again use the YOLOX model as the example and walk through converting, modifying, and quantizing a model with ncnn. The steps follow the YOLOX/demo/ncnn instructions; in addition, ncnn/tools contains documentation for the various model conversion tools.
Step 1) Download the YOLOX model
- yolox_nano.onnx: the YOLOX-Nano ONNX model
Step 2) Convert the model with onnx2ncnn
# simplify the onnx model
# https://github.com/daquexian/onnx-simplifier
# pip3 install onnxsim
python3 -m onnxsim yolox_nano.onnx yolox_nano_sim.onnx
# convert onnx to ncnn
onnx2ncnn yolox_nano_sim.onnx yolox_nano.param yolox_nano.bin
The error "Unsupported slice step !" can be ignored: the Focus layer is already implemented in the demo's yolox.cpp.
Step 3) Modify yolox_nano.param
In yolox_nano.param, delete all the layers before the first Convolution, add a YoloV5Focus layer in their place, and update the layer count on the first line (291 becomes 286 here).
Before:
291 324
Input images 0 1 images
Split splitncnn_input0 1 4 images images_splitncnn_0 images_splitncnn_1 images_splitncnn_2 images_splitncnn_3
Crop 630 1 1 images_splitncnn_3 630 -23309=2,0,0 -23310=2,2147483647,2147483647 -23311=2,1,2
Crop 635 1 1 images_splitncnn_2 635 -23309=2,0,1 -23310=2,2147483647,2147483647 -23311=2,1,2
Crop 640 1 1 images_splitncnn_1 640 -23309=2,1,0 -23310=2,2147483647,2147483647 -23311=2,1,2
Crop 650 1 1 images_splitncnn_0 650 -23309=2,1,1 -23310=2,2147483647,2147483647 -23311=2,1,2
Concat Concat_40 4 1 630 640 635 650 683 0=0
Convolution Conv_41 1 1 683 1177 0=16 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=1728
After:
286 324
Input images 0 1 images
YoloV5Focus focus 1 1 images 683
Note: the onnx simplification is of little use here; it only merges some of the Crop layers that are going to be deleted anyway.
Step 4) Quantize the model with ncnnoptimize
Use ncnnoptimize to convert the weights to fp16, halving their size:
ncnnoptimize yolox_nano.param yolox_nano.bin yolox_nano_fp16.param yolox_nano_fp16.bin 65536
For int8 quantization, see Post Training Quantization Tools.
ncnn inference in practice
Change the model paths in detect_yolox() in ncnn/examples/yolox.cpp, rebuild, and test:
cd ncnn/build/examples
./yolox dog.jpg
demo_ncnn hands-on
demo_ncnn is the demo project for this article; you can run it and try it out.
Prepare the Flutter environment
Set up Flutter by following the official Get started guide.
Prepare the demo_ncnn project
Get the demo_ncnn source code:
git clone --depth 1 https://github.com/ikuokuo/start-flutter.git
In it:
- demo_ncnn/: a Flutter app that picks an image and runs ncnn inference
- plugins/ncnn_yolox/: a Flutter FFI plugin that runs YOLOX model inference with ncnn
Install the dependencies:
cd demo_ncnn/
flutter pub get
sudo apt-get install libclang-dev libomp-dev
Prepare the Linux prebuilt libraries and extract them into plugins/ncnn_yolox/linux/.
Prepare the Android prebuilt libraries and extract them into plugins/ncnn_yolox/android/.
Make sure the ncnn_DIR and OpenCV_DIR paths in ncnn_yolox/src/CMakeLists.txt are correct.
Try out the demo_ncnn project
Run it:
cd demo_ncnn/
flutter run
# or list devices and pick one with -d
flutter devices
flutter run -d linux
demo_ncnn implementation
The demo_ncnn implementation has two parts:
- a Flutter FFI plugin: Dart bindings to the C interface
- a Flutter app: the UI, which uses the plugin for inference
Create the FFI plugin
# create the FFI plugin
flutter create --org dev.flutter -t plugin_ffi --platforms=android,ios,linux ncnn_yolox
cd ncnn_yolox
# update the ffigen version
# otherwise you may get: Error: The type 'YoloX' must be 'base', 'final' or 'sealed'
flutter pub outdated
flutter pub upgrade --major-versions
After that, all that is left is to define and implement the C interface in src/ncnn_yolox.h, then generate the Dart bindings automatically with package:ffigen.
Step 1) Define the C interface
#ifdef __cplusplus
extern "C" {
#endif
FFI_PLUGIN_EXPORT typedef int yolox_err_t;
#define YOLOX_OK 0
#define YOLOX_ERROR -1
FFI_PLUGIN_EXPORT struct YoloX {
const char *model_path; // path to model file
const char *param_path; // path to param file
float nms_thresh; // nms threshold
float conf_thresh; // threshold of bounding box prob
float target_size; // target image size after resize, might use 416 for small model
};
// ncnn::Mat::PixelType
FFI_PLUGIN_EXPORT enum PixelType {
PIXEL_RGB = 1,
PIXEL_BGR = 2,
PIXEL_GRAY = 3,
PIXEL_RGBA = 4,
PIXEL_BGRA = 5,
};
FFI_PLUGIN_EXPORT struct Rect {
float x;
float y;
float w;
float h;
};
FFI_PLUGIN_EXPORT struct Object {
int label;
float prob;
struct Rect rect;
};
FFI_PLUGIN_EXPORT struct DetectResult {
int object_num;
struct Object *object;
};
FFI_PLUGIN_EXPORT struct YoloX *yoloxCreate();
FFI_PLUGIN_EXPORT void yoloxDestroy(struct YoloX *yolox);
FFI_PLUGIN_EXPORT struct DetectResult *detectResultCreate();
FFI_PLUGIN_EXPORT void detectResultDestroy(struct DetectResult *result);
FFI_PLUGIN_EXPORT yolox_err_t detectWithImagePath(
struct YoloX *yolox, const char *image_path, struct DetectResult *result);
FFI_PLUGIN_EXPORT yolox_err_t detectWithPixels(
struct YoloX *yolox, const uint8_t *pixels, enum PixelType pixelType,
int img_w, int img_h, struct DetectResult *result);
#ifdef __cplusplus
}
#endif
Step 2) Implement the C interface
The implementation in src/ncnn_yolox.cc follows ncnn/examples/yolox.cpp.
Step 3) Regenerate the Dart bindings
Regenerate lib/ncnn_yolox_bindings_generated.dart:
flutter pub run ffigen --config ffigen.yaml
To learn how Dart interoperates with C, see: C interop using dart:ffi.
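As a minimal illustration of the dart:ffi mechanics that ffigen automates, a hand-written binding for a single function might look like the sketch below (illustrative only; the app uses the generated NcnnYoloxBindings class instead):
import 'dart:ffi';

// Hypothetical hand-written binding for yoloxCreate(), shown only to
// illustrate what ffigen generates for us automatically.
typedef YoloxCreateNative = Pointer<Void> Function();
typedef YoloxCreateDart = Pointer<Void> Function();

void main() {
  final dylib = DynamicLibrary.open('libncnn_yolox.so');
  final yoloxCreate =
      dylib.lookupFunction<YoloxCreateNative, YoloxCreateDart>('yoloxCreate');
  final yolox = yoloxCreate();
  print('yolox handle: $yolox');
}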
Step 4) Prepare the dependency libraries
Prepare the ncnn and opencv prebuilt libraries:
- Linux, extract into linux/:
  - ncnn-YYYYMMDD-ubuntu-2204-shared.zip
  - opencv-mobile-4.6.0-ubuntu-2204.zip
- Android, extract into android/:
  - ncnn-YYYYMMDD-android-vulkan-shared.zip
  - opencv-mobile-4.6.0-android.zip
Step 5) Write the build script
The key part of src/CMakeLists.txt locates the prebuilt ncnn and OpenCV packages per platform and links them into the plugin library:
# packages
if(CMAKE_SYSTEM_NAME STREQUAL "Linux")
set(ncnn_DIR "${MY_PROJ}/linux/ncnn-20230517-ubuntu-2204-shared/lib/cmake")
set(OpenCV_DIR "${MY_PROJ}/linux/opencv-mobile-4.6.0-ubuntu-2204/lib/cmake")
elseif(CMAKE_SYSTEM_NAME STREQUAL "Android")
set(ncnn_DIR "${MY_PROJ}/android/ncnn-20230517-android-vulkan-shared/${ANDROID_ABI}/lib/cmake/ncnn")
set(OpenCV_DIR "${MY_PROJ}/android/opencv-mobile-4.6.0-android/sdk/native/jni")
else()
message(FATAL_ERROR "system not support: ${CMAKE_SYSTEM_NAME}")
endif()
if(NOT EXISTS ${ncnn_DIR})
message(FATAL_ERROR "ncnn_DIR not exists: ${ncnn_DIR}")
endif()
if(NOT EXISTS ${OpenCV_DIR})
message(FATAL_ERROR "OpenCV_DIR not exists: ${OpenCV_DIR}")
endif()
## ncnn
find_package(ncnn REQUIRED)
message(STATUS "ncnn_FOUND: ${ncnn_FOUND}")
## opencv
find_package(OpenCV 4 REQUIRED)
message(STATUS "OpenCV_VERSION: ${OpenCV_VERSION}")
message(STATUS "OpenCV_INCLUDE_DIRS: ${OpenCV_INCLUDE_DIRS}")
message(STATUS "OpenCV_LIBS: ${OpenCV_LIBS}")
# targets
include_directories(
${MY_PROJ}/src
${OpenCV_INCLUDE_DIRS}
)
## ncnn_yolox
add_library(ncnn_yolox SHARED
"ncnn_yolox.cc"
)
target_link_libraries(ncnn_yolox ncnn ${OpenCV_LIBS})
set_target_properties(ncnn_yolox PROPERTIES
PUBLIC_HEADER ncnn_yolox.h
OUTPUT_NAME "ncnn_yolox"
)
target_compile_definitions(ncnn_yolox PUBLIC DART_SHARED_LIB)
Test ncnn inference
First, put the prepared models into the assets directory, e.g.:
assets/
├── dog.jpg
├── yolox_nano_fp16.bin
└── yolox_nano_fp16.param
Then, on Linux, the C and Dart interface implementations can be tested directly.
Step 1) C interface test
std::string assets_dir("../assets/");
std::string image_path = assets_dir + "dog.jpg";
std::string model_path = assets_dir + "yolox_nano_fp16.bin";
std::string param_path = assets_dir + "yolox_nano_fp16.param";
auto yolox = yoloxCreate();
yolox->model_path = model_path.c_str();
yolox->param_path = param_path.c_str();
yolox->nms_thresh = 0.45;
yolox->conf_thresh = 0.25;
yolox->target_size = 416;
// yolox->target_size = 640;
auto detect_result = detectResultCreate();
auto err = detectWithImagePath(yolox, image_path.c_str(), detect_result);
if (err == YOLOX_OK) {
auto num = detect_result->object_num;
printf("yolox detect ok, num=%d\n", num);
for (int i = 0; i < num; i++) {
Object *obj = detect_result->object + i;
printf(" object[%d] label=%d prob=%.2f rect={x=%.2f y=%.2f w=%.2f h=%.2f}\n",
i, obj->label, obj->prob, obj->rect.x, obj->rect.y, obj->rect.w, obj->rect.h);
}
} else {
printf("yolox detect fail, err=%d\n", err);
}
draw_objects(image_path.c_str(), detect_result);
detectResultDestroy(detect_result);
yoloxDestroy(yolox);
Step 2) Dart interface test
final yoloxLib = NcnnYoloxBindings(dlopen('ncnn_yolox', 'build/shared'));
const assetsDir = '../assets';
final imagePath = '$assetsDir/dog.jpg'.toNativeUtf8();
final modelPath = '$assetsDir/yolox_nano_fp16.bin'.toNativeUtf8();
final paramPath = '$assetsDir/yolox_nano_fp16.param'.toNativeUtf8();
final yolox = yoloxLib.yoloxCreate();
yolox.ref.model_path = modelPath.cast();
yolox.ref.param_path = paramPath.cast();
yolox.ref.nms_thresh = 0.45;
yolox.ref.conf_thresh = 0.25;
yolox.ref.target_size = 416;
// yolox.ref.target_size = 640;
final detectResult = yoloxLib.detectResultCreate();
final err =
yoloxLib.detectWithImagePath(yolox, imagePath.cast(), detectResult);
if (err == YOLOX_OK) {
final num = detectResult.ref.object_num;
print('yolox detect ok, num=$num');
for (int i = 0; i < num; i++) {
var obj = detectResult.ref.object.elementAt(i).ref;
print(' object[$i] label=${obj.label}'
' prob=${obj.prob.toStringAsFixed(2)} rect=${obj.rect.str()}');
}
} else {
print('yolox detect fail, err=$err');
}
calloc.free(imagePath);
calloc.free(modelPath);
calloc.free(paramPath);
yoloxLib.detectResultDestroy(detectResult);
yoloxLib.yoloxDestroy(yolox);
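Note that dlopen() above is a small local helper, not part of dart:ffi (and rect.str() is just a tiny formatting extension on the generated Rect struct). A minimal sketch of such a helper, assuming it only resolves the platform-specific library file name inside the given directory:
import 'dart:ffi';
import 'dart:io' show Platform;
import 'package:path/path.dart' show join;

// Hypothetical helper: open lib<name>.so / <name>.dll / lib<name>.dylib
// from the given directory.
DynamicLibrary dlopen(String name, String dir) {
  final fileName = Platform.isWindows
      ? '$name.dll'
      : Platform.isMacOS
          ? 'lib$name.dylib'
          : 'lib$name.so';
  return DynamicLibrary.open(join(dir, fileName));
}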
Step 3) Run the tests
cd ncnn_yolox/linux
make
# cpp test
./build/ncnn_yolox_test
# dart test
dart ncnn_yolox_test.dart
Create the app and write the UI
Create the app project:
flutter create --project-name demo_ncnn --org dev.flutter --android-language java --ios-language objc --platforms=android,ios,linux demo_ncnn
The project adds the following dependencies (home_page.dart additionally uses the file_picker package for picking images):
cd demo_ncnn
dart pub add path logging image easy_debounce
flutter pub add mobx flutter_mobx provider path_provider
flutter pub add -d build_runner mobx_codegen
App state management uses MobX.
The app does just two things: pick an image and run inference. Two Store classes implement this:
- image_store.dart: given an image path, loads the image data asynchronously
- yolox_store.dart: given image data, detects objects in it asynchronously
Since both loading and detection take a while, MobX ObservableFuture is used to track the async state (the *.g.dart parts are generated with build_runner and mobx_codegen). These are the key pieces of the app; a different approach would work just as well. Both stores build on a small FutureStore wrapper, sketched next.
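A minimal sketch of such a FutureStore, written to be consistent with how it is used by the stores and home_page.dart below (the actual future_store.dart in the repo may differ):
import 'package:mobx/mobx.dart';

enum FutureState { empty, loading, loaded, error }

// Small wrapper holding an ObservableFuture plus its result and error,
// so widgets can switch on a simple FutureState.
class FutureStore<T> {
  ObservableFuture<T>? future;
  T? data;
  String? errorMessage;

  FutureState get futureState {
    if (errorMessage != null) return FutureState.error;
    final f = future;
    if (f == null) return FutureState.empty;
    if (f.status == FutureStatus.pending) return FutureState.loading;
    if (f.status == FutureStatus.rejected) return FutureState.error;
    return FutureState.loaded;
  }

  void reset() {
    future = null;
    data = null;
    errorMessage = null;
  }
}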
Use the plugin for inference
To use the plugin in the app, first add it as a dependency in pubspec.yaml:
dependencies:
  ncnn_yolox:
    path: ../plugins/ncnn_yolox
Then yolox_store.dart uses the plugin to run inference, much like the earlier Dart interface test. The main differences:
- The models in assets are first copied to a temporary path, because the app cannot get an absolute file path to bundled assets. (Alternatively, the C interface could be changed to accept the model as bytes.) A sketch of this copy helper follows the code listing below.
- The image data is copied from a Uint8List into a Pointer<Uint8>, because it has to move from the Dart heap into native memory. See the issues referenced in the code comments.
import 'dart:ffi';
import 'dart:io';
import 'package:ffi/ffi.dart';
import 'package:flutter/services.dart';
import 'package:image/image.dart' as img;
import 'package:mobx/mobx.dart';
import 'package:ncnn_yolox/ncnn_yolox_bindings_generated.dart' as yo;
import 'package:path/path.dart' show join;
import 'package:path_provider/path_provider.dart';
import '../util/image.dart';
import '../util/log.dart';
import 'future_store.dart';
part 'yolox_store.g.dart';
class YoloxStore = YoloxBase with _$YoloxStore;
class YoloxObject {
int label = 0;
double prob = 0;
Rect rect = Rect.zero;
}
class YoloxResult {
List<YoloxObject> objects = [];
Duration detectTime = Duration.zero;
}
abstract class YoloxBase with Store {
late yo.NcnnYoloxBindings _yolox;
YoloxBase() {
final dylib = Platform.isAndroid || Platform.isLinux
? DynamicLibrary.open('libncnn_yolox.so')
: DynamicLibrary.process();
_yolox = yo.NcnnYoloxBindings(dylib);
}
@observable
FutureStore<YoloxResult> detectFuture = FutureStore<YoloxResult>();
@action
Future detect(ImageData data) async {
try {
detectFuture.errorMessage = null;
detectFuture.future = ObservableFuture(_detect(data));
detectFuture.data = await detectFuture.future;
} catch (e) {
detectFuture.errorMessage = e.toString();
}
}
Future<YoloxResult> _detect(ImageData data) async {
final timebeg = DateTime.now();
// await Future.delayed(const Duration(seconds: 5));
final modelPath = await _copyAssetToLocal('assets/yolox_nano_fp16.bin',
package: 'ncnn_yolox', notCopyIfExist: false);
final paramPath = await _copyAssetToLocal('assets/yolox_nano_fp16.param',
package: 'ncnn_yolox', notCopyIfExist: false);
log.info('yolox modelPath=$modelPath');
log.info('yolox paramPath=$paramPath');
final modelPathUtf8 = modelPath.toNativeUtf8();
final paramPathUtf8 = paramPath.toNativeUtf8();
final yolox = _yolox.yoloxCreate();
yolox.ref.model_path = modelPathUtf8.cast();
yolox.ref.param_path = paramPathUtf8.cast();
yolox.ref.nms_thresh = 0.45;
yolox.ref.conf_thresh = 0.45;
yolox.ref.target_size = 416;
// yolox.ref.target_size = 640;
final detectResult = _yolox.detectResultCreate();
final pixels = data.image.getBytes(order: img.ChannelOrder.bgr);
// Pass Uint8List to Pointer<Void>
// https://github.com/dart-lang/ffi/issues/27
// https://github.com/martin-labanic/camera_preview_ffi_image_processing/blob/master/lib/image_worker.dart
final pixelsPtr = calloc.allocate<Uint8>(pixels.length);
for (int i = 0; i < pixels.length; i++) {
pixelsPtr[i] = pixels[i];
}
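// Note: an equivalent bulk copy would be
//   pixelsPtr.asTypedList(pixels.length).setAll(0, pixels);
// where asTypedList gives a Uint8List view over the native buffer.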
final err = _yolox.detectWithPixels(
yolox,
pixelsPtr,
yo.PixelType.PIXEL_BGR,
data.image.width,
data.image.height,
detectResult);
final objects = <YoloxObject>[];
if (err == yo.YOLOX_OK) {
final num = detectResult.ref.object_num;
for (int i = 0; i < num; i++) {
final o = detectResult.ref.object.elementAt(i).ref;
final obj = YoloxObject();
obj.label = o.label;
obj.prob = o.prob;
obj.rect = Rect.fromLTWH(o.rect.x, o.rect.y, o.rect.w, o.rect.h);
objects.add(obj);
}
}
calloc
..free(pixelsPtr)
..free(modelPathUtf8)
..free(paramPathUtf8);
_yolox.detectResultDestroy(detectResult);
_yolox.yoloxDestroy(yolox);
final result = YoloxResult();
result.objects = objects;
result.detectTime = DateTime.now().difference(timebeg);
return result;
}
// ...
}
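In the repo, _copyAssetToLocal() is a private method of YoloxBase that is elided above. A minimal sketch of what it might look like, assuming it just writes the bundled asset out to the temporary directory (it only uses imports already present in yolox_store.dart; the actual implementation may differ):
Future<String> _copyAssetToLocal(String name,
    {String? package, bool notCopyIfExist = true}) async {
  // Plugin assets are bundled under "packages/<package>/<asset key>".
  final key = package == null ? name : 'packages/$package/$name';
  final dir = await getTemporaryDirectory();
  final file = File(join(dir.path, name));
  if (notCopyIfExist && await file.exists()) return file.path;
  await file.create(recursive: true);
  final bytes = await rootBundle.load(key);
  await file.writeAsBytes(
      bytes.buffer.asUint8List(bytes.offsetInBytes, bytes.lengthInBytes));
  return file.path;
}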
Finally, it is used in the UI, home_page.dart:
class HomePage extends StatefulWidget {
const HomePage({super.key, required this.title});
final String title;
@override
State<HomePage> createState() => _HomePageState();
}
class _HomePageState extends State<HomePage> {
late ImageStore _imageStore;
late YoloxStore _yoloxStore;
late OptionStore _optionStore;
@override
void didChangeDependencies() {
_imageStore = Provider.of<ImageStore>(context);
_yoloxStore = Provider.of<YoloxStore>(context);
_optionStore = Provider.of<OptionStore>(context);
_imageStore.load();
super.didChangeDependencies();
}
void _pickImage() async {
final result = await FilePicker.platform.pickFiles(type: FileType.image);
if (result == null) return;
final file = result.files.first;
_imageStore.load(imagePath: file.path);
}
void _detectImage() {
if (_imageStore.loadFuture.futureState != FutureState.loaded) return;
_yoloxStore.detect(_imageStore.loadFuture.data!);
}
@override
Widget build(BuildContext context) {
const pad = 20.0;
return Scaffold(
appBar: AppBar(
backgroundColor: Theme.of(context).colorScheme.inversePrimary,
title: Text(widget.title),
),
body: Padding(
padding: const EdgeInsets.all(pad),
child: Column(
mainAxisAlignment: MainAxisAlignment.spaceBetween,
crossAxisAlignment: CrossAxisAlignment.stretch,
children: [
// image and detection results
Expanded(
flex: 1,
child: Observer(builder: (context) {
if (_imageStore.loadFuture.futureState ==
FutureState.loading) {
return const Center(child: CircularProgressIndicator());
}
if (_imageStore.loadFuture.errorMessage != null) {
return Center(
child: Text(_imageStore.loadFuture.errorMessage!));
}
final data = _imageStore.loadFuture.data;
if (data == null) {
return const Center(child: Text('Image load null :('));
}
_yoloxStore.detectFuture.reset();
return Container(
decoration: BoxDecoration(
border: Border.all(color: Colors.orangeAccent)),
child: DetectResultPage(imageData: data),
);
})),
const SizedBox(height: pad),
// three buttons: pick image, detect, toggle bounding boxes
Row(
mainAxisAlignment: MainAxisAlignment.center,
children: [
Expanded(
child: ElevatedButton(
child: const Text('Pick image'),
onPressed: () => _debounce('_pickImage', _pickImage),
),
),
const SizedBox(width: pad),
Expanded(
child: ElevatedButton(
child: const Text('Detect objects'),
onPressed: () => _debounce('_detectImage', _detectImage),
),
),
const SizedBox(width: pad),
Expanded(
child: Observer(builder: (context) {
return ElevatedButton.icon(
icon: Icon(_optionStore.bboxesVisible
? Icons.check_box_outlined
: Icons.check_box_outline_blank),
label: const Text('Bounding boxes'),
onPressed: () => _optionStore
.setBboxesVisible(!_optionStore.bboxesVisible),
);
}),
),
],
),
],
),
),
);
}
}
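The _debounce() helper used by the buttons above is not shown; a minimal sketch, assuming it simply delegates to package:easy_debounce (the tag and delay are illustrative):
import 'package:easy_debounce/easy_debounce.dart';
import 'package:flutter/widgets.dart';

// Debounce rapid taps: only the last call within 300 ms for a given tag runs.
void _debounce(String tag, VoidCallback fn) =>
    EasyDebounce.debounce(tag, const Duration(milliseconds: 300), fn);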
Adapting the Android project
The Android build script is android/build.gradle; it also uses CMake and shares src/CMakeLists.txt with Linux. However, minSdkVersion must be raised to 24 in order to use Vulkan.
Vulkan is supported from Android 7.0 (Nougat), API level 24 or higher; see NDK / Get started with Vulkan.
The plugins/ncnn_yolox/android/build.gradle configuration:
android {
    defaultConfig {
        minSdkVersion 24
        ndk {
            moduleName "ncnn_yolox"
            abiFilters "armeabi-v7a", "arm64-v8a", "x86", "x86_64"
        }
    }
}
demo_ncnn/android/app/build.gradle likewise needs minSdkVersion changed to 24.
Finally, run it with flutter run. For more, see Build and release an Android app.
Adapting the iOS project
This article's project has not been adapted for iOS.
Note that Xcode 14 no longer supports submitting apps that contain bitcode, and Flutter also removed bitcode support after 3.3.x; see Creating an iOS Bitcode enabled app.