Asked by ora*_*nge. Tags: c++, gradle, android-ndk, android-studio, tensorflow-lite
Inference times with `implementation 'org.tensorflow:tensorflow-lite:+'` under my `build.gradle` dependencies are not great, so now I want to use TFLite from Android's NDK.
So I built an exact copy of the Java app in Android Studio's NDK, and now I'm trying to include the TFLite libraries in the project. I followed TensorFlow Lite's Android guide and built the TFLite library locally (and got an AAR file), and included the library in my NDK project in Android Studio.
Now I'm trying to use the TFLite library in my C++ file, by `#include`-ing it in the code, but I get an error message: `cannot find tensorflow` (or any other name I try, according to the name I give it in my `CMakeLists.txt` file).
App `build.gradle`:
apply plugin: 'com.android.application'

android {
    compileSdkVersion 29
    buildToolsVersion "29.0.3"

    defaultConfig {
        applicationId "com.ndk.tflite"
        minSdkVersion 28
        targetSdkVersion 29
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
        externalNativeBuild {
            cmake {
                cppFlags ""
            }
        }
        ndk {
            abiFilters 'arm64-v8a'
        }
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }

    // tf lite
    aaptOptions {
        noCompress "tflite"
    }

    externalNativeBuild {
        cmake {
            path "src/main/cpp/CMakeLists.txt"
            version "3.10.2"
        }
    }
}

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'androidx.appcompat:appcompat:1.1.0'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.1'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0'
    // tflite build
    compile(name:'tensorflow-lite', ext:'aar')
}
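(As an editorial aside: the `compile(name:'tensorflow-lite', ext:'aar')` line relies on the long-deprecated `compile` configuration together with a `flatDir` repository. With newer Gradle versions the same local AAR can be declared directly; a sketch, assuming the AAR is placed in `app/libs`:

```groovy
dependencies {
    // ... other dependencies as above ...
    // Local TFLite AAR; 'implementation files(...)' replaces the deprecated
    // compile(name:..., ext:'aar') + flatDir combination.
    implementation files('libs/tensorflow-lite.aar')
}
```

This does not change the NDK/#include problem described below; it only modernizes the dependency declaration.)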
Project `build.gradle`:
buildscript {
    repositories {
        google()
        jcenter()
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.6.2'
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        // native tflite
        flatDir {
            dirs 'libs'
        }
    }
}

task clean(type: Delete) {
    delete rootProject.buildDir
}
CMakeLists.txt:
cmake_minimum_required(VERSION 3.4.1)

add_library( # Sets the name of the library.
             native-lib
             # Sets the library as a shared library.
             SHARED
             # Provides a relative path to your source file(s).
             native-lib.cpp )

add_library( # Sets the name of the library.
             tensorflow-lite
             # Sets the library as a shared library.
             SHARED
             # Provides a relative path to your source file(s).
             native-lib.cpp )

find_library( # Sets the name of the path variable.
              log-lib
              # Specifies the name of the NDK library that
              # you want CMake to locate.
              log )

target_link_libraries( # Specifies the target library.
                       native-lib tensorflow-lite
                       # Links the target library to the log library
                       # included in the NDK.
                       ${log-lib} )
native-lib.cpp:
#include <jni.h>
#include <string>

#include "tensorflow"

extern "C" JNIEXPORT jstring JNICALL
Java_com_xvu_f32c_1jni_MainActivity_stringFromJNI(
        JNIEnv* env,
        jobject /* this */) {
    std::string hello = "Hello from C++";
    return env->NewStringUTF(hello.c_str());
}

class FlatBufferModel {
    // Build a model based on a file. Return a nullptr in case of failure.
    static std::unique_ptr<FlatBufferModel> BuildFromFile(
            const char* filename,
            ErrorReporter* error_reporter);

    // Build a model based on a pre-loaded flatbuffer. The caller retains
    // ownership of the buffer and should keep it alive until the returned object
    // is destroyed. Return a nullptr in case of failure.
    static std::unique_ptr<FlatBufferModel> BuildFromBuffer(
            const char* buffer,
            size_t buffer_size,
            ErrorReporter* error_reporter);
};
I also tried to follow these, but in my case I used Bazel to build the TFLite libraries.
Trying to build the classification demo (label_image), I managed to build it and `adb push` it to my device, but when trying to run it I got the following error:
ERROR: Could not open './mobilenet_quant_v1_224.tflite'.
Failed to mmap model ./mobilenet_quant_v1_224.tflite
- Adding `android_sdk_repository`/`android_ndk_repository` in `WORKSPACE` got me an error: `WORKSPACE:149:1: Cannot redefine repository after any load statement in the WORKSPACE file (for repository 'androidsdk')`, and locating these statements at different places resulted in the same error.
- I deleted these changes to `WORKSPACE` and continued with zimenglyu's post: I compiled `libtensorflowLite.so` and edited `CMakeLists.txt` so that the `libtensorflowLite.so` file was referenced, but left the FlatBuffer part out. The Android project compiled successfully, but there was no evident change; I still can't include any TFLite libraries.
- Trying to compile TFL, I added a `cc_binary` to `tensorflow/tensorflow/lite/BUILD` (following the label_image example):
cc_binary(
    name = "native-lib",
    srcs = [
        "native-lib.cpp",
    ],
    linkopts = tflite_experimental_runtime_linkopts() + select({
        "//tensorflow:android": [
            "-pie",
            "-lm",
        ],
        "//conditions:default": [],
    }),
    deps = [
        "//tensorflow/lite/c:common",
        "//tensorflow/lite:framework",
        "//tensorflow/lite:string_util",
        "//tensorflow/lite/delegates/nnapi:nnapi_delegate",
        "//tensorflow/lite/kernels:builtin_ops",
        "//tensorflow/lite/profiling:profiler",
        "//tensorflow/lite/tools/evaluation:utils",
    ] + select({
        "//tensorflow:android": [
            "//tensorflow/lite/delegates/gpu:delegate",
        ],
        "//tensorflow:android_arm64": [
            "//tensorflow/lite/delegates/gpu:delegate",
        ],
        "//conditions:default": [],
    }),
)
and tried to build it for `x86_64` and `arm64-v8a`, but I got an error: `cc_toolchain_suite rule @local_config_cc//:toolchain: cc_toolchain_suite '@local_config_cc//:toolchain' does not contain a toolchain for cpu 'x86_64'`.
Checking `external/local_config_cc/BUILD` at line 47 (which gave the error):
cc_toolchain_suite(
    name = "toolchain",
    toolchains = {
        "k8|compiler": ":cc-compiler-k8",
        "k8": ":cc-compiler-k8",
        "armeabi-v7a|compiler": ":cc-compiler-armeabi-v7a",
        "armeabi-v7a": ":cc-compiler-armeabi-v7a",
    },
)
These are the only 2 `cc_toolchain`s found. Searching the repository for "cc-compiler-", I only found "aarch64", which I assume is for 64-bit ARM, but nothing for "x86_64". There is "x64_windows", though, and I'm on Linux.
Trying to build with aarch64, like so:
bazel build -c opt --fat_apk_cpu=aarch64 --cpu=aarch64 --host_crosstool_top=@bazel_tools//tools/cpp:toolchain //tensorflow/lite/java:tensorflow-lite
resulted in the error:
ERROR: /.../external/local_config_cc/BUILD:47:1: in cc_toolchain_suite rule @local_config_cc//:toolchain: cc_toolchain_suite '@local_config_cc//:toolchain' does not contain a toolchain for cpu 'aarch64'
I was able to build the library for `x86_64` by changing the soname in the build config and using full paths in `CMakeLists.txt`. This resulted in a `.so` shared library. Also, I was able to build the library for `arm64-v8a` using the TFLite Docker container, by adjusting the `aarch64_makefile.inc` file; I didn't change any build options and let `build_aarch64_lib.sh` build whatever it builds. This resulted in a `.a` static library.
So now I have two TFLite libraries, but I'm still unable to use them (I can't `#include "..."` anything, for example).
When trying to build the project, using only `x86_64` works fine, but trying to include the `arm64-v8a` library results in a ninja error: `'.../libtensorflow-lite.a', needed by '.../app/build/intermediates/cmake/debug/obj/armeabi-v7a/libnative-lib.so', missing and no known rule to make it`.
- I created a directory structure similar to the `lite` directory in `app/src/main/cpp`, in which I include the (A) tensorflow, (B) absl and (C) flatbuffers files.
- I changed the `#include "tensorflow/...` lines in all of tensorflow's header files to relative paths so the compiler can find them.
- In the app's `build.gradle` I added a no-compression line for `.tflite` files: `aaptOptions { noCompress "tflite" }`.
- I added an `assets` directory to the app.
- In `native-lib.cpp` I added some example code from the TFLite website.

Trying to build the project (for `arm64-v8a`), I get an error:
/path/to/Android/Sdk/ndk/20.0.5594570/toolchains/llvm/prebuilt/linux-x86_64/sysroot/usr/include/c++/v1/memory:2339: error: undefined reference to 'tflite::impl::Interpreter::~Interpreter()'
In `<memory>`, line 2339 is the `delete __ptr;` line:
_LIBCPP_INLINE_VISIBILITY void operator()(_Tp* __ptr) const _NOEXCEPT {
    static_assert(sizeof(_Tp) > 0,
                  "default_delete can not delete incomplete type");
    static_assert(!is_void<_Tp>::value,
                  "default_delete can not delete incomplete type");
    delete __ptr;
}
How can I include the TFLite libraries in Android Studio, so I can run TFLite inference from the NDK?
Alternatively, how can I use gradle (currently with cmake) to build and compile the source files?
I got Native TFLite working with the C-API in the following way:
1. Change the file type of the downloaded `.aar` file to `.zip` and unzip it to get the shared library (`.so` file)
2. Download all header files from the `c` directory of the TFLite repository
3. Create a `jni` directory (New -> Folder -> JNI Folder) in `app/src/main` and create architecture sub-directories in it (`arm64-v8a` or `x86_64`, for example)
4. Put all header files in the `jni` directory (next to the architecture directories), and put the shared library in the architecture directory/ies
5. Open the `CMakeLists.txt` file and include an `add_library` stanza for the TFLite library, the path to the shared library in a `set_target_properties` stanza, and the headers in an `include_directories` stanza (see the NOTES section below)
6. In `native-lib.cpp`, include the headers, for example:
#include "../jni/c_api.h"
#include "../jni/common.h"
#include "../jni/builtin_ops.h"
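To make the layout described in the steps above concrete, here is a small shell sketch; the ABI names come from the steps, while the header and library file names are illustrative:

```shell
# Sketch of the jni/ layout described in the steps above (paths illustrative).
mkdir -p app/src/main/jni/arm64-v8a app/src/main/jni/x86_64
# The downloaded headers (c_api.h, common.h, builtin_ops.h, ...) go directly
# under app/src/main/jni/, next to the ABI sub-directories.
# The shared library extracted from the AAR goes into each ABI directory,
# e.g. app/src/main/jni/arm64-v8a/libtfl.so
ls app/src/main/jni
```

With this layout, the relative includes shown above (`"../jni/c_api.h"` etc.) resolve from `app/src/main/cpp`.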
TFLite functions can then be called directly, for example:
TfLiteModel * model = TfLiteModelCreateFromFile(full_path);
TfLiteInterpreter * interpreter = TfLiteInterpreterCreate(model);
TfLiteInterpreterAllocateTensors(interpreter);
TfLiteTensor * input_tensor =
TfLiteInterpreterGetInputTensor(interpreter, 0);
const TfLiteTensor * output_tensor =
TfLiteInterpreterGetOutputTensor(interpreter, 0);
TfLiteStatus from_status = TfLiteTensorCopyFromBuffer(
input_tensor,
input_data,
TfLiteTensorByteSize(input_tensor));
TfLiteStatus interpreter_invoke_status = TfLiteInterpreterInvoke(interpreter);
TfLiteStatus to_status = TfLiteTensorCopyToBuffer(
output_tensor,
output_data,
TfLiteTensorByteSize(output_tensor));
NOTES: in the `cmake` environment, also include `cppFlags "-frtti -fexceptions"`. `CMakeLists.txt` example:
set(JNI_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../jni)
add_library(tflite-lib SHARED IMPORTED)
set_target_properties(tflite-lib
PROPERTIES IMPORTED_LOCATION
${JNI_DIR}/${ANDROID_ABI}/libtfl.so)
include_directories( ${JNI_DIR} )
target_link_libraries(
native-lib
tflite-lib
...)
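For completeness, the `-frtti -fexceptions` flags mentioned in the NOTES live in the module's `build.gradle`; a sketch (the flag values are from the note, the surrounding structure is the standard Android block and assumed here):

```
android {
    defaultConfig {
        externalNativeBuild {
            cmake {
                // TFLite headers use RTTI and exceptions, which the NDK
                // toolchain disables by default
                cppFlags "-frtti -fexceptions"
            }
        }
    }
}
```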
Views: 3,680