
[Build] build onnxruntime for vsinpu error #23316


Closed
bethalianovike opened this issue Jan 10, 2025 · 8 comments
Labels: build, ep:vsinpu

Comments


bethalianovike commented Jan 10, 2025

Describe the issue

I tried to build onnxruntime from source with use_vsinpu enabled, but when I ran ./build.sh it failed with the error message below.

Urgency

No response

Target platform

VSI NPU on arm64

Build script

./build.sh --use_vsinpu

Error / output

CMake Error at CMakeLists.txt:710 (message):
The compiler doesn't support BFLOAT16!!!

-- ******** Summary ********
--   CMake version                     : 3.31.2
--   CMake command                     : /home/miniconda3/lib/python3.11/site-packages/cmake/data/bin/cmake
--   System                            : Linux
--   C++ compiler                      : /usr/bin/c++
--   C++ compiler version              : 9.4.0
--   CXX flags                         :  -ffunction-sections -fdata-sections -Wno-restrict  -DCPUINFO_SUPPORTED -Wnon-virtual-dtor
--   Build type                        : Debug
--   Compile definitions               : ORT_ENABLE_STREAM;EIGEN_MPL2_ONLY;_GNU_SOURCE;__STDC_FORMAT_MACROS
--   CMAKE_PREFIX_PATH                 : /home/onnxruntime/onnxruntime/build/Linux/Debug/installed
--   CMAKE_INSTALL_PREFIX              : /usr/local
--   CMAKE_MODULE_PATH                 : /home/onnxruntime/onnxruntime/cmake/external
-- 
--   ONNX version                      : 1.17.0
--   ONNX NAMESPACE                    : onnx
--   ONNX_USE_LITE_PROTO               : ON
--   USE_PROTOBUF_SHARED_LIBS          : OFF
--   Protobuf_USE_STATIC_LIBS          : ON
--   ONNX_DISABLE_EXCEPTIONS           : OFF
--   ONNX_DISABLE_STATIC_REGISTRATION  : OFF
--   ONNX_WERROR                       : OFF
--   ONNX_BUILD_TESTS                  : OFF
--   ONNX_BUILD_SHARED_LIBS            : 
--   BUILD_SHARED_LIBS                 : OFF
-- 
--   Protobuf compiler                 : 
--   Protobuf includes                 : 
--   Protobuf libraries                : 
--   BUILD_ONNX_PYTHON                 : OFF
-- Could NOT find Eigen3 (missing: Eigen3_DIR)
CMake Warning (dev) at /home/miniconda3/lib/python3.11/site-packages/cmake/data/share/cmake-3.31/Modules/FetchContent.cmake:1953 (message):
  Calling FetchContent_Populate(eigen) is deprecated, call
  FetchContent_MakeAvailable(eigen) instead.  Policy CMP0169 can be set to
  OLD to allow FetchContent_Populate(eigen) to be called directly for now,
  but the ability to call it with declared details will be removed completely
  in a future version.
Call Stack (most recent call first):
  external/eigen.cmake:22 (FetchContent_Populate)
  external/onnxruntime_external_deps.cmake:529 (include)
  CMakeLists.txt:629 (include)
This warning is for project developers.  Use -Wno-dev to suppress it.

-- Populating eigen
-- Configuring done (0.1s)
-- Generating done (0.0s)
-- Build files have been written to: /home/onnxruntime/onnxruntime/build/Linux/Debug/_deps/eigen-subbuild
[100%] Built target eigen-populate
-- Finished fetching external dependencies
NVCC_ERROR = 
NVCC_OUT = no such file or directory
CMake Error at CMakeLists.txt:710 (message):
  The compiler doesn't support BFLOAT16!!!


-- Configuring incomplete, errors occurred!
Traceback (most recent call last):
  File "/home/onnxruntime/onnxruntime/tools/ci_build/build.py", line 2964, in <module>
    sys.exit(main())
             ^^^^^^
  File "/home/onnxruntime/onnxruntime/tools/ci_build/build.py", line 2827, in main
    generate_build_tree(
  File "/home/onnxruntime/onnxruntime/tools/ci_build/build.py", line 1656, in generate_build_tree
    run_subprocess(
  File "/home/onnxruntime/onnxruntime/tools/ci_build/build.py", line 868, in run_subprocess
    return run(*args, cwd=cwd, capture_stdout=capture_stdout, shell=shell, env=my_env)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/onnxruntime/onnxruntime/tools/python/util/run.py", line 49, in run
    completed_process = subprocess.run(
                        ^^^^^^^^^^^^^^^
  File "/home/miniconda3/lib/python3.11/subprocess.py", line 571, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['/home/miniconda3/bin/cmake', '/home/onnxruntime/onnxruntime/cmake', '-Donnxruntime_RUN_ONNX_TESTS=OFF', '-Donnxruntime_GENERATE_TEST_REPORTS=ON', '-DPython_EXECUTABLE=/home/miniconda3/bin/python3', '-DPYTHON_EXECUTABLE=/home/miniconda3/bin/python3', '-Donnxruntime_USE_VCPKG=OFF', '-Donnxruntime_USE_MIMALLOC=OFF', '-Donnxruntime_ENABLE_PYTHON=OFF', '-Donnxruntime_BUILD_CSHARP=OFF', '-Donnxruntime_BUILD_JAVA=OFF', '-Donnxruntime_BUILD_NODEJS=OFF', '-Donnxruntime_BUILD_OBJC=OFF', '-Donnxruntime_BUILD_SHARED_LIB=OFF', '-Donnxruntime_BUILD_APPLE_FRAMEWORK=OFF', '-Donnxruntime_USE_DNNL=OFF', '-Donnxruntime_USE_NNAPI_BUILTIN=OFF', '-Donnxruntime_USE_VSINPU=ON', '-Donnxruntime_USE_RKNPU=OFF', '-Donnxruntime_ENABLE_MICROSOFT_INTERNAL=OFF', '-Donnxruntime_USE_VITISAI=OFF', '-Donnxruntime_USE_TENSORRT=OFF', '-Donnxruntime_USE_TENSORRT_BUILTIN_PARSER=ON', '-Donnxruntime_USE_MIGRAPHX=OFF', '-Donnxruntime_DISABLE_CONTRIB_OPS=OFF', '-Donnxruntime_DISABLE_ML_OPS=OFF', '-Donnxruntime_DISABLE_RTTI=OFF', '-Donnxruntime_DISABLE_EXCEPTIONS=OFF', '-Donnxruntime_MINIMAL_BUILD=OFF', '-Donnxruntime_EXTENDED_MINIMAL_BUILD=OFF', '-Donnxruntime_MINIMAL_BUILD_CUSTOM_OPS=OFF', '-Donnxruntime_REDUCED_OPS_BUILD=OFF', '-Donnxruntime_USE_DML=OFF', '-Donnxruntime_USE_WINML=OFF', '-Donnxruntime_BUILD_MS_EXPERIMENTAL_OPS=OFF', '-Donnxruntime_USE_TELEMETRY=OFF', '-Donnxruntime_ENABLE_LTO=OFF', '-Donnxruntime_USE_ACL=OFF', '-Donnxruntime_USE_ARMNN=OFF', '-Donnxruntime_ARMNN_RELU_USE_CPU=ON', '-Donnxruntime_ARMNN_BN_USE_CPU=ON', '-Donnxruntime_USE_JSEP=OFF', '-Donnxruntime_USE_WEBGPU=OFF', '-Donnxruntime_USE_EXTERNAL_DAWN=OFF', '-Donnxruntime_ENABLE_NVTX_PROFILE=OFF', '-Donnxruntime_ENABLE_TRAINING=OFF', '-Donnxruntime_ENABLE_TRAINING_OPS=OFF', '-Donnxruntime_ENABLE_TRAINING_APIS=OFF', '-Donnxruntime_ENABLE_CPU_FP16_OPS=OFF', '-Donnxruntime_USE_NCCL=OFF', '-Donnxruntime_BUILD_BENCHMARKS=OFF', '-Donnxruntime_USE_ROCM=OFF', '-Donnxruntime_GCOV_COVERAGE=OFF', '-Donnxruntime_USE_MPI=OFF', '-Donnxruntime_ENABLE_MEMORY_PROFILE=OFF', '-Donnxruntime_ENABLE_CUDA_LINE_NUMBER_INFO=OFF', '-Donnxruntime_USE_CUDA_NHWC_OPS=OFF', '-Donnxruntime_BUILD_WEBASSEMBLY_STATIC_LIB=OFF', '-Donnxruntime_ENABLE_WEBASSEMBLY_EXCEPTION_CATCHING=ON', '-Donnxruntime_ENABLE_WEBASSEMBLY_API_EXCEPTION_CATCHING=OFF', '-Donnxruntime_ENABLE_WEBASSEMBLY_EXCEPTION_THROWING=ON', '-Donnxruntime_WEBASSEMBLY_RUN_TESTS_IN_BROWSER=OFF', '-Donnxruntime_ENABLE_WEBASSEMBLY_THREADS=OFF', '-Donnxruntime_ENABLE_WEBASSEMBLY_MEMORY64=OFF', '-Donnxruntime_ENABLE_WEBASSEMBLY_DEBUG_INFO=OFF', '-Donnxruntime_ENABLE_WEBASSEMBLY_PROFILING=OFF', '-Donnxruntime_ENABLE_LAZY_TENSOR=OFF', '-Donnxruntime_ENABLE_EXTERNAL_CUSTOM_OP_SCHEMAS=OFF', '-Donnxruntime_ENABLE_CUDA_PROFILING=OFF', '-Donnxruntime_ENABLE_ROCM_PROFILING=OFF', '-Donnxruntime_USE_XNNPACK=OFF', '-Donnxruntime_USE_WEBNN=OFF', '-Donnxruntime_USE_CANN=OFF', '-Donnxruntime_USE_TRITON_KERNEL=OFF', '-Donnxruntime_DISABLE_FLOAT8_TYPES=OFF', '-Donnxruntime_DISABLE_SPARSE_TENSORS=OFF', '-Donnxruntime_DISABLE_OPTIONAL_TYPE=OFF', '-Donnxruntime_CUDA_MINIMAL=OFF', '-DCMAKE_TLS_VERIFY=ON', '-DFETCHCONTENT_QUIET=OFF', '-DCMAKE_BUILD_TYPE=Debug', '-DCMAKE_PREFIX_PATH=/home/onnxruntime/onnxruntime/build/Linux/Debug/installed']' returned non-zero exit status 1.

Visual Studio Version

No response

GCC / Compiler Version

9.4.0 (Ubuntu 20.04)

bethalianovike added the build label on Jan 10, 2025
@bethalianovike
Author

Hi @chenfeiyue-cfy!

Can you give me directions on how to build onnxruntime for vsinpu? Which version of TIM-VX is required for compatibility?
I appreciate any help you can provide.

These are our attempts to build onnxruntime for vsinpu:

  1. Using the current version of onnxruntime (from microsoft/onnxruntime)
     Command: ./build.sh --use_vsinpu
     Error message:
     CMake Error at CMakeLists.txt:710 (message): The compiler doesn't support BFLOAT16!!!
  2. Using the onnxruntime from VeriSilicon/onnxruntime
     TIM-VX version: 1.2.2
     Command: ./build.sh -use-vsinpu
     Error message:
     error: 'ASYMMETRIC_PER_CHANNEL' is not a member of 'tim::vx::QuantType'
  3. Using the onnxruntime from VeriSilicon/onnxruntime
     TIM-VX version: 1.2.6
     Command: ./build.sh -use-vsinpu
     Result: built up to 100%, then ran some tests
     • When running the test /home/onnxruntime/build/Linux/Debug/onnx_test_runner "/home/onnxruntime/build/Linux/Debug/_deps/onnx-src/onnx/backend/test/data/pytorch-operator", it gives a SEGFAULT:
       The following tests FAILED:
           1 - onnxruntime_test_all (SEGFAULT)
       Errors while running CTest
       Output from these tests are in: /home/onnxruntime/build/Linux/Debug/Testing/Temporary/LastTest.log
       Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.
     • Then, when I tried to import onnxruntime in Python, it gave an error message:
       ModuleNotFoundError: No module named 'onnxruntime.capi'
  4. Using the onnxruntime from VeriSilicon/onnxruntime
     TIM-VX version: 1.2.6
     Followed the steps from VSINPU-ExecutionProvider.md to build and run the tests
     Command: ./build.sh --config Debug --build_shared_lib --use_vsinpu --skip_tests
     Build succeeded ([INFO] - Build complete).
     Error messages:
     • When I tried to import onnxruntime in Python, it gave an error message:
       ModuleNotFoundError: No module named 'onnxruntime.capi'
     • When running the test ./onnx_test_runner -e vsinpu /home/onnxruntime/ (the directory contains mobilenetv2.onnx), it gave an error message:
       Error: Failed to load model because protobuf parsing failed.
     • When running the test ./onnx_test_runner -e vsinpu /home/onnxruntime/test/testdata/transform/, it gave an error message:
       Error: Missing Input: input_0

@chenfeiyue-cfy
Contributor

Hi @bethalianovike, I received your message; it seems you used the wrong method to build with our backend. The right way is to use the current version of onnxruntime (from microsoft/onnxruntime) with the command ./build.sh --config Debug --build_shared_lib --use_vsinpu --skip_tests (from VSINPU-ExecutionProvider.md). The repo at VeriSilicon/onnxruntime is no longer in use, so please ignore it.
Please let me know if you run into any other build issues, thanks!
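For reference, here is a minimal sketch of that flow as a shell session (the clone step and directory names are assumptions; the build flags are the ones quoted above from VSINPU-ExecutionProvider.md):

# Build upstream onnxruntime (microsoft/onnxruntime) with the VSINPU execution provider enabled.
git clone --recursive https://github.com/microsoft/onnxruntime.git
cd onnxruntime
./build.sh --config Debug --build_shared_lib --use_vsinpu --skip_tests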

@bethalianovike
Author

Hi @chenfeiyue-cfy
Thanks! I have now built onnxruntime successfully.
But when I tried to import onnxruntime in python3, it said No module named 'onnxruntime'.
Can you give me directions on how to use the onnxruntime package?

@chenfeiyue-cfy
Contributor


@xuke537 please help to solve the python binding issue

@bethalianovike
Author

Hi @xuke537, can you give me directions on how to solve the python binding issue after building onnxruntime? Thank you!

Contributor

xuke537 commented Feb 12, 2025

Hi @bethalianovike, please use the following command to build onnxruntime with VSINPU and enable the Python binding:
./build.sh --config Debug --build_shared_lib --use_vsinpu --skip_tests --enable_pybind --build_wheel --parallel
After building onnxruntime, there are two ways to import the onnxruntime module (a short sketch follows below).

  1. Use the local onnxruntime Python module, located in "./build/Linux/Debug/onnxruntime".
  2. Install the onnxruntime wheel package, located in "./build/Linux/Debug/dist".
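A minimal sketch of both options, run from the repo root and assuming the Debug build tree above (the wheel filename pattern is an assumption and varies by version and platform):

# Option 1: point Python at the locally built package directory.
export PYTHONPATH=$(pwd)/build/Linux/Debug:$PYTHONPATH
python3 -c "import onnxruntime; print(onnxruntime.__version__)"

# Option 2: install the wheel produced by --build_wheel.
pip install ./build/Linux/Debug/dist/onnxruntime-*.whl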

@bethalianovike
Author

Hi @xuke537, thank you for your explanation; I can now import the onnxruntime module successfully.
But when I tried to run the mobilenet_v2_quantized model, session.run() gave an error message.
Can you please take a look and give me some direction or advice? Thank you!
Error message:

[E:onnxruntime:, vsinpu_execution_provider.cc:197 ComputeStateFunc] Failed to run graph. 

Script code:

import numpy as np
import onnxruntime

session = onnxruntime.InferenceSession("mobilenet_v2_quantized.onnx", providers=['VSINPUExecutionProvider'])
with open("cat.npy", 'rb') as f:
    img_data = np.load(f)
# Run inference; an empty output list requests all model outputs.
output = session.run([], {"input": img_data})[0]
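For reference, a quick sanity check (a minimal sketch, assuming the build from the command above is installed or on PYTHONPATH) that the VSINPU provider is actually available to the session:

# List the execution providers compiled into the onnxruntime package.
python3 -c "import onnxruntime as ort; print(ort.get_available_providers())"
# 'VSINPUExecutionProvider' should appear in the output if the EP was built in.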

Contributor

xuke537 commented Feb 18, 2025

Hi @bethalianovike, this issue most likely originates in the Vivante SDK and is not an onnxruntime issue. Please report it to your vendor.
