Migrate "Linux CPU Minimal Build E2E CI Pipeline" to Github Actions #1023

Triggered via pull request on April 24, 2025 at 17:44
Status: Success
Total duration: 1h 20m 6s
Artifacts: 1

linux_tensorrt_ci.yml

on: pull_request
Build Linux TensorRT x64 Release  /  build_test_pipeline
45m 56s
Test Linux TensorRT x64 Release
29m 13s
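For orientation, here is a minimal sketch of a pull_request-triggered workflow with this two-job shape, where the test job depends on the build job. The job names mirror the run above, but the runner labels and step contents are assumptions; this is not the actual contents of linux_tensorrt_ci.yml.

```yaml
# Hypothetical sketch only; not the real linux_tensorrt_ci.yml.
name: Linux TensorRT CI

on: pull_request

jobs:
  build_test_pipeline:
    name: Build Linux TensorRT x64 Release
    runs-on: ubuntu-latest              # assumed runner label
    steps:
      - uses: actions/checkout@v4
      - name: Build
        run: echo "build steps would go here"

  test:
    name: Test Linux TensorRT x64 Release
    needs: build_test_pipeline          # test runs only after the build job succeeds
    runs-on: ubuntu-latest              # assumed runner label
    steps:
      - uses: actions/checkout@v4
      - name: Test
        run: echo "test steps would go here"
```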

Annotations

12 warnings
Build Linux TensorRT x64 Release / build_test_pipeline
stderr: WARNING! Your password will be stored unencrypted in /home/cloudtest/.docker/config.json. Configure a credential helper to remove this warning. See https://docs.docker.com/engine/reference/commandline/login/#credential-stores
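This warning appears because the registry password is handed to docker login directly and Docker then stores it unencrypted in config.json. A hedged sketch of one way to tidy the login step, assuming registry secrets named ACR_USERNAME and ACR_PASSWORD (hypothetical names); the warning itself only goes away once a credential helper (credsStore in ~/.docker/config.json) is configured:

```yaml
# Hypothetical step; the secret names are assumptions.
- name: Log in to onnxruntimebuildcache.azurecr.io
  env:
    ACR_USERNAME: ${{ secrets.ACR_USERNAME }}
    ACR_PASSWORD: ${{ secrets.ACR_PASSWORD }}
  run: |
    # --password-stdin keeps the password out of the process arguments.
    echo "$ACR_PASSWORD" | docker login onnxruntimebuildcache.azurecr.io \
      --username "$ACR_USERNAME" --password-stdin
```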
Build Linux TensorRT x64 Release / build_test_pipeline
stderr: #0 building with "default" instance using docker driver #1 [internal] load build definition from Dockerfile.manylinux2_28_cuda #1 transferring dockerfile: #1 transferring dockerfile: 1.90kB done #1 DONE 3.4s #2 [internal] load metadata for onnxruntimebuildcache.azurecr.io/internal/azureml/onnxruntime/build/cuda12_x64_ubi8_gcc12:20250124.1 #2 ... #3 [auth] internal/azureml/onnxruntime/build/cuda12_x64_ubi8_gcc12:pull token for onnxruntimebuildcache.azurecr.io #3 DONE 0.0s #2 [internal] load metadata for onnxruntimebuildcache.azurecr.io/internal/azureml/onnxruntime/build/cuda12_x64_ubi8_gcc12:20250124.1 #2 DONE 5.2s #4 [internal] load .dockerignore #4 transferring context: #4 transferring context: 2B done #4 DONE 2.2s #5 [internal] load build context #5 DONE 0.0s #6 [1/6] FROM onnxruntimebuildcache.azurecr.io/internal/azureml/onnxruntime/build/cuda12_x64_ubi8_gcc12:20250124.1@sha256:61d011618097ddf13426233a855b41e0ae5eb0fa6bdcc34429bc249a1f4732c5 #6 resolve onnxruntimebuildcache.azurecr.io/internal/azureml/onnxruntime/build/cuda12_x64_ubi8_gcc12:20250124.1@sha256:61d011618097ddf13426233a855b41e0ae5eb0fa6bdcc34429bc249a1f4732c5 #6 resolve onnxruntimebuildcache.azurecr.io/internal/azureml/onnxruntime/build/cuda12_x64_ubi8_gcc12:20250124.1@sha256:61d011618097ddf13426233a855b41e0ae5eb0fa6bdcc34429bc249a1f4732c5 1.4s done #6 sha256:61d011618097ddf13426233a855b41e0ae5eb0fa6bdcc34429bc249a1f4732c5 7.38kB / 7.38kB done #6 sha256:441fb182622d0e88411bb8a430da4f7383c3b539f657e29d86fd492aba582ce8 30.71kB / 30.71kB done #6 ... #5 [internal] load build context #5 transferring context: 33.11kB done #5 DONE 2.6s #6 [1/6] FROM onnxruntimebuildcache.azurecr.io/internal/azureml/onnxruntime/build/cuda12_x64_ubi8_gcc12:20250124.1@sha256:61d011618097ddf13426233a855b41e0ae5eb0fa6bdcc34429bc249a1f4732c5 #6 sha256:a8042be9673691eb6f70bc963a308b9dd3bed1fa5709a1dea123817ee42e3bfb 306B / 306B 0.4s done #6 sha256:2609da11fd88b1bbd9d644ca322ced1094671722bc7e7d1fee944c9bb693d0f3 0B / 71.79MB 0.5s #6 sha256:7f1b5342c34f462653f321a4aedd22f7e5de8ab9288eabf6e43d5f96a9b97ec1 0B / 103.65MB 0.5s #6 sha256:2609da11fd88b1bbd9d644ca322ced1094671722bc7e7d1fee944c9bb693d0f3 16.78MB / 71.79MB 0.6s #6 sha256:bf1a73b70f801a0f480b750bd3d2e38f9a4a1d14f87b1bcbc90d62cae52225d4 182B / 182B 0.6s done #6 sha256:339b7311cdefb111716ea228edc93a29e090d9fb3ab8fd5a07dd45c9bf4d5da9 1.44kB / 1.44kB 0.5s done #6 sha256:63b9f1fa197e5d822bcd2ff3a93b780b7d0b5c32dae8d4d223538f04fe6eee03 0B / 6.88kB 0.6s #6 sha256:2609da11fd88b1bbd9d644ca322ced1094671722bc7e7d1fee944c9bb693d0f3 54.53MB / 71.79MB 0.8s #6 sha256:7f1b5342c34f462653f321a4aedd22f7e5de8ab9288eabf6e43d5f96a9b97ec1 29.36MB / 103.65MB 0.8s #6 sha256:63b9f1fa197e5d822bcd2ff3a93b780b7d0b5c32dae8d4d223538f04fe6eee03 6.88kB / 6.88kB 0.6s done #6 sha256:2609da11fd88b1bbd9d644ca322ced1094671722bc7e7d1fee944c9bb693d0f3 71.79MB / 71.79MB 1.0s #6 sha256:7f1b5342c34f462653f321a4aedd22f7e5de8ab9288eabf6e43d5f96a9b97ec1 58.72MB / 103.65MB 1.0s #6 sha256:2a6c35d159e5470977cfd95b5896445e4673f7043de124919c67540e8cfa4ce2 0B / 1.28GB 1.0s #6 sha256:7f1b5342c34f462653f321a4aedd22f7e5de8ab9288eabf6e43d5f96a9b97ec1 67.11MB / 103.65MB 1.1s #6 sha256:7f1b5342c34f462653f321a4aedd22f7e5de8ab9288eabf6e43d5f96a9b97ec1 96.47MB / 103.65MB 1.3s #6 sha256:7f1b5342c34f462653f321a4aedd22f7e5de8ab9288eabf6e43d5f96a9b97ec1 103.65MB / 103.65MB 1.5s #6 sha256:2a6c35d159e5470977cfd95b5896445e4673f7043de124919c67540e8cfa4ce2 71.30MB / 1.28GB 1.5s #6 sha256:2609da11fd88b1bbd9d644ca322ced1094671722bc7e7d1fee944c9bb693d0f3 71.79MB 
/ 71.79MB 1.9s done #6 sha256:2a6c35d159e5470977cfd95b5896445e4673f7043de124919c67540e8cfa4ce2 138.41MB / 1.28GB 2.0s #6 sha256:2a6c35d159e5470977cfd95b5896445e4673f7043de124919c67540e8cfa4ce2 205.52MB / 1.28GB 2.4s #6 extracting sha256:2609da11fd88b1bbd9d644ca322ced1094671722bc7e7d1fee944c9bb693d0f3 0.1s #6 sha256:2a6c35d159e5470977cfd95b5896445e4673f7043de124919c67540e8cfa4ce2 284.16MB / 1.28GB 3.0s #6 sha256:7f1b5342c34f462653f321a4aedd22f7e5de8ab9288eabf6e43d5f96a9b97ec1 103.65MB / 103.65MB 3.0s done #6 sha256:426ae846a5417
Build Linux TensorRT x64 Release / build_test_pipeline
Error trying to execute nvidia-smi: Unable to locate executable file: nvidia-smi. Please verify either the file path exists or the file can be found within a directory specified by the PATH environment variable. Also check the file mode to verify the file is executable.. Assuming no GPU.
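This message comes from the runner tooling probing for nvidia-smi on a CPU-only build agent and falling back to a no-GPU assumption. A minimal sketch of how a step could make that probe explicit (the step name and environment variable are illustrative):

```yaml
- name: Detect GPU
  run: |
    # Treat the runner as CPU-only when nvidia-smi is not on PATH.
    if command -v nvidia-smi >/dev/null 2>&1; then
      nvidia-smi
      echo "HAS_GPU=1" >> "$GITHUB_ENV"
    else
      echo "nvidia-smi not found; assuming no GPU"
      echo "HAS_GPU=0" >> "$GITHUB_ENV"
    fi
```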
Build Linux TensorRT x64 Release / build_test_pipeline
stderr: + PATH=/opt/python/cp310-cp310/bin:/usr/local/dotnet:/usr/lib/jvm/msopenjdk-17/bin:/opt/rh/gcc-toolset-12/root/usr/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin + python3 -m pip install --user -r tools/ci_build/github/linux/python/requirements.txt WARNING: The script isympy is installed in '/home/onnxruntimedev/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location. WARNING: The scripts f2py and numpy-config are installed in '/home/onnxruntimedev/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location. [notice] A new release of pip is available: 24.3.1 -> 25.0.1 [notice] To update, run: pip install --upgrade pip + python3 tools/ci_build/build.py --build_dir build/Release --config Release --cmake_generator Ninja --skip_submodule_sync --build_shared_lib --parallel --use_vcpkg --use_vcpkg_ms_internal_asset_cache --enable_onnx_tests --use_cuda --use_tensorrt --use_binskim_compliant_compile_flags --build_wheel --cuda_version=12.2 --cuda_home=/usr/local/cuda-12.2 --cudnn_home=/usr/local/cuda-12.2 --use_tensorrt --tensorrt_home /usr --build_java --cmake_extra_defines CMAKE_CUDA_ARCHITECTURES=80 onnxruntime_BUILD_UNIT_TESTS=ON onnxruntime_ENABLE_CUDA_EP_INTERNAL_TESTS=ON --build 2025-04-24 18:07:01,085 build [DEBUG] - Command line arguments: --build_dir build/Release --config Release --cmake_generator Ninja --skip_submodule_sync --build_shared_lib --parallel --use_vcpkg --use_vcpkg_ms_internal_asset_cache --enable_onnx_tests --use_cuda --use_tensorrt --use_binskim_compliant_compile_flags --build_wheel --cuda_version=12.2 --cuda_home=/usr/local/cuda-12.2 --cudnn_home=/usr/local/cuda-12.2 --use_tensorrt --tensorrt_home /usr --build_java --cmake_extra_defines CMAKE_CUDA_ARCHITECTURES=80 onnxruntime_BUILD_UNIT_TESTS=ON onnxruntime_ENABLE_CUDA_EP_INTERNAL_TESTS=ON --build 2025-04-24 18:07:01,089 build [INFO] - Build started 2025-04-24 18:07:01,089 build [INFO] - Building targets for Release configuration 2025-04-24 18:07:01,090 build [INFO] - /usr/bin/cmake --build build/Release/Release --config Release -- -j16 2025-04-24 18:29:23,814 build [INFO] - /opt/python/cp310-cp310/bin/python3 /onnxruntime_src/setup.py bdist_wheel --nightly_build --wheel_name_suffix=gpu --cuda_version=12.2 /opt/python/cp310-cp310/lib/python3.10/site-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated. !! ******************************************************************************** Please avoid running ``setup.py`` directly. Instead, use pypa/build, pypa/installer or other standards-based tools. See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details. ******************************************************************************** !! 
self.initialize_options() DEBUG:auditwheel.musllinux:musl libc not detected DEBUG:auditwheel.libc:Falling back to GNU libc INFO:auditwheel.main_repair:Repairing onnxruntime_gpu-1.23.0.dev20250424-cp310-cp310-linux_x86_64.whl DEBUG:auditwheel.wheel_abi:processing: onnxruntime/capi/libonnxruntime.so.1.23.0 DEBUG:auditwheel.musllinux:musl libc not detected DEBUG:auditwheel.libc:Falling back to GNU libc DEBUG:auditwheel.lddtree:parse_ld_so_conf(//etc/ld.so.conf) DEBUG:auditwheel.lddtree: glob: //etc/ld.so.conf.d/*.conf DEBUG:auditwheel.lddtree: parse_ld_so_conf(//etc/ld.so.conf.d/00-manylinux.conf) DEBUG:auditwheel.lddtree: parse_ld_so_conf(//etc/ld.so.conf.d/gds-12-2.conf) DEBUG:auditwheel.lddtree: parse_ld_so_conf(//etc/ld.so.conf.d/nvidia.conf) DEBUG:auditwheel.lddtree: parse_ld_so_conf(//etc/ld.so.conf.d/000_cuda.conf) DEBUG:auditwheel.lddtree: parse_ld_so_conf(//etc/ld.so.conf.d/988_cuda-12.conf) DEBUG:auditwheel.lddtree:linker ldpaths: {'conf': ['/usr/local/lib', '/usr/local/cuda-12.2/targets/x86_64-linux/lib', '/usr/local/cuda/target
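The pip warnings in this log note that console scripts installed with --user land in /home/onnxruntimedev/.local/bin, which is not on PATH inside the build container. A sketch of one way to handle that; note that in this run the equivalent commands execute inside the container's shell script rather than directly in a workflow step, so the export would belong there:

```yaml
# Illustrative only; mirrors the commands visible in the log above.
- name: Install Python requirements
  run: |
    python3 -m pip install --user -r tools/ci_build/github/linux/python/requirements.txt
    # Make --user console scripts (isympy, f2py, numpy-config) resolvable.
    export PATH="$HOME/.local/bin:$PATH"
```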
Build Linux TensorRT x64 Release / build_test_pipeline
Wheel output directory /mnt/vss/_work/_temp/Release/dist does not exist.
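The "Wheel output directory ... does not exist" warnings indicate that a later step looks for built wheels under a fixed temp path that this particular job does not populate. A hedged sketch of guarding such a step so it skips cleanly when the directory is missing (the step name and exact path handling are illustrative):

```yaml
- name: Collect built wheels (if any)
  run: |
    DIST_DIR="${{ runner.temp }}/Release/dist"
    if [ -d "$DIST_DIR" ]; then
      ls -l "$DIST_DIR"
    else
      echo "No wheel output directory at $DIST_DIR; skipping wheel collection."
    fi
```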
Build Linux TensorRT x64 Release / build_test_pipeline
Error trying to execute nvidia-smi: Unable to locate executable file: nvidia-smi. Please verify either the file path exists or the file can be found within a directory specified by the PATH environment variable. Also check the file mode to verify the file is executable.. Assuming no GPU.
Build Linux TensorRT x64 Release / build_test_pipeline
stderr: + PATH=/opt/python/cp310-cp310/bin:/usr/local/dotnet:/usr/lib/jvm/msopenjdk-17/bin:/opt/rh/gcc-toolset-12/root/usr/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin + python3 -m pip install --user -r tools/ci_build/github/linux/python/requirements.txt WARNING: The script isympy is installed in '/home/onnxruntimedev/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location. WARNING: The scripts f2py and numpy-config are installed in '/home/onnxruntimedev/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location. [notice] A new release of pip is available: 24.3.1 -> 25.0.1 [notice] To update, run: pip install --upgrade pip + python3 tools/ci_build/build.py --build_dir build/Release --config Release --cmake_generator Ninja --skip_submodule_sync --build_shared_lib --parallel --use_vcpkg --use_vcpkg_ms_internal_asset_cache --enable_onnx_tests --use_cuda --use_tensorrt --use_binskim_compliant_compile_flags --build_wheel --cuda_version=12.2 --cuda_home=/usr/local/cuda-12.2 --cudnn_home=/usr/local/cuda-12.2 --use_tensorrt --tensorrt_home /usr --build_java --cmake_extra_defines CMAKE_CUDA_ARCHITECTURES=80 onnxruntime_BUILD_UNIT_TESTS=ON onnxruntime_ENABLE_CUDA_EP_INTERNAL_TESTS=ON --update 2025-04-24 18:01:28,792 build [DEBUG] - Command line arguments: --build_dir build/Release --config Release --cmake_generator Ninja --skip_submodule_sync --build_shared_lib --parallel --use_vcpkg --use_vcpkg_ms_internal_asset_cache --enable_onnx_tests --use_cuda --use_tensorrt --use_binskim_compliant_compile_flags --build_wheel --cuda_version=12.2 --cuda_home=/usr/local/cuda-12.2 --cudnn_home=/usr/local/cuda-12.2 --use_tensorrt --tensorrt_home /usr --build_java --cmake_extra_defines CMAKE_CUDA_ARCHITECTURES=80 onnxruntime_BUILD_UNIT_TESTS=ON onnxruntime_ENABLE_CUDA_EP_INTERNAL_TESTS=ON --update 2025-04-24 18:01:28,796 build [INFO] - Build started 2025-04-24 18:01:28,796 build [INFO] - Generating CMake build tree 2025-04-24 18:01:28,814 build [INFO] - /usr/bin/cmake /onnxruntime_src/cmake -Donnxruntime_ENABLE_EXTERNAL_CUSTOM_OP_SCHEMAS=OFF -Donnxruntime_RUN_ONNX_TESTS=ON -Donnxruntime_GENERATE_TEST_REPORTS=ON -DPython_EXECUTABLE=/opt/python/cp310-cp310/bin/python3 -Donnxruntime_USE_VCPKG=ON -Donnxruntime_USE_MIMALLOC=OFF -Donnxruntime_ENABLE_PYTHON=ON -Donnxruntime_BUILD_CSHARP=OFF -Donnxruntime_BUILD_JAVA=ON -Donnxruntime_BUILD_NODEJS=OFF -Donnxruntime_BUILD_OBJC=OFF -Donnxruntime_BUILD_SHARED_LIB=ON -Donnxruntime_BUILD_APPLE_FRAMEWORK=OFF -Donnxruntime_USE_DNNL=OFF -Donnxruntime_USE_NNAPI_BUILTIN=OFF -Donnxruntime_USE_VSINPU=OFF -Donnxruntime_USE_RKNPU=OFF -Donnxruntime_ENABLE_MICROSOFT_INTERNAL=OFF -Donnxruntime_USE_VITISAI=OFF -Donnxruntime_USE_TENSORRT=ON -Donnxruntime_USE_NV=OFF -Donnxruntime_USE_TENSORRT_BUILTIN_PARSER=ON -Donnxruntime_USE_TENSORRT_INTERFACE=OFF -Donnxruntime_USE_CUDA_INTERFACE=OFF -Donnxruntime_USE_OPENVINO_INTERFACE=OFF -Donnxruntime_USE_VITISAI_INTERFACE=OFF -Donnxruntime_USE_QNN_INTERFACE=OFF -Donnxruntime_USE_MIGRAPHX=OFF -Donnxruntime_DISABLE_CONTRIB_OPS=OFF -Donnxruntime_DISABLE_ML_OPS=OFF -Donnxruntime_DISABLE_RTTI=OFF -Donnxruntime_DISABLE_EXCEPTIONS=OFF -Donnxruntime_MINIMAL_BUILD=OFF -Donnxruntime_EXTENDED_MINIMAL_BUILD=OFF -Donnxruntime_MINIMAL_BUILD_CUSTOM_OPS=OFF -Donnxruntime_REDUCED_OPS_BUILD=OFF 
-Donnxruntime_BUILD_MS_EXPERIMENTAL_OPS=OFF -Donnxruntime_ENABLE_LTO=OFF -Donnxruntime_USE_ACL=OFF -Donnxruntime_USE_ARMNN=OFF -Donnxruntime_ARMNN_RELU_USE_CPU=ON -Donnxruntime_ARMNN_BN_USE_CPU=ON -Donnxruntime_USE_JSEP=OFF -Donnxruntime_USE_WEBGPU=OFF -Donnxruntime_USE_EXTERNAL_DAWN=OFF -Donnxruntime_ENABLE_NVTX_PROFILE=OFF -Donnxruntime_ENABLE_TRAINING=OFF -Donnxruntime_ENABLE_TRAINING_OPS=OFF -Donnxruntime_ENABLE_TRAINING_APIS=OFF -Donnxruntime_ENABLE_CPU_FP16_OPS=OFF -Donnxruntime_USE_NCCL=OFF -Donnxruntime_BUILD_BENCHMARKS=OFF -Donnxruntime_USE_ROCM=OFF -Donnxr
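Across these annotations, tools/ci_build/build.py is invoked with the same flag set but different phases: --update generates the CMake build tree, --build compiles the targets, and the test job later runs --test. A condensed sketch of that phased invocation; the flag list is abbreviated from the full command lines shown in the logs above:

```yaml
# Flags abbreviated from this run's logs; see the full command lines above.
- name: Configure, build, and test in phases
  run: |
    COMMON="--build_dir build/Release --config Release --cmake_generator Ninja \
            --build_shared_lib --parallel --use_cuda --use_tensorrt --build_wheel"
    python3 tools/ci_build/build.py $COMMON --update   # generate the CMake build tree
    python3 tools/ci_build/build.py $COMMON --build    # compile targets
    python3 tools/ci_build/build.py $COMMON --test     # run ctest and Python tests
```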
Build Linux TensorRT x64 Release / build_test_pipeline
Wheel output directory /mnt/vss/_work/_temp/Release/dist does not exist.
Test Linux TensorRT x64 Release
stderr: WARNING! Your password will be stored unencrypted in /home/cloudtest/.docker/config.json. Configure a credential helper to remove this warning. See https://docs.docker.com/engine/reference/commandline/login/#credential-stores
Test Linux TensorRT x64 Release
stderr: #0 building with "default" instance using docker driver #1 [internal] load build definition from Dockerfile.manylinux2_28_cuda #1 transferring dockerfile: 1.90kB done #1 DONE 0.0s #2 [auth] internal/azureml/onnxruntime/build/cuda12_x64_ubi8_gcc12:pull token for onnxruntimebuildcache.azurecr.io #2 DONE 0.0s #3 [internal] load metadata for onnxruntimebuildcache.azurecr.io/internal/azureml/onnxruntime/build/cuda12_x64_ubi8_gcc12:20250124.1 #3 DONE 0.3s #4 [internal] load .dockerignore #4 transferring context: 2B done #4 DONE 0.0s #5 [internal] load build context #5 transferring context: 33.11kB done #5 DONE 0.0s #6 [1/6] FROM onnxruntimebuildcache.azurecr.io/internal/azureml/onnxruntime/build/cuda12_x64_ubi8_gcc12:20250124.1@sha256:61d011618097ddf13426233a855b41e0ae5eb0fa6bdcc34429bc249a1f4732c5 #6 resolve onnxruntimebuildcache.azurecr.io/internal/azureml/onnxruntime/build/cuda12_x64_ubi8_gcc12:20250124.1@sha256:61d011618097ddf13426233a855b41e0ae5eb0fa6bdcc34429bc249a1f4732c5 done #6 sha256:7f1b5342c34f462653f321a4aedd22f7e5de8ab9288eabf6e43d5f96a9b97ec1 0B / 103.65MB 0.1s #6 sha256:61d011618097ddf13426233a855b41e0ae5eb0fa6bdcc34429bc249a1f4732c5 7.38kB / 7.38kB done #6 sha256:441fb182622d0e88411bb8a430da4f7383c3b539f657e29d86fd492aba582ce8 30.71kB / 30.71kB done #6 sha256:339b7311cdefb111716ea228edc93a29e090d9fb3ab8fd5a07dd45c9bf4d5da9 0B / 1.44kB 0.1s #6 sha256:339b7311cdefb111716ea228edc93a29e090d9fb3ab8fd5a07dd45c9bf4d5da9 1.44kB / 1.44kB 0.1s done #6 sha256:2609da11fd88b1bbd9d644ca322ced1094671722bc7e7d1fee944c9bb693d0f3 7.34MB / 71.79MB 0.2s #6 sha256:a8042be9673691eb6f70bc963a308b9dd3bed1fa5709a1dea123817ee42e3bfb 306B / 306B 0.1s done #6 sha256:7f1b5342c34f462653f321a4aedd22f7e5de8ab9288eabf6e43d5f96a9b97ec1 41.94MB / 103.65MB 0.4s #6 sha256:2609da11fd88b1bbd9d644ca322ced1094671722bc7e7d1fee944c9bb693d0f3 29.36MB / 71.79MB 0.4s #6 sha256:bf1a73b70f801a0f480b750bd3d2e38f9a4a1d14f87b1bcbc90d62cae52225d4 182B / 182B 0.2s done #6 sha256:2a6c35d159e5470977cfd95b5896445e4673f7043de124919c67540e8cfa4ce2 12.58MB / 1.28GB 0.4s #6 sha256:63b9f1fa197e5d822bcd2ff3a93b780b7d0b5c32dae8d4d223538f04fe6eee03 6.88kB / 6.88kB 0.2s done #6 sha256:7f1b5342c34f462653f321a4aedd22f7e5de8ab9288eabf6e43d5f96a9b97ec1 78.64MB / 103.65MB 0.6s #6 sha256:2609da11fd88b1bbd9d644ca322ced1094671722bc7e7d1fee944c9bb693d0f3 58.72MB / 71.79MB 0.6s #6 sha256:7f1b5342c34f462653f321a4aedd22f7e5de8ab9288eabf6e43d5f96a9b97ec1 102.97MB / 103.65MB 0.7s #6 sha256:2609da11fd88b1bbd9d644ca322ced1094671722bc7e7d1fee944c9bb693d0f3 71.79MB / 71.79MB 0.7s #6 sha256:2a6c35d159e5470977cfd95b5896445e4673f7043de124919c67540e8cfa4ce2 79.69MB / 1.28GB 0.9s #6 sha256:2609da11fd88b1bbd9d644ca322ced1094671722bc7e7d1fee944c9bb693d0f3 71.79MB / 71.79MB 0.9s done #6 extracting sha256:2609da11fd88b1bbd9d644ca322ced1094671722bc7e7d1fee944c9bb693d0f3 #6 sha256:7f1b5342c34f462653f321a4aedd22f7e5de8ab9288eabf6e43d5f96a9b97ec1 103.65MB / 103.65MB 1.3s done #6 sha256:2a6c35d159e5470977cfd95b5896445e4673f7043de124919c67540e8cfa4ce2 150.99MB / 1.28GB 1.4s #6 sha256:d7115a3d38256b1b754c6d798cc6ef0b0efd37f8d6fa9324ca0d79d6d58c2df8 5.90MB / 583.41MB 1.4s #6 sha256:426ae846a5417aca125d0e9f405274b2e0eab89749e7dee20a3f390b50fd0114 2.10MB / 2.40GB 1.4s #6 sha256:d7115a3d38256b1b754c6d798cc6ef0b0efd37f8d6fa9324ca0d79d6d58c2df8 50.33MB / 583.41MB 1.6s #6 sha256:2a6c35d159e5470977cfd95b5896445e4673f7043de124919c67540e8cfa4ce2 216.01MB / 1.28GB 1.9s #6 sha256:d7115a3d38256b1b754c6d798cc6ef0b0efd37f8d6fa9324ca0d79d6d58c2df8 88.08MB / 583.41MB 1.9s #6 
sha256:d7115a3d38256b1b754c6d798cc6ef0b0efd37f8d6fa9324ca0d79d6d58c2df8 125.83MB / 583.41MB 2.2s #6 sha256:d7115a3d38256b1b754c6d798cc6ef0b0efd37f8d6fa9324ca0d79d6d58c2df8 155.19MB / 583.41MB 2.4s #6 sha256:426ae846a5417aca125d0e9f405274b2e0eab89749e7dee20a3f390b50fd0114 128.29MB / 2.40GB 2.4s #6 sha256:2a6c35d159e5470977cfd95b5896445e4673f7043de124919c67540e8cfa4ce2 283.12MB / 1.28GB 2.5s #6 sha256:d7115a3d38256b1b754c6d798cc6ef0b0efd37f8d6fa9324ca0d79d6d58c2df8 201.33MB / 583.41MB 2.8s #6 sha256:2a6c35d159e5470977cfd95b5896445e4673f7043d
Test Linux TensorRT x64 Release
stderr: + PATH=/opt/python/cp310-cp310/bin:/usr/local/dotnet:/usr/lib/jvm/msopenjdk-17/bin:/opt/rh/gcc-toolset-12/root/usr/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin + python3 -m pip install --user -r tools/ci_build/github/linux/python/requirements.txt WARNING: The script isympy is installed in '/home/onnxruntimedev/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location. WARNING: The scripts f2py and numpy-config are installed in '/home/onnxruntimedev/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location. [notice] A new release of pip is available: 24.3.1 -> 25.0.1 [notice] To update, run: pip install --upgrade pip + python3 tools/ci_build/build.py --build_dir build/Release --config Release --cmake_generator Ninja --skip_submodule_sync --build_shared_lib --parallel --use_vcpkg --use_vcpkg_ms_internal_asset_cache --enable_onnx_tests --use_cuda --use_tensorrt --use_binskim_compliant_compile_flags --build_wheel --cuda_version=12.2 --cuda_home=/usr/local/cuda-12.2 --cudnn_home=/usr/local/cuda-12.2 --use_tensorrt --tensorrt_home /usr --build_java --cmake_extra_defines CMAKE_CUDA_ARCHITECTURES=80 onnxruntime_BUILD_UNIT_TESTS=ON onnxruntime_ENABLE_CUDA_EP_INTERNAL_TESTS=ON --test 2025-04-24 18:50:51,996 build [DEBUG] - Command line arguments: --build_dir build/Release --config Release --cmake_generator Ninja --skip_submodule_sync --build_shared_lib --parallel --use_vcpkg --use_vcpkg_ms_internal_asset_cache --enable_onnx_tests --use_cuda --use_tensorrt --use_binskim_compliant_compile_flags --build_wheel --cuda_version=12.2 --cuda_home=/usr/local/cuda-12.2 --cudnn_home=/usr/local/cuda-12.2 --use_tensorrt --tensorrt_home /usr --build_java --cmake_extra_defines CMAKE_CUDA_ARCHITECTURES=80 onnxruntime_BUILD_UNIT_TESTS=ON onnxruntime_ENABLE_CUDA_EP_INTERNAL_TESTS=ON --test 2025-04-24 18:50:52,000 build [INFO] - Build started 2025-04-24 18:50:52,001 build [DEBUG] - create symlink /data/models -> build/Release/models 2025-04-24 18:50:52,001 build [INFO] - Running tests for Release configuration 2025-04-24 18:50:52,001 build [INFO] - /usr/bin/ctest --build-config Release --verbose --timeout 10800 2025-04-24 18:59:37,514 build [INFO] - /opt/python/cp310-cp310/bin/python3 onnxruntime_test_python.py ......2025-04-24 18:59:45.281680941 [W:onnxruntime:, inference_session.cc:3150 SetTuningResults] Cannot find execution provider UnknownEP�[m 2025-04-24 18:59:45.281986283 [W:onnxruntime:, inference_session.cc:3166 SetTuningResults] Failed to load TuningResults (index=0). Reason: tuning_context_impl.h:167 CheckMandatoryKeys key="ORT_VERSION" is not provided for validation. �[m 2025-04-24 18:59:45.282169025 [W:onnxruntime:, inference_session.cc:3166 SetTuningResults] Failed to load TuningResults (index=0). Reason: tuning_context_impl.h:204 CheckKeysMatching Unmatched validator: "NOT_A_VALIDATOR_KEY" is provided, but onnxruntime is unable to consume it. �[m 2025-04-24 18:59:45.282298338 [W:onnxruntime:, inference_session.cc:3166 SetTuningResults] Failed to load TuningResults (index=0). 
Reason: tuning_context_impl.h:213 ValidateOrtVersion onnxruntime version mismatch�[m .....Unsupported ONNX data type: STRING (8) 2025-04-24 18:59:51.637527193 [E:onnxruntime:Default, tensorrt_execution_provider.h:89 log] [2025-04-24 18:59:51 ERROR] In node -1 with name: and operator: (importInput): UNSUPPORTED_NODE: Assertion failed: convertDtype(onnxDtype.elem_type(), &trtDtype) && "Failed to convert ONNX date type to TensorRT data type."�[m 2025-04-24 18:59:51.637622471 [W:onnxruntime:Default, tensorrt_execution_provider.cc:2748 GetCapability] [TensorRT EP] No graph will run on TensorRT execution provider�[m .Unsupported ONNX data type: STRING (8) 2025-04-24 18:59:53.763376269 [E:onnxruntime:Default, tensorrt_execution_provider.h:89 log] [2025-04-24 18:59:5
Test Linux TensorRT x64 Release
Wheel output directory /mnt/vss/_work/_temp/Release/dist does not exist.

Artifacts

Produced during runtime
Name:   build-output-x64-Release (Expired)
Size:   1.74 GB
Digest: sha256:70d93eb2597d901131e5acf899492c74be04630293ae1b0b4f479c17a34c0a0e
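For reference, a hedged sketch of the kind of upload step that would produce an artifact named build-output-x64-Release; the path and retention settings are assumptions, not values taken from the workflow:

```yaml
- name: Upload build output
  uses: actions/upload-artifact@v4
  with:
    name: build-output-x64-Release
    path: build/Release        # assumed location of the build output
    retention-days: 7          # the artifact shows as Expired once past retention
```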