
"--thread=n" switch does not seem to work #293

Open

PapperYZ opened this issue Mar 7, 2025 · 23 comments

PapperYZ commented Mar 7, 2025

I am trying to use "--thread=4" to control the number of cores used during the benchmark, but it does not seem to take effect and the run fails. Note that I have manually disabled 4 of the cores, leaving only 4 cores available on my Orange Pi 5 Plus.
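For reference, this is roughly how I confirm that only 4 cores are actually visible to a process after taking the big cores offline (a minimal standalone Python sketch using the standard Linux interfaces, not part of the MLC flow):

# Minimal sketch: report which CPUs are online and which ones this process
# is allowed to run on (Linux only).
import os

usable = sorted(os.sched_getaffinity(0))          # CPUs the scheduler lets this process use
print(f"CPUs usable by this process: {usable} ({len(usable)} total)")

with open("/sys/devices/system/cpu/online") as f:  # e.g. "0-3" with the big cores off
    print("Online CPUs reported by the kernel:", f.read().strip())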

The command I used is below:
(mlc) orangepi@orangepi5plus:~$ mlcr run-mlperf,inference,_full,_r5.0-dev --model=resnet50 --implementation=reference --framework=onnxruntime --category=edge --scenario=Offline --execution_mode=valid --device=cpu --quiet --test_query_count=1000 --thread=4

And the log is below:

[2025-03-06 16:30:48,427 module.py:558 INFO] - * mlcr run-mlperf,inference,_full,_r5.0-dev
[2025-03-06 16:30:48,468 module.py:558 INFO] -   * mlcr detect,os
[2025-03-06 16:30:48,496 module.py:5329 INFO] -          ! cd /home/orangepi
[2025-03-06 16:30:48,496 module.py:5330 INFO] -          ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-06 16:30:48,549 module.py:5476 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-06 16:30:48,626 module.py:558 INFO] -   * mlcr detect,cpu
[2025-03-06 16:30:48,658 module.py:558 INFO] -     * mlcr detect,os
[2025-03-06 16:30:48,687 module.py:5329 INFO] -            ! cd /home/orangepi
[2025-03-06 16:30:48,688 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-06 16:30:48,742 module.py:5476 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-06 16:30:48,807 module.py:5329 INFO] -          ! cd /home/orangepi
[2025-03-06 16:30:48,808 module.py:5330 INFO] -          ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh
[2025-03-06 16:30:48,917 module.py:5476 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py
[2025-03-06 16:30:49,001 module.py:558 INFO] -   * mlcr get,python3
[2025-03-06 16:30:49,006 module.py:1270 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:30:49,008 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:30:49,008 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:30:49,140 module.py:558 INFO] -   * mlcr get,mlcommons,inference,src
[2025-03-06 16:30:49,147 module.py:1270 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_28dca458/mlc-cached-state.json
[2025-03-06 16:30:49,202 module.py:558 INFO] -   * mlcr get,sut,description
[2025-03-06 16:30:49,242 module.py:558 INFO] -     * mlcr detect,os
[2025-03-06 16:30:49,273 module.py:5329 INFO] -            ! cd /home/orangepi
[2025-03-06 16:30:49,273 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-06 16:30:49,333 module.py:5476 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-06 16:30:49,417 module.py:558 INFO] -     * mlcr detect,cpu
[2025-03-06 16:30:49,452 module.py:558 INFO] -       * mlcr detect,os
[2025-03-06 16:30:49,483 module.py:5329 INFO] -              ! cd /home/orangepi
[2025-03-06 16:30:49,483 module.py:5330 INFO] -              ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-06 16:30:49,543 module.py:5476 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-06 16:30:49,613 module.py:5329 INFO] -            ! cd /home/orangepi
[2025-03-06 16:30:49,614 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh
[2025-03-06 16:30:49,732 module.py:5476 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py
[2025-03-06 16:30:49,814 module.py:558 INFO] -     * mlcr get,python3
[2025-03-06 16:30:49,819 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:30:49,820 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:30:49,821 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:30:49,923 module.py:558 INFO] -     * mlcr get,compiler
[2025-03-06 16:30:49,927 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-llvm_2a16eab5/mlc-cached-state.json
[2025-03-06 16:30:50,426 module.py:558 INFO] -     * mlcr get,generic-python-lib,_package.dmiparser
[2025-03-06 16:30:50,488 module.py:558 INFO] -       * mlcr get,python3
[2025-03-06 16:30:50,493 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:30:50,495 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:30:50,496 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:30:50,499 module.py:5329 INFO] -            ! cd /home/orangepi
[2025-03-06 16:30:50,499 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-03-06 16:30:50,841 module.py:5476 INFO] -            ! call "detect_version" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 5.1
[2025-03-06 16:30:50,948 module.py:558 INFO] -       * mlcr get,python3
[2025-03-06 16:30:50,953 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:30:50,955 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:30:50,955 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:30:50,958 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-python-lib_c5ca3e80/mlc-cached-state.json
[2025-03-06 16:30:50,997 module.py:558 INFO] -     * mlcr get,cache,dir,_name.mlperf-inference-sut-descriptions
[2025-03-06 16:30:51,002 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-cache-dir_d1147d31/mlc-cached-state.json
Generating SUT description file for orangepi5plus-onnxruntime
[2025-03-06 16:30:51,041 module.py:5476 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-sut-description/customize.py
[2025-03-06 16:30:51,119 module.py:558 INFO] -   * mlcr get,mlperf,inference,results,dir,_version.r5.0-dev
[2025-03-06 16:30:51,124 module.py:1270 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_5b5d1028/mlc-cached-state.json
[2025-03-06 16:30:51,151 module.py:558 INFO] -   * mlcr install,pip-package,for-mlc-python,_package.tabulate
[2025-03-06 16:30:51,156 module.py:1270 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/install-pip-package-for-mlc-python_d13518ce/mlc-cached-state.json
[2025-03-06 16:30:51,187 module.py:558 INFO] -   * mlcr get,mlperf,inference,utils
[2025-03-06 16:30:51,323 module.py:558 INFO] -     * mlcr get,mlperf,inference,src
[2025-03-06 16:30:51,330 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_28dca458/mlc-cached-state.json
[2025-03-06 16:30:51,362 module.py:5476 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-utils/customize.py
Using MLCommons Inference source from /home/orangepi/MLC/repos/local/cache/get-git-repo_455c691e/inference

Running loadgen scenario: Offline and mode: performance
[2025-03-06 16:30:52,609 module.py:558 INFO] - * mlcr app,mlperf,inference,generic,_reference,_resnet50,_onnxruntime,_cpu,_valid,_r5.0-dev_default,_offline
[2025-03-06 16:30:52,673 module.py:558 INFO] -   * mlcr detect,os
[2025-03-06 16:30:52,708 module.py:5329 INFO] -          ! cd /home/orangepi
[2025-03-06 16:30:52,709 module.py:5330 INFO] -          ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-06 16:30:52,771 module.py:5476 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-06 16:30:52,860 module.py:558 INFO] -   * mlcr get,sys-utils-cm
[2025-03-06 16:30:52,900 module.py:558 INFO] -     * mlcr detect,os
[2025-03-06 16:30:52,933 module.py:5329 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:30:52,934 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-06 16:30:52,999 module.py:5476 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-06 16:30:53,123 module.py:558 INFO] -   * mlcr get,python
[2025-03-06 16:30:53,128 module.py:1270 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:30:53,130 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:30:53,130 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:30:53,265 module.py:558 INFO] -   * mlcr get,mlcommons,inference,src
[2025-03-06 16:30:53,273 module.py:1270 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_d9661778/mlc-cached-state.json
[2025-03-06 16:30:53,303 module.py:558 INFO] -   * mlcr get,mlperf,inference,utils
[2025-03-06 16:30:53,447 module.py:558 INFO] -     * mlcr get,mlperf,inference,src
[2025-03-06 16:30:53,455 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_d9661778/mlc-cached-state.json
[2025-03-06 16:30:53,488 module.py:5476 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-utils/customize.py
[2025-03-06 16:30:53,578 module.py:558 INFO] -   * mlcr get,dataset-aux,imagenet-aux
[2025-03-06 16:30:53,584 module.py:1270 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/get-dataset-imagenet-aux_06b55374/mlc-cached-state.json
[2025-03-06 16:30:54,559 module.py:558 INFO] -   * mlcr app,mlperf,reference,inference,_offline,_onnxruntime,_cpu,_resnet50,_fp32
[2025-03-06 16:30:54,628 module.py:558 INFO] -     * mlcr detect,os
[2025-03-06 16:30:54,665 module.py:5329 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:30:54,665 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-06 16:30:54,730 module.py:5476 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-06 16:30:54,819 module.py:558 INFO] -     * mlcr detect,cpu
[2025-03-06 16:30:54,858 module.py:558 INFO] -       * mlcr detect,os
[2025-03-06 16:30:54,891 module.py:5329 INFO] -              ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:30:54,892 module.py:5330 INFO] -              ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-06 16:30:54,956 module.py:5476 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-06 16:30:55,032 module.py:5329 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:30:55,033 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh
[2025-03-06 16:30:55,156 module.py:5476 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py
[2025-03-06 16:30:55,226 module.py:558 INFO] -     * mlcr get,sys-utils-cm
[2025-03-06 16:30:55,267 module.py:558 INFO] -       * mlcr detect,os
[2025-03-06 16:30:55,302 module.py:5329 INFO] -              ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:30:55,302 module.py:5330 INFO] -              ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-06 16:30:55,366 module.py:5476 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-06 16:30:55,504 module.py:558 INFO] -     * mlcr get,python
[2025-03-06 16:30:55,510 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:30:55,512 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:30:55,512 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:30:56,054 module.py:558 INFO] -     * mlcr get,generic-python-lib,_onnxruntime
[2025-03-06 16:30:56,127 module.py:558 INFO] -       * mlcr get,python3
[2025-03-06 16:30:56,132 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:30:56,134 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:30:56,134 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:30:56,138 module.py:5329 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:30:56,138 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-03-06 16:30:56,479 module.py:5476 INFO] -            ! call "detect_version" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 1.20.1
[2025-03-06 16:30:56,572 module.py:558 INFO] -       * mlcr get,python3
[2025-03-06 16:30:56,578 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:30:56,580 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:30:56,580 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:30:56,583 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-python-lib_b20ddd59/mlc-cached-state.json
[2025-03-06 16:30:56,779 module.py:558 INFO] -     * mlcr get,ml-model,image-classification,resnet50,raw,_onnx,_fp32
[2025-03-06 16:30:56,797 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-ml-model-resnet50_f98cae5e/mlc-cached-state.json
[2025-03-06 16:30:56,800 module.py:2218 INFO] - Path to the ML model: /home/orangepi/MLC/repos/local/cache/download-file_0b874fd8/resnet50_v1.onnx
[2025-03-06 16:30:57,003 module.py:558 INFO] -     * mlcr get,dataset,image-classification,imagenet,preprocessed,_-for.mobilenet,_NCHW,_full
[2025-03-06 16:30:57,017 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_ad351e75/mlc-cached-state.json
[2025-03-06 16:30:57,096 module.py:558 INFO] -     * mlcr get,dataset-aux,image-classification,imagenet-aux
[2025-03-06 16:30:57,103 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-dataset-imagenet-aux_06b55374/mlc-cached-state.json
[2025-03-06 16:30:57,175 module.py:558 INFO] -     * mlcr generate,user-conf,mlperf,inference
[2025-03-06 16:30:57,215 module.py:558 INFO] -       * mlcr detect,os
[2025-03-06 16:30:57,254 module.py:5329 INFO] -              ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:30:57,254 module.py:5330 INFO] -              ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-06 16:30:57,324 module.py:5476 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-06 16:30:57,406 module.py:558 INFO] -       * mlcr detect,cpu
[2025-03-06 16:30:57,447 module.py:558 INFO] -         * mlcr detect,os
[2025-03-06 16:30:57,485 module.py:5329 INFO] -                ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:30:57,485 module.py:5330 INFO] -                ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-06 16:30:57,555 module.py:5476 INFO] -                ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-06 16:30:57,636 module.py:5329 INFO] -              ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:30:57,637 module.py:5330 INFO] -              ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh
[2025-03-06 16:30:57,769 module.py:5476 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py
[2025-03-06 16:30:57,872 module.py:558 INFO] -       * mlcr get,python
[2025-03-06 16:30:57,878 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:30:57,880 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:30:57,881 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:30:58,018 module.py:558 INFO] -       * mlcr get,mlcommons,inference,src
[2025-03-06 16:30:58,028 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_d9661778/mlc-cached-state.json
[2025-03-06 16:30:58,085 module.py:558 INFO] -       * mlcr get,sut,configs
[2025-03-06 16:30:58,126 module.py:558 INFO] -         * mlcr get,cache,dir,_name.mlperf-inference-sut-configs
[2025-03-06 16:30:58,133 module.py:1270 INFO] -              ! load /home/orangepi/MLC/repos/local/cache/get-cache-dir_8203c130/mlc-cached-state.json
[2025-03-06 16:30:58,142 module.py:5476 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-sut-configs/customize.py
Using MLCommons Inference source from '/home/orangepi/MLC/repos/local/cache/get-git-repo_455c691e/inference'
Original configuration value 1.0 target_qps
Adjusted configuration value 1.01 target_qps
Output Dir: '/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_5b5d1028/valid_results/orangepi5plus-reference-cpu-onnxruntime-v1.20.1-default_config/resnet50/offline/performance/run_1'
resnet50.Offline.target_qps = 1.01

[2025-03-06 16:30:58,420 module.py:558 INFO] -     * mlcr get,loadgen
[2025-03-06 16:30:58,429 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-loadgen_2eabbbbb/mlc-cached-state.json
[2025-03-06 16:30:58,433 module.py:2218 INFO] - Path to the tool: /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-loadgen_2eabbbbb/install
[2025-03-06 16:30:58,577 module.py:558 INFO] -     * mlcr get,mlcommons,inference,src
[2025-03-06 16:30:58,587 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_d9661778/mlc-cached-state.json
[2025-03-06 16:30:58,733 module.py:558 INFO] -     * mlcr get,mlcommons,inference,src
[2025-03-06 16:30:58,742 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_28dca458/mlc-cached-state.json
[2025-03-06 16:30:59,244 module.py:558 INFO] -     * mlcr get,generic-python-lib,_package.psutil
[2025-03-06 16:30:59,312 module.py:558 INFO] -       * mlcr get,python3
[2025-03-06 16:30:59,318 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:30:59,320 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:30:59,320 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:30:59,326 module.py:5329 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:30:59,326 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-03-06 16:30:59,681 module.py:5476 INFO] -            ! call "detect_version" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 7.0.0
[2025-03-06 16:30:59,796 module.py:558 INFO] -       * mlcr get,python3
[2025-03-06 16:30:59,802 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:30:59,805 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:30:59,805 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:30:59,809 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-python-lib_d389aaea/mlc-cached-state.json
[2025-03-06 16:31:00,311 module.py:558 INFO] -     * mlcr get,generic-python-lib,_opencv-python
[2025-03-06 16:31:00,379 module.py:558 INFO] -       * mlcr get,python3
[2025-03-06 16:31:00,385 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:31:00,387 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:31:00,388 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:31:00,393 module.py:5329 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:31:00,393 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-03-06 16:31:00,747 module.py:5476 INFO] -            ! call "detect_version" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 4.11.0.86
[2025-03-06 16:31:00,860 module.py:558 INFO] -       * mlcr get,python3
[2025-03-06 16:31:00,866 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:31:00,869 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:31:00,869 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:31:00,873 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-python-lib_b5037291/mlc-cached-state.json
[2025-03-06 16:31:01,371 module.py:558 INFO] -     * mlcr get,generic-python-lib,_numpy
[2025-03-06 16:31:01,440 module.py:558 INFO] -       * mlcr get,python3
[2025-03-06 16:31:01,445 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:31:01,448 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:31:01,448 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:31:01,454 module.py:5329 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:31:01,454 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-03-06 16:31:01,831 module.py:5476 INFO] -            ! call "detect_version" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 1.26.4
[2025-03-06 16:31:01,941 module.py:558 INFO] -       * mlcr get,python3
[2025-03-06 16:31:01,947 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:31:01,950 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:31:01,950 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:31:01,954 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-python-lib_cd082116/mlc-cached-state.json
[2025-03-06 16:31:02,453 module.py:558 INFO] -     * mlcr get,generic-python-lib,_pycocotools
[2025-03-06 16:31:02,520 module.py:558 INFO] -       * mlcr get,python3
[2025-03-06 16:31:02,526 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:31:02,528 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:31:02,529 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:31:02,534 module.py:5329 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:31:02,534 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-03-06 16:31:02,885 module.py:5476 INFO] -            ! call "detect_version" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 2.0.8
[2025-03-06 16:31:03,001 module.py:558 INFO] -       * mlcr get,python3
[2025-03-06 16:31:03,007 module.py:1270 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_b40f8788/mlc-cached-state.json
[2025-03-06 16:31:03,010 module.py:2218 INFO] - Path to Python: /home/orangepi/mlc/bin/python3
[2025-03-06 16:31:03,010 module.py:2218 INFO] - Python version: 3.10.12
[2025-03-06 16:31:03,014 module.py:1270 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-python-lib_e2e50d3c/mlc-cached-state.json
Using MLCommons Inference source from '/home/orangepi/MLC/repos/local/cache/get-git-repo_455c691e/inference'
[2025-03-06 16:31:03,075 module.py:5476 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/app-mlperf-inference-mlcommons-python/customize.py
[2025-03-06 16:31:03,127 module.py:558 INFO] -   * mlcr benchmark-mlperf
[2025-03-06 16:31:03,175 module.py:5476 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/benchmark-program-mlperf/customize.py
[2025-03-06 16:31:03,261 module.py:558 INFO] -   * mlcr benchmark-program,program
[2025-03-06 16:31:03,305 module.py:558 INFO] -     * mlcr detect,cpu
[2025-03-06 16:31:03,354 module.py:558 INFO] -       * mlcr detect,os
[2025-03-06 16:31:03,407 module.py:5329 INFO] -              ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:31:03,408 module.py:5330 INFO] -              ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-06 16:31:03,496 module.py:5476 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-06 16:31:03,594 module.py:5329 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:31:03,594 module.py:5330 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh
[2025-03-06 16:31:03,755 module.py:5476 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py
***************************************************************************
CM script::benchmark-program/run.sh

Run Directory: /home/orangepi/MLC/repos/local/cache/get-git-repo_455c691e/inference/vision/classification_and_detection

CMD: ./run_local.sh onnxruntime resnet50 cpu --scenario Offline    --threads 8 --user_conf '/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/954c6df540524a85923ad4060ad036bc.conf' --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_ad351e75 --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_b98bd951/val.txt 2>&1 | tee '/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_5b5d1028/valid_results/orangepi5plus-reference-cpu-onnxruntime-v1.20.1-default_config/resnet50/offline/performance/run_1/console.out'; echo \${PIPESTATUS[0]} > exitstatus

[2025-03-06 16:31:03,836 module.py:5329 INFO] -          ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_ba8cc059
[2025-03-06 16:31:03,837 module.py:5330 INFO] -          ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/benchmark-program/run-ubuntu.sh from tmp-run.sh

./run_local.sh onnxruntime resnet50 cpu --scenario Offline    --threads 8 --user_conf '/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/954c6df540524a85923ad4060ad036bc.conf' --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_ad351e75 --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_b98bd951/val.txt 2>&1 | tee '/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_5b5d1028/valid_results/orangepi5plus-reference-cpu-onnxruntime-v1.20.1-default_config/resnet50/offline/performance/run_1/console.out'; echo ${PIPESTATUS[0]} > exitstatus
python3 python/main.py --profile resnet50-onnxruntime --model "/home/orangepi/MLC/repos/local/cache/download-file_0b874fd8/resnet50_v1.onnx" --dataset-path /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_ad351e75 --output "/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_5b5d1028/valid_results/orangepi5plus-reference-cpu-onnxruntime-v1.20.1-default_config/resnet50/offline/performance/run_1" --scenario Offline --threads 8 --user_conf /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/954c6df540524a85923ad4060ad036bc.conf --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_ad351e75 --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_b98bd951/val.txt
INFO:main:Namespace(dataset='imagenet', dataset_path='/home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_ad351e75', dataset_list='/home/orangepi/MLC/repos/local/cache/extract-file_b98bd951/val.txt', data_format=None, profile='resnet50-onnxruntime', scenario='Offline', max_batchsize=32, model='/home/orangepi/MLC/repos/local/cache/download-file_0b874fd8/resnet50_v1.onnx', output='/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_5b5d1028/valid_results/orangepi5plus-reference-cpu-onnxruntime-v1.20.1-default_config/resnet50/offline/performance/run_1', inputs=None, outputs=['ArgMax:0'], backend='onnxruntime', device=None, model_name='resnet50', threads=8, qps=None, cache=0, cache_dir='/home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_ad351e75', preprocessed_dir=None, use_preprocessed_dataset=True, accuracy=False, find_peak_performance=False, debug=False, user_conf='/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/954c6df540524a85923ad4060ad036bc.conf', audit_conf='audit.config', time=None, count=None, performance_sample_count=None, max_latency=None, samples_per_query=8)
INFO:imagenet:Loading 50000 preprocessed images using 8 threads
INFO:imagenet:loaded 50000 images, cache=0, already_preprocessed=True, took=3.7sec
2025-03-06 16:31:08.749502348 [E:onnxruntime:Default, env.cc:234 ThreadMain] pthread_setaffinity_np failed for thread: 8002, index: 0, mask: {6, }, error code: 22 error msg: Invalid argument. Specify the number of threads explicitly so the affinity is not set.
2025-03-06 16:31:08.749659848 [E:onnxruntime:Default, env.cc:234 ThreadMain] pthread_setaffinity_np failed for thread: 8003, index: 1, mask: {5, }, error code: 22 error msg: Invalid argument. Specify the number of threads explicitly so the affinity is not set.
2025-03-06 16:31:08.749984765 [E:onnxruntime:Default, env.cc:234 ThreadMain] pthread_setaffinity_np failed for thread: 8004, index: 2, mask: {4, }, error code: 22 error msg: Invalid argument. Specify the number of threads explicitly so the affinity is not set.
/opt/rh/gcc-toolset-12/root/usr/include/c++/12/bits/stl_vector.h:1123: std::vector<_Tp, _Alloc>::reference std::vector<_Tp, _Alloc>::operator[](size_type) [with _Tp = unsigned int; _Alloc = std::allocator<unsigned int>; reference = unsigned int&; size_type = long unsigned int]: Assertion '__n < this->size()' failed.
./run_local.sh: line 30:  7990 Aborted                 (core dumped) python3 python/main.py --profile resnet50-onnxruntime --model "/home/orangepi/MLC/repos/local/cache/download-file_0b874fd8/resnet50_v1.onnx" --dataset-path /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_ad351e75 --output "/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_5b5d1028/valid_results/orangepi5plus-reference-cpu-onnxruntime-v1.20.1-default_config/resnet50/offline/performance/run_1" --scenario Offline --threads 8 --user_conf /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/954c6df540524a85923ad4060ad036bc.conf --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_ad351e75 --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_b98bd951/val.txt
Traceback (most recent call last):
  File "/home/orangepi/mlc/bin/mlcr", line 8, in <module>
    sys.exit(mlcr())
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/main.py", line 73, in mlcr
    main()
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/main.py", line 160, in main
    res = method(run_args)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 121, in call_script_module_function
    result = automation_instance.run(run_args)  # Pass args to the run method
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 225, in run
    r = self._run(i)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 1770, in _run
    r = customize_code.preprocess(ii)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/run-mlperf-inference-app/customize.py", line 284, in preprocess
    r = mlc.access(ii)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/action.py", line 56, in access
    result = method(options)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 121, in call_script_module_function
    result = automation_instance.run(run_args)  # Pass args to the run method
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 225, in run
    r = self._run(i)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 1840, in _run
    r = self._call_run_deps(prehook_deps, self.local_env_keys, local_env_keys_from_meta, env, state, const, const_state, add_deps_recursive,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3527, in _call_run_deps
    r = script._run_deps(deps, local_env_keys, env, state, const, const_state, add_deps_recursive, recursion_spaces,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3697, in _run_deps
    r = self.action_object.access(ii)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/action.py", line 56, in access
    result = method(options)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 121, in call_script_module_function
    result = automation_instance.run(run_args)  # Pass args to the run method
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 225, in run
    r = self._run(i)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 1856, in _run
    r = prepare_and_run_script_with_postprocessing(
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 5483, in prepare_and_run_script_with_postprocessing
    r = script_automation._call_run_deps(posthook_deps, local_env_keys, local_env_keys_from_meta, env, state, const, const_state,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3527, in _call_run_deps
    r = script._run_deps(deps, local_env_keys, env, state, const, const_state, add_deps_recursive, recursion_spaces,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3697, in _run_deps
    r = self.action_object.access(ii)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/action.py", line 56, in access
    result = method(options)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 121, in call_script_module_function
    result = automation_instance.run(run_args)  # Pass args to the run method
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 225, in run
    r = self._run(i)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 1883, in _run
    r = self._run_deps(post_deps, clean_env_keys_post_deps, env, state, const, const_state, add_deps_recursive, recursion_spaces,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3697, in _run_deps
    r = self.action_object.access(ii)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/action.py", line 56, in access
    result = method(options)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 131, in call_script_module_function
    raise ScriptExecutionError(f"Script {function_name} execution failed. Error : {error}")
mlc.script_action.ScriptExecutionError: Script run execution failed. Error : MLC script failed (name = benchmark-program, return code = 34304)


^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Please file an issue at https://github.com/mlcommons/mlperf-automations/issues along with the full MLC command being run and the relevant
or full console log.
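
For what it's worth, the onnxruntime errors above ("Specify the number of threads explicitly so the affinity is not set") suggest that when the intra-op thread count is left at its default, onnxruntime derives it from the detected core count and also tries to pin its worker threads to cores that are now offline (mask {4,}, {5,}, {6,}). Setting the thread count explicitly avoids that default affinity. Here is a standalone sketch of plain onnxruntime API usage (not the MLC automation), with the model path taken from the log above:

# Standalone sketch: open the ResNet50 model with an explicit 4-thread CPU
# session so onnxruntime does not apply its default core affinity.
import onnxruntime as ort

opts = ort.SessionOptions()
opts.intra_op_num_threads = 4   # match the number of online cores
opts.inter_op_num_threads = 1

sess = ort.InferenceSession(
    "/home/orangepi/MLC/repos/local/cache/download-file_0b874fd8/resnet50_v1.onnx",
    sess_options=opts,
    providers=["CPUExecutionProvider"],
)
print("Inputs:", [i.name for i in sess.get_inputs()])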

sujik18 (Member) commented Mar 7, 2025

Hi @PapperYZ, can you try rerunning the command with the flag --threads=4?

arjunsuresh (Collaborator) commented:

@PapperYZ, as @sujik18 noted above, --threads=4 is the correct usage. But I'm not sure all the MLPerf implementations correctly respect this parameter. Can you please share what you're trying to do?
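One way to confirm whether the value actually reaches the reference harness is to look at the --threads value echoed in the benchmark command and in the printed Namespace inside console.out. A quick standalone sketch (the path below is the run_1 directory from your log, so adjust it to your run):

# Quick check (sketch): scan the benchmark console log for the thread count
# that actually reached the reference harness.
import re
from pathlib import Path

console = Path("/home/orangepi/MLC/repos/local/cache/"
               "get-mlperf-inference-results-dir_5b5d1028/valid_results/"
               "orangepi5plus-reference-cpu-onnxruntime-v1.20.1-default_config/"
               "resnet50/offline/performance/run_1/console.out")
for match in re.finditer(r"--threads\s+\d+|threads=\d+", console.read_text()):
    print(match.group(0))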

PapperYZ (Author) commented:

Hi @arjunsuresh, we have a custom ASIC whose core structure is similar to the 4 small cores in the Orange Pi 5 Plus system, so we want to compare them by turning off the 4 big cores on the Orange Pi and running exactly the same benchmark, to see whether the custom ASIC delivers the same performance. Do you see a good way of supporting that?

PapperYZ (Author) commented:

Hi @sujik18, I tried using --threads=4 --rerun, but it still uses the same 8 threads and fails.
@arjunsuresh, could you help make an adjustment so that --threads=4 applies globally?
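In the meantime, the workaround I am considering is to restrict the whole mlcr process tree to the four online cores from outside, so everything it launches inherits the 4-core mask. A minimal Linux-only sketch (this may not be enough on its own, since the affinity errors come from onnxruntime's own default thread-pool setup):

# Sketch of an external workaround: pin this process (and everything it spawns)
# to CPUs 0-3, then launch the benchmark command from the failing run.
import os
import subprocess

os.sched_setaffinity(0, {0, 1, 2, 3})    # inheritable 4-core mask
os.environ["OMP_NUM_THREADS"] = "4"      # hint for OpenMP-based code paths

subprocess.run([
    "mlcr", "run-mlperf,inference,_full,_r5.0-dev",
    "--model=resnet50", "--implementation=reference", "--framework=onnxruntime",
    "--category=edge", "--scenario=Offline", "--execution_mode=test",
    "--device=cpu", "--quiet", "--test_query_count=1000",
    "--threads=4", "--rerun",
], check=True)

The full command and log from the plain --threads=4 --rerun attempt are below: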

(mlc) orangepi@orangepi5plus:~/work$ mlcr run-mlperf,inference,_full,_r5.0-dev    --model=resnet50    --implementation=reference    --framework=onnxruntime    --category=edge    --scenario=Offline    --execution_mode=test    --device=cpu    --quiet    --test_query_count=1000 --threads=4 --rerun
[2025-03-13 16:09:39,703 module.py:557 INFO] - * mlcr run-mlperf,inference,_full,_r5.0-dev
[2025-03-13 16:09:39,744 module.py:557 INFO] -   * mlcr detect,os
[2025-03-13 16:09:39,775 module.py:5106 INFO] -          ! cd /home/orangepi/work
[2025-03-13 16:09:39,776 module.py:5107 INFO] -          ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-13 16:09:39,842 module.py:5253 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-13 16:09:39,920 module.py:557 INFO] -   * mlcr detect,cpu
[2025-03-13 16:09:39,953 module.py:557 INFO] -     * mlcr detect,os
[2025-03-13 16:09:39,985 module.py:5106 INFO] -            ! cd /home/orangepi/work
[2025-03-13 16:09:39,986 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-13 16:09:40,052 module.py:5253 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-13 16:09:40,121 module.py:5106 INFO] -          ! cd /home/orangepi/work
[2025-03-13 16:09:40,121 module.py:5107 INFO] -          ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh
[2025-03-13 16:09:40,239 module.py:5253 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py
[2025-03-13 16:09:40,330 module.py:557 INFO] -   * mlcr get,python3
[2025-03-13 16:09:40,335 module.py:1269 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:40,338 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:40,338 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:40,483 module.py:557 INFO] -   * mlcr get,mlcommons,inference,src
[2025-03-13 16:09:40,490 module.py:1269 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_9d8d3a8e/mlc-cached-state.json
[2025-03-13 16:09:40,548 module.py:557 INFO] -   * mlcr get,sut,description
[2025-03-13 16:09:40,591 module.py:557 INFO] -     * mlcr detect,os
[2025-03-13 16:09:40,625 module.py:5106 INFO] -            ! cd /home/orangepi/work
[2025-03-13 16:09:40,626 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-13 16:09:40,696 module.py:5253 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-13 16:09:40,779 module.py:557 INFO] -     * mlcr detect,cpu
[2025-03-13 16:09:40,816 module.py:557 INFO] -       * mlcr detect,os
[2025-03-13 16:09:40,849 module.py:5106 INFO] -              ! cd /home/orangepi/work
[2025-03-13 16:09:40,850 module.py:5107 INFO] -              ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-13 16:09:40,921 module.py:5253 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-13 16:09:40,996 module.py:5106 INFO] -            ! cd /home/orangepi/work
[2025-03-13 16:09:40,997 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh
[2025-03-13 16:09:41,125 module.py:5253 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py
[2025-03-13 16:09:41,216 module.py:557 INFO] -     * mlcr get,python3
[2025-03-13 16:09:41,221 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:41,223 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:41,223 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:41,355 module.py:557 INFO] -     * mlcr get,compiler
[2025-03-13 16:09:41,360 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-llvm_4a3cf0d3/mlc-cached-state.json
[2025-03-13 16:09:41,855 module.py:557 INFO] -     * mlcr get,generic-python-lib,_package.dmiparser
[2025-03-13 16:09:41,929 module.py:557 INFO] -       * mlcr get,python3
[2025-03-13 16:09:41,935 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:41,937 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:41,938 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:41,942 module.py:5106 INFO] -            ! cd /home/orangepi/work
[2025-03-13 16:09:41,942 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-03-13 16:09:42,289 module.py:5253 INFO] -            ! call "detect_version" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 5.1
[2025-03-13 16:09:42,399 module.py:557 INFO] -       * mlcr get,python3
[2025-03-13 16:09:42,405 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:42,407 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:42,407 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:42,410 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-python-lib_7e602518/mlc-cached-state.json
[2025-03-13 16:09:42,449 module.py:557 INFO] -     * mlcr get,cache,dir,_name.mlperf-inference-sut-descriptions
[2025-03-13 16:09:42,457 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-cache-dir_d8207d77/mlc-cached-state.json
Generating SUT description file for orangepi5plus-onnxruntime
[2025-03-13 16:09:42,503 module.py:5253 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-sut-description/customize.py
[2025-03-13 16:09:42,596 module.py:557 INFO] -   * mlcr get,mlperf,inference,results,dir,_version.r5.0-dev
[2025-03-13 16:09:42,603 module.py:1269 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_aa85ec32/mlc-cached-state.json
[2025-03-13 16:09:42,632 module.py:557 INFO] -   * mlcr install,pip-package,for-mlc-python,_package.tabulate
[2025-03-13 16:09:42,639 module.py:1269 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/install-pip-package-for-mlc-python_ab427f56/mlc-cached-state.json
[2025-03-13 16:09:42,676 module.py:557 INFO] -   * mlcr get,mlperf,inference,utils
[2025-03-13 16:09:42,815 module.py:557 INFO] -     * mlcr get,mlperf,inference,src
[2025-03-13 16:09:42,822 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_9d8d3a8e/mlc-cached-state.json
[2025-03-13 16:09:42,857 module.py:5253 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-utils/customize.py
Using MLCommons Inference source from /home/orangepi/MLC/repos/local/cache/get-git-repo_0717a341/inference

Running loadgen scenario: Offline and mode: performance
[2025-03-13 16:09:44,079 module.py:557 INFO] - * mlcr app,mlperf,inference,generic,_reference,_resnet50,_onnxruntime,_cpu,_test,_r5.0-dev_default,_offline
[2025-03-13 16:09:44,147 module.py:557 INFO] -   * mlcr detect,os
[2025-03-13 16:09:44,184 module.py:5106 INFO] -          ! cd /home/orangepi/work
[2025-03-13 16:09:44,185 module.py:5107 INFO] -          ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-13 16:09:44,259 module.py:5253 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-13 16:09:44,342 module.py:557 INFO] -   * mlcr get,sys-utils-cm
[2025-03-13 16:09:44,450 module.py:557 INFO] -     * mlcr detect,os
[2025-03-13 16:09:44,485 module.py:5106 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:44,486 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-13 16:09:44,559 module.py:5253 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-13 16:09:44,694 module.py:557 INFO] -   * mlcr get,python
[2025-03-13 16:09:44,699 module.py:1269 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:44,701 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:44,701 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:44,835 module.py:557 INFO] -   * mlcr get,mlcommons,inference,src
[2025-03-13 16:09:44,844 module.py:1269 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_6fba5e8c/mlc-cached-state.json
[2025-03-13 16:09:44,877 module.py:557 INFO] -   * mlcr get,mlperf,inference,utils
[2025-03-13 16:09:45,026 module.py:557 INFO] -     * mlcr get,mlperf,inference,src
[2025-03-13 16:09:45,036 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_6fba5e8c/mlc-cached-state.json
[2025-03-13 16:09:45,071 module.py:5253 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-utils/customize.py
[2025-03-13 16:09:45,168 module.py:557 INFO] -   * mlcr get,dataset-aux,imagenet-aux
[2025-03-13 16:09:45,174 module.py:1269 INFO] -        ! load /home/orangepi/MLC/repos/local/cache/get-dataset-imagenet-aux_ce06eb83/mlc-cached-state.json
[2025-03-13 16:09:46,130 module.py:557 INFO] -   * mlcr app,mlperf,reference,inference,_onnxruntime,_resnet50,_offline,_cpu,_fp32
[2025-03-13 16:09:46,195 module.py:557 INFO] -     * mlcr detect,os
[2025-03-13 16:09:46,235 module.py:5106 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:46,235 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-13 16:09:46,308 module.py:5253 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-13 16:09:46,402 module.py:557 INFO] -     * mlcr detect,cpu
[2025-03-13 16:09:46,442 module.py:557 INFO] -       * mlcr detect,os
[2025-03-13 16:09:46,480 module.py:5106 INFO] -              ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:46,481 module.py:5107 INFO] -              ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-13 16:09:46,555 module.py:5253 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-13 16:09:46,636 module.py:5106 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:46,637 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh
[2025-03-13 16:09:46,770 module.py:5253 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py
[2025-03-13 16:09:46,843 module.py:557 INFO] -     * mlcr get,sys-utils-cm
[2025-03-13 16:09:46,949 module.py:557 INFO] -       * mlcr detect,os
[2025-03-13 16:09:46,986 module.py:5106 INFO] -              ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:46,986 module.py:5107 INFO] -              ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-13 16:09:47,063 module.py:5253 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-13 16:09:47,204 module.py:557 INFO] -     * mlcr get,python
[2025-03-13 16:09:47,210 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:47,212 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:47,212 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:47,758 module.py:557 INFO] -     * mlcr get,generic-python-lib,_onnxruntime
[2025-03-13 16:09:47,819 module.py:557 INFO] -       * mlcr get,python3
[2025-03-13 16:09:47,826 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:47,828 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:47,828 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:47,832 module.py:5106 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:47,832 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-03-13 16:09:48,187 module.py:5253 INFO] -            ! call "detect_version" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 1.21.0
[2025-03-13 16:09:48,286 module.py:557 INFO] -       * mlcr get,python3
[2025-03-13 16:09:48,292 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:48,294 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:48,294 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:48,298 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-python-lib_50fb793e/mlc-cached-state.json
[2025-03-13 16:09:48,493 module.py:557 INFO] -     * mlcr get,ml-model,image-classification,resnet50,raw,_onnx,_fp32
[2025-03-13 16:09:48,515 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-ml-model-resnet50_07e46fa9/mlc-cached-state.json
[2025-03-13 16:09:48,518 module.py:2221 INFO] - Path to the ML model: /home/orangepi/MLC/repos/local/cache/download-file_fcdf56f8/resnet50_v1.onnx
[2025-03-13 16:09:48,724 module.py:557 INFO] -     * mlcr get,dataset,image-classification,imagenet,preprocessed,_-for.mobilenet,_full,_NCHW
[2025-03-13 16:09:48,740 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_f462e678/mlc-cached-state.json
[2025-03-13 16:09:48,816 module.py:557 INFO] -     * mlcr get,dataset-aux,image-classification,imagenet-aux
[2025-03-13 16:09:48,824 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-dataset-imagenet-aux_ce06eb83/mlc-cached-state.json
[2025-03-13 16:09:48,899 module.py:557 INFO] -     * mlcr generate,user-conf,mlperf,inference
[2025-03-13 16:09:48,943 module.py:557 INFO] -       * mlcr detect,os
[2025-03-13 16:09:48,990 module.py:5106 INFO] -              ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:48,991 module.py:5107 INFO] -              ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-13 16:09:49,071 module.py:5253 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-13 16:09:49,166 module.py:557 INFO] -       * mlcr detect,cpu
[2025-03-13 16:09:49,208 module.py:557 INFO] -         * mlcr detect,os
[2025-03-13 16:09:49,251 module.py:5106 INFO] -                ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:49,252 module.py:5107 INFO] -                ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-13 16:09:49,330 module.py:5253 INFO] -                ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-13 16:09:49,416 module.py:5106 INFO] -              ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:49,417 module.py:5107 INFO] -              ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh
[2025-03-13 16:09:49,559 module.py:5253 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py
[2025-03-13 16:09:49,664 module.py:557 INFO] -       * mlcr get,python
[2025-03-13 16:09:49,670 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:49,672 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:49,673 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:49,811 module.py:557 INFO] -       * mlcr get,mlcommons,inference,src
[2025-03-13 16:09:49,823 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_6fba5e8c/mlc-cached-state.json
[2025-03-13 16:09:49,890 module.py:557 INFO] -       * mlcr get,sut,configs
[2025-03-13 16:09:49,944 module.py:557 INFO] -         * mlcr get,cache,dir,_name.mlperf-inference-sut-configs
[2025-03-13 16:09:49,955 module.py:1269 INFO] -              ! load /home/orangepi/MLC/repos/local/cache/get-cache-dir_12468dd9/mlc-cached-state.json
[2025-03-13 16:09:49,966 module.py:5253 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-sut-configs/customize.py
Using MLCommons Inference source from '/home/orangepi/MLC/repos/local/cache/get-git-repo_0717a341/inference'
Original configuration value 1.0 target_qps
Adjusted configuration value 1.01 target_qps
Output Dir: '/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_aa85ec32/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1'
resnet50.Offline.target_qps = 1.0
resnet50.Offline.max_query_count = 1000
resnet50.Offline.min_query_count = 1000
resnet50.Offline.min_duration = 0
resnet50.Offline.sample_concatenate_permutation = 0

[2025-03-13 16:09:50,257 module.py:557 INFO] -     * mlcr get,loadgen
[2025-03-13 16:09:50,267 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-loadgen_c9601613/mlc-cached-state.json
[2025-03-13 16:09:50,271 module.py:2221 INFO] - Path to the tool: /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-loadgen_c9601613/install
[2025-03-13 16:09:50,420 module.py:557 INFO] -     * mlcr get,mlcommons,inference,src
[2025-03-13 16:09:50,433 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_6fba5e8c/mlc-cached-state.json
[2025-03-13 16:09:50,587 module.py:557 INFO] -     * mlcr get,mlcommons,inference,src
[2025-03-13 16:09:50,597 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-mlperf-inference-src_9d8d3a8e/mlc-cached-state.json
[2025-03-13 16:09:51,094 module.py:557 INFO] -     * mlcr get,generic-python-lib,_package.psutil
[2025-03-13 16:09:51,163 module.py:557 INFO] -       * mlcr get,python3
[2025-03-13 16:09:51,170 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:51,173 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:51,173 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:51,179 module.py:5106 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:51,179 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-03-13 16:09:51,542 module.py:5253 INFO] -            ! call "detect_version" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 7.0.0
[2025-03-13 16:09:51,662 module.py:557 INFO] -       * mlcr get,python3
[2025-03-13 16:09:51,669 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:51,671 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:51,672 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:51,676 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-python-lib_a5adce9b/mlc-cached-state.json
[2025-03-13 16:09:52,175 module.py:557 INFO] -     * mlcr get,generic-python-lib,_opencv-python
[2025-03-13 16:09:52,242 module.py:557 INFO] -       * mlcr get,python3
[2025-03-13 16:09:52,249 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:52,252 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:52,252 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:52,258 module.py:5106 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:52,258 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-03-13 16:09:52,617 module.py:5253 INFO] -            ! call "detect_version" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 4.11.0.86
[2025-03-13 16:09:52,723 module.py:557 INFO] -       * mlcr get,python3
[2025-03-13 16:09:52,729 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:52,732 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:52,732 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:52,737 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-python-lib_02332b65/mlc-cached-state.json
[2025-03-13 16:09:53,284 module.py:557 INFO] -     * mlcr get,generic-sys-util,_libgl
[2025-03-13 16:09:53,296 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-sys-util_947d3f25/mlc-cached-state.json
[2025-03-13 16:09:53,790 module.py:557 INFO] -     * mlcr get,generic-python-lib,_numpy
[2025-03-13 16:09:53,862 module.py:557 INFO] -       * mlcr get,python3
[2025-03-13 16:09:53,869 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:53,871 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:53,872 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:53,877 module.py:5106 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:53,878 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-03-13 16:09:54,266 module.py:5253 INFO] -            ! call "detect_version" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 1.26.4
[2025-03-13 16:09:54,377 module.py:557 INFO] -       * mlcr get,python3
[2025-03-13 16:09:54,384 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:54,387 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:54,387 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:54,391 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-python-lib_73c93d20/mlc-cached-state.json
[2025-03-13 16:09:54,895 module.py:557 INFO] -     * mlcr get,generic-python-lib,_pycocotools
[2025-03-13 16:09:54,964 module.py:557 INFO] -       * mlcr get,python3
[2025-03-13 16:09:54,970 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:54,973 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:54,974 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:54,979 module.py:5106 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:54,980 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-03-13 16:09:55,341 module.py:5253 INFO] -            ! call "detect_version" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
          Detected version: 2.0.8
[2025-03-13 16:09:55,459 module.py:557 INFO] -       * mlcr get,python3
[2025-03-13 16:09:55,465 module.py:1269 INFO] -            ! load /home/orangepi/MLC/repos/local/cache/get-python3_347946f9/mlc-cached-state.json
[2025-03-13 16:09:55,468 module.py:2221 INFO] - Path to Python: /home/orangepi/work/mlc/bin/python3
[2025-03-13 16:09:55,469 module.py:2221 INFO] - Python version: 3.10.12
[2025-03-13 16:09:55,473 module.py:1269 INFO] -          ! load /home/orangepi/MLC/repos/local/cache/get-generic-python-lib_f007b752/mlc-cached-state.json
Using MLCommons Inference source from '/home/orangepi/MLC/repos/local/cache/get-git-repo_0717a341/inference'
[2025-03-13 16:09:55,540 module.py:5253 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/app-mlperf-inference-mlcommons-python/customize.py
[2025-03-13 16:09:55,607 module.py:557 INFO] -   * mlcr benchmark-mlperf
[2025-03-13 16:09:55,661 module.py:5253 INFO] -          ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/benchmark-program-mlperf/customize.py
[2025-03-13 16:09:55,758 module.py:557 INFO] -   * mlcr benchmark-program,program
[2025-03-13 16:09:55,803 module.py:557 INFO] -     * mlcr detect,cpu
[2025-03-13 16:09:55,857 module.py:557 INFO] -       * mlcr detect,os
[2025-03-13 16:09:55,910 module.py:5106 INFO] -              ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:55,911 module.py:5107 INFO] -              ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-03-13 16:09:56,007 module.py:5253 INFO] -              ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-03-13 16:09:56,114 module.py:5106 INFO] -            ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:56,114 module.py:5107 INFO] -            ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh
[2025-03-13 16:09:56,285 module.py:5253 INFO] -            ! call "postprocess" from /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py
***************************************************************************
CM script::benchmark-program/run.sh

Run Directory: /home/orangepi/MLC/repos/local/cache/get-git-repo_0717a341/inference/vision/classification_and_detection

CMD: ./run_local.sh onnxruntime resnet50 cpu --scenario Offline    --threads 4 --user_conf '/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/e04d6880b67b4ad285412bc8569d3971.conf' --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_f462e678 --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_9cdedb0f/val.txt 2>&1 | tee '/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_aa85ec32/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1/console.out'; echo \${PIPESTATUS[0]} > exitstatus

[2025-03-13 16:09:56,372 module.py:5106 INFO] -          ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_246cb44f
[2025-03-13 16:09:56,373 module.py:5107 INFO] -          ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/benchmark-program/run-ubuntu.sh from tmp-run.sh

./run_local.sh onnxruntime resnet50 cpu --scenario Offline    --threads 4 --user_conf '/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/e04d6880b67b4ad285412bc8569d3971.conf' --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_f462e678 --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_9cdedb0f/val.txt 2>&1 | tee '/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_aa85ec32/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1/console.out'; echo ${PIPESTATUS[0]} > exitstatus
python3 python/main.py --profile resnet50-onnxruntime --model "/home/orangepi/MLC/repos/local/cache/download-file_fcdf56f8/resnet50_v1.onnx" --dataset-path /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_f462e678 --output "/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_aa85ec32/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1" --scenario Offline --threads 4 --user_conf /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/e04d6880b67b4ad285412bc8569d3971.conf --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_f462e678 --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_9cdedb0f/val.txt
INFO:main:Namespace(dataset='imagenet', dataset_path='/home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_f462e678', dataset_list='/home/orangepi/MLC/repos/local/cache/extract-file_9cdedb0f/val.txt', data_format=None, profile='resnet50-onnxruntime', scenario='Offline', max_batchsize=32, model='/home/orangepi/MLC/repos/local/cache/download-file_fcdf56f8/resnet50_v1.onnx', output='/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_aa85ec32/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1', inputs=None, outputs=['ArgMax:0'], backend='onnxruntime', device=None, model_name='resnet50', threads=4, qps=None, cache=0, cache_dir='/home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_f462e678', preprocessed_dir=None, use_preprocessed_dataset=True, accuracy=False, find_peak_performance=False, debug=False, user_conf='/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/e04d6880b67b4ad285412bc8569d3971.conf', audit_conf='audit.config', time=None, count=None, performance_sample_count=None, max_latency=None, samples_per_query=8)
/opt/rh/gcc-toolset-14/root/usr/include/c++/14/bits/stl_vector.h:1130: std::vector<_Tp, _Alloc>::reference std::vector<_Tp, _Alloc>::operator[](size_type) [with _Tp = unsigned int; _Alloc = std::allocator<unsigned int>; reference = unsigned int&; size_type = long unsigned int]: Assertion '__n < this->size()' failed.
./run_local.sh: line 30: 25363 Aborted                 (core dumped) python3 python/main.py --profile resnet50-onnxruntime --model "/home/orangepi/MLC/repos/local/cache/download-file_fcdf56f8/resnet50_v1.onnx" --dataset-path /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_f462e678 --output "/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_aa85ec32/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1" --scenario Offline --threads 4 --user_conf /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/e04d6880b67b4ad285412bc8569d3971.conf --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_f462e678 --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_9cdedb0f/val.txt
Traceback (most recent call last):
  File "/home/orangepi/work/mlc/bin/mlcr", line 8, in <module>
    sys.exit(mlcr())
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/main.py", line 86, in mlcr
    main()
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/main.py", line 173, in main
    res = method(run_args)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 121, in call_script_module_function
    result = automation_instance.run(run_args)  # Pass args to the run method
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 225, in run
    r = self._run(i)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 1769, in _run
    r = customize_code.preprocess(ii)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/run-mlperf-inference-app/customize.py", line 284, in preprocess
    r = mlc.access(ii)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/action.py", line 56, in access
    result = method(options)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 121, in call_script_module_function
    result = automation_instance.run(run_args)  # Pass args to the run method
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 225, in run
    r = self._run(i)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 1843, in _run
    r = self._call_run_deps(prehook_deps, self.local_env_keys, local_env_keys_from_meta, env, state, const, const_state, add_deps_recursive,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3293, in _call_run_deps
    r = script._run_deps(deps, local_env_keys, env, state, const, const_state, add_deps_recursive, recursion_spaces,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3463, in _run_deps
    r = self.action_object.access(ii)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/action.py", line 56, in access
    result = method(options)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 121, in call_script_module_function
    result = automation_instance.run(run_args)  # Pass args to the run method
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 225, in run
    r = self._run(i)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 1859, in _run
    r = prepare_and_run_script_with_postprocessing(
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 5260, in prepare_and_run_script_with_postprocessing
    r = script_automation._call_run_deps(posthook_deps, local_env_keys, local_env_keys_from_meta, env, state, const, const_state,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3293, in _call_run_deps
    r = script._run_deps(deps, local_env_keys, env, state, const, const_state, add_deps_recursive, recursion_spaces,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3463, in _run_deps
    r = self.action_object.access(ii)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/action.py", line 56, in access
    result = method(options)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 121, in call_script_module_function
    result = automation_instance.run(run_args)  # Pass args to the run method
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 225, in run
    r = self._run(i)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 1886, in _run
    r = self._run_deps(post_deps, clean_env_keys_post_deps, env, state, const, const_state, add_deps_recursive, recursion_spaces,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3463, in _run_deps
    r = self.action_object.access(ii)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/action.py", line 56, in access
    result = method(options)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 131, in call_script_module_function
    raise ScriptExecutionError(f"Script {function_name} execution failed. Error : {error}")
mlc.script_action.ScriptExecutionError: Script run execution failed. Error : MLC script failed (name = benchmark-program, return code = 34304)


^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Please file an issue at https://github.com/mlcommons/mlperf-automations/issues along with the full MLC command being run and the relevant
or full console log.

@sujik18 (Member) commented Mar 14, 2025

Hi @PapperYZ,
As per the logs, it seems the mlc script is using only 4 threads. Why do you think 8 threads are still being used?

I believe the script has failed due to an out-of-index error, which might have happened because of main memory constraints. So, running the script with a smaller batch size might help.

Here is the updated command. It will do a quick performance test run on a short dataset (500 images):

mlcr run-mlperf,inference,_find-performance,_short,_r5.0-dev \
   --model=resnet50 \
   --implementation=reference \
   --framework=onnxruntime \
   --category=edge \
   --scenario=Offline \
   --execution_mode=test \
   --device=cpu  \
   --quiet \
   --batch_size=8 \
   --threads=4 \
   --test_query_count=1000

@arjunsuresh (Collaborator)

"@arjunsuresh , would you help to make some adjust so that --threads=4 applies globally? "

Unfortunately, that's outside the scope of our automations. The automation code only passes the inputs to the respective implementations, and if an implementation does not support num_threads there is very little we can do. For example, support for num_threads for resnet50 with onnxruntime must be added here.
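For illustration only, here is a rough sketch of how a backend could forward such a thread count into ONNX Runtime via its session options. The helper name and wiring are assumptions, not the actual reference-implementation code, though the onnxruntime calls themselves are standard:

# Hypothetical sketch: wiring a --threads value into ONNX Runtime session options.
# make_session() is an illustrative helper, not a function from the reference code.
import onnxruntime as ort

def make_session(model_path, num_threads):
    opts = ort.SessionOptions()
    opts.intra_op_num_threads = num_threads  # threads used within a single operator
    opts.inter_op_num_threads = 1            # threads used to run operators in parallel
    return ort.InferenceSession(model_path, sess_options=opts,
                                providers=["CPUExecutionProvider"])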

@sujik18 Did either "--batch_size=8" or "--threads=4" work for you with the ResNet50 onnxruntime implementation?

@sujik18 (Member) commented Mar 15, 2025

@arjunsuresh Yes, it did work for me. My system only has 8 GB of RAM, and if I run the script with the default values for threads and batch_size, it usually fails with an out-of-index error.

@arjunsuresh (Collaborator)

@sujik18 What was the perf difference between running with batch size 1 and batch size 8?

And to see whether threads=4 took effect, you can monitor the output of htop.
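If htop is hard to read on a small board, a tiny psutil helper (a hypothetical script, not part of the MLC tooling; psutil is already installed by the workflow) can report the OS thread count of the running benchmark process for comparison against the --threads value:

# check_threads.py (hypothetical helper): count OS threads of the running
# MLPerf reference benchmark process.
import psutil

for proc in psutil.process_iter(["pid", "cmdline"]):
    cmdline = proc.info["cmdline"] or []
    if any("python/main.py" in arg for arg in cmdline):
        print(f"PID {proc.info['pid']}: {proc.num_threads()} OS threads")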

@sujik18 (Member) commented Mar 16, 2025

@arjunsuresh I haven't tried running the perf test with batch size 1; I stuck with batch size 8 instead of the default value of 32 for all perf evaluations.
I believe passing a value for the threads tag does work, because initially the script was utilising all 12 threads of my system and RAM utilisation was at its peak, which caused the execution to abort. Running the script with threads=8 fixed the issue, and RAM utilisation was also lower.

@arjunsuresh (Collaborator)

"Running the script with threads=8 fixed the issue, and RAM utilisation was also lower."

This could be from the preprocessing stage. You need to watch htop during the inference itself to confirm this.

@PapperYZ (Author)

Hi @sujik18, I tried your recommended command above (note that it is still being run with only 4 cores enabled); below are the error logs. Please take a look and advise... appreciate it!

***************************************************************************
CM script::benchmark-program/run.sh

Run Directory: /home/orangepi/MLC/repos/local/cache/get-git-repo_3c164293/inference/vision/classification_and_detection

CMD: ./run_local.sh onnxruntime resnet50 cpu --scenario Offline    --max-batchsize 8 --threads 4 --user_conf '/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/6acb7c4c9229491993f34d767996e2a3.conf' --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_236f9d1d --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_c7f93c2e/val.txt 2>&1 | tee '/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_c7cc8ae6/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1/console.out'; echo \${PIPESTATUS[0]} > exitstatus

[2025-03-17 12:00:41,773 module.py:5105 INFO] -          ! cd /home/orangepi/MLC/repos/local/cache/get-sys-utils-cm_f850c9c0
[2025-03-17 12:00:41,774 module.py:5106 INFO] -          ! call /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/benchmark-program/run-ubuntu.sh from tmp-run.sh

./run_local.sh onnxruntime resnet50 cpu --scenario Offline    --max-batchsize 8 --threads 4 --user_conf '/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/6acb7c4c9229491993f34d767996e2a3.conf' --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_236f9d1d --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_c7f93c2e/val.txt 2>&1 | tee '/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_c7cc8ae6/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1/console.out'; echo ${PIPESTATUS[0]} > exitstatus
python3 python/main.py --profile resnet50-onnxruntime --model "/home/orangepi/MLC/repos/local/cache/download-file_f6ae226d/resnet50_v1.onnx" --dataset-path /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_236f9d1d --output "/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_c7cc8ae6/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1" --scenario Offline --max-batchsize 8 --threads 4 --user_conf /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/6acb7c4c9229491993f34d767996e2a3.conf --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_236f9d1d --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_c7f93c2e/val.txt
INFO:main:Namespace(dataset='imagenet', dataset_path='/home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_236f9d1d', dataset_list='/home/orangepi/MLC/repos/local/cache/extract-file_c7f93c2e/val.txt', data_format=None, profile='resnet50-onnxruntime', scenario='Offline', max_batchsize=8, model='/home/orangepi/MLC/repos/local/cache/download-file_f6ae226d/resnet50_v1.onnx', output='/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_c7cc8ae6/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1', inputs=None, outputs=['ArgMax:0'], backend='onnxruntime', device=None, model_name='resnet50', threads=4, qps=None, cache=0, cache_dir='/home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_236f9d1d', preprocessed_dir=None, use_preprocessed_dataset=True, accuracy=False, find_peak_performance=False, debug=False, user_conf='/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/6acb7c4c9229491993f34d767996e2a3.conf', audit_conf='audit.config', time=None, count=None, performance_sample_count=None, max_latency=None, samples_per_query=8)
/opt/rh/gcc-toolset-14/root/usr/include/c++/14/bits/stl_vector.h:1130: std::vector<_Tp, _Alloc>::reference std::vector<_Tp, _Alloc>::operator[](size_type) [with _Tp = unsigned int; _Alloc = std::allocator<unsigned int>; reference = unsigned int&; size_type = long unsigned int]: Assertion '__n < this->size()' failed.
./run_local.sh: line 30: 60796 Aborted                 (core dumped) python3 python/main.py --profile resnet50-onnxruntime --model "/home/orangepi/MLC/repos/local/cache/download-file_f6ae226d/resnet50_v1.onnx" --dataset-path /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_236f9d1d --output "/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_c7cc8ae6/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1" --scenario Offline --max-batchsize 8 --threads 4 --user_conf /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/6acb7c4c9229491993f34d767996e2a3.conf --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_236f9d1d --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_c7f93c2e/val.txt
Traceback (most recent call last):
  File "/home/orangepi/work/mlc/bin/mlcr", line 8, in <module>
    sys.exit(mlcr())
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/main.py", line 86, in mlcr
    main()
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/main.py", line 173, in main
    res = method(run_args)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 121, in call_script_module_function
    result = automation_instance.run(run_args)  # Pass args to the run method
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 225, in run
    r = self._run(i)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 1776, in _run
    r = customize_code.preprocess(ii)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/run-mlperf-inference-app/customize.py", line 284, in preprocess
    r = mlc.access(ii)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/action.py", line 56, in access
    result = method(options)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 121, in call_script_module_function
    result = automation_instance.run(run_args)  # Pass args to the run method
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 225, in run
    r = self._run(i)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 1850, in _run
    r = self._call_run_deps(prehook_deps, self.local_env_keys, local_env_keys_from_meta, env, state, const, const_state, add_deps_recursive,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3300, in _call_run_deps
    r = script._run_deps(deps, local_env_keys, env, state, const, const_state, add_deps_recursive, recursion_spaces,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3470, in _run_deps
    r = self.action_object.access(ii)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/action.py", line 56, in access
    result = method(options)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 121, in call_script_module_function
    result = automation_instance.run(run_args)  # Pass args to the run method
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 225, in run
    r = self._run(i)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 1866, in _run
    r = prepare_and_run_script_with_postprocessing(
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 5259, in prepare_and_run_script_with_postprocessing
    r = script_automation._call_run_deps(posthook_deps, local_env_keys, local_env_keys_from_meta, env, state, const, const_state,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3300, in _call_run_deps
    r = script._run_deps(deps, local_env_keys, env, state, const, const_state, add_deps_recursive, recursion_spaces,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3470, in _run_deps
    r = self.action_object.access(ii)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/action.py", line 56, in access
    result = method(options)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 121, in call_script_module_function
    result = automation_instance.run(run_args)  # Pass args to the run method
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 225, in run
    r = self._run(i)
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 1893, in _run
    r = self._run_deps(post_deps, clean_env_keys_post_deps, env, state, const, const_state, add_deps_recursive, recursion_spaces,
  File "/home/orangepi/MLC/repos/mlcommons@mlperf-automations/automation/script/module.py", line 3470, in _run_deps
    r = self.action_object.access(ii)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/action.py", line 56, in access
    result = method(options)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 141, in run
    return self.call_script_module_function("run", run_args)
  File "/home/orangepi/work/mlc/lib/python3.10/site-packages/mlc/script_action.py", line 131, in call_script_module_function
    raise ScriptExecutionError(f"Script {function_name} execution failed. Error : {error}")
mlc.script_action.ScriptExecutionError: Script run execution failed. Error : MLC script failed (name = benchmark-program, return code = 34304)


^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Please file an issue at https://github.com/mlcommons/mlperf-automations/issues along with the full MLC command being run and the relevant
or full console log.

@sujik18 (Member) commented Mar 18, 2025

Hi @PapperYZ, can you please share your system details (OS, main memory size, available disk storage)? Also, since you mentioned earlier that your ASIC is similar to the Orange Pi system, I assume it only supports single-thread execution. As @arjunsuresh mentioned initially, the threads parameter value is not taken into account: I tried a test run with the threads=4 tag, but the htop stats made it clear that the system was still using all the threads available to it (12 in my case).
@PapperYZ, if possible, please rerun the script with the lowest batch_size value (1) and monitor the system stats with htop before and during the run, to see how many threads the script is actually using and whether RAM utilisation is reaching its peak.

@PapperYZ (Author)

Hi @sujik18
I am using an Orange Pi 5 Plus system with a Rockchip RK3588 CPU:

[screenshot]

I am running Ubuntu 22.04, and I turned off the 4 big cores (4-7) using the commands below (I guess you can turn off some of your 12 cores to mimic what I have done here):

echo 0 | sudo tee /sys/devices/system/cpu/cpu4/online
echo 0 | sudo tee /sys/devices/system/cpu/cpu5/online
echo 0 | sudo tee /sys/devices/system/cpu/cpu6/online
echo 0 | sudo tee /sys/devices/system/cpu/cpu7/online
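If useful, a quick check (illustrative only, not part of the benchmark) that the kernel now reports only CPUs 0-3 as online:

# Confirm which CPUs are online after taking cores 4-7 offline.
import os

with open("/sys/devices/system/cpu/online") as f:
    print("Online CPUs:", f.read().strip())  # expected: 0-3 after the commands above
print("os.cpu_count():", os.cpu_count())      # on Linux this usually reflects online CPUs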

[screenshot]

my htop is as below:
[screenshot]

@PapperYZ (Author)

Hi @sujik18, batch_size=1 does not help with the error; see the assertion message below. Do you have any thoughts about it?

INFO:main:Namespace(dataset='imagenet', dataset_path='/home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_236f9d1d', dataset_list='/home/orangepi/MLC/repos/local/cache/extract-file_c7f93c2e/val.txt', data_format=None, profile='resnet50-onnxruntime', scenario='Offline', max_batchsize=1, model='/home/orangepi/MLC/repos/local/cache/download-file_f6ae226d/resnet50_v1.onnx', output='/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_c7cc8ae6/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1', inputs=None, outputs=['ArgMax:0'], backend='onnxruntime', device=None, model_name='resnet50', threads=4, qps=None, cache=0, cache_dir='/home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_236f9d1d', preprocessed_dir=None, use_preprocessed_dataset=True, accuracy=False, find_peak_performance=False, debug=False, user_conf='/home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/316769885346468f8e8ff829aa739448.conf', audit_conf='audit.config', time=None, count=None, performance_sample_count=None, max_latency=None, samples_per_query=8)
/opt/rh/gcc-toolset-14/root/usr/include/c++/14/bits/stl_vector.h:1130: std::vector<_Tp, _Alloc>::reference std::vector<_Tp, _Alloc>::operator[](size_type) [with _Tp = unsigned int; _Alloc = std::allocator<unsigned int>; reference = unsigned int&; size_type = long unsigned int]: Assertion '__n < this->size()' failed.
./run_local.sh: line 30: 72243 Aborted (core dumped) python3 python/main.py --profile resnet50-onnxruntime --model "/home/orangepi/MLC/repos/local/cache/download-file_f6ae226d/resnet50_v1.onnx" --dataset-path /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_236f9d1d --output "/home/orangepi/MLC/repos/local/cache/get-mlperf-inference-results-dir_c7cc8ae6/test_results/orangepi5plus-reference-cpu-onnxruntime-v1.21.0-default_config/resnet50/offline/performance/run_1" --scenario Offline --max-batchsize 1 --threads 4 --user_conf /home/orangepi/MLC/repos/mlcommons@mlperf-automations/script/generate-mlperf-inference-user-conf/tmp/316769885346468f8e8ff829aa739448.conf --use_preprocessed_dataset --cache_dir /home/orangepi/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_236f9d1d --dataset-list /home/orangepi/MLC/repos/local/cache/extract-file_c7f93c2e/val.txt

@sujik18 (Member) commented Mar 20, 2025

@PapperYZ, lowering the batch_size is not required, as your system has enough RAM to support the default batch_size value, but I was curious why the main memory size shows as 31G; shouldn't it be 32G?
Regarding the error, I am not sure exactly what is causing it. I believe it might be due to an invalid cached dataset or model, so mlc rm cache --tags=imagenet,preprocessed -f helps to remove the cached dataset. Also, try mlc pull repo once to fetch the latest changes before running the script again.

@arjunsuresh Can you please look into this error?

@PapperYZ (Author) commented Mar 24, 2025

Hi @sujik18, removing the cached dataset does not seem to help... it generates the same error as shown above.

[screenshot]

Would you be able to turn off some of your cores using the commands I provided above and see whether you hit the same error? One more data point: when I turned those 4 A76 cores back online, the error immediately disappeared.

@sujik18 (Member) commented Mar 25, 2025

Hi @PapperYZ, I will try running the script after disabling half of my system's cores tonight or tomorrow. In the meantime, could you please check the number of threads being used after starting the script and just before it fails, as I asked previously? You can do this by observing the number of threads in the idle state, rerunning the script with the rerun tag to avoid using the cached state, and then noting how many new threads are created during the run.
I believe the threads tag is not being taken into account, so the system is still trying to use all the threads it thinks are available (i.e. 8). However, as you have turned off 4 cores, only 4 threads are actually available (I believe the Orange Pi does not support multithreading per core), which is causing a thread-management issue.

@PapperYZ (Author)

Hi @sujik18, if you look at the htop screen I shared above, once I disable the 4 A76 cores htop only shows 4 cores, so I am not sure how I would monitor the threads... though I can clearly see that preprocessing uses 4 threads and works fine; the error only shows up during the final step.

[screenshot]

@PapperYZ (Author)

@sujik18 please see below the htop state while preprocessing is ongoing:

[screenshot]

[screenshot]

@PapperYZ (Author)

@sujik18 @arjunsuresh, I believe the issue is now understood (with Grok's help); I submitted a ticket here:
microsoft/onnxruntime#24221
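For anyone who wants to isolate this outside the MLPerf harness, a minimal standalone check along these lines may help; it is an illustrative sketch, not the exact script attached to the ticket, the model path is taken from the log above, and the NCHW input layout is an assumption. It simply builds a plain ONNX Runtime CPU session with 4 intra-op threads and runs one dummy inference on the 4 remaining cores:

# Minimal standalone check (illustrative): build an ORT CPU session with 4 threads
# and run one dummy inference, outside of the MLPerf loadgen harness.
import numpy as np
import onnxruntime as ort

MODEL = "/home/orangepi/MLC/repos/local/cache/download-file_f6ae226d/resnet50_v1.onnx"

opts = ort.SessionOptions()
opts.intra_op_num_threads = 4
sess = ort.InferenceSession(MODEL, sess_options=opts, providers=["CPUExecutionProvider"])

x = np.zeros((1, 3, 224, 224), dtype=np.float32)  # NCHW layout assumed, matching the preprocessed dataset
inp = sess.get_inputs()[0].name
print(sess.run(None, {inp: x})[0].shape)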

@sujik18 (Member) commented Mar 28, 2025

Hi @PapperYZ, I tried running the benchmark test after disabling one of the CPU threads (my processor is a 4600H with 6 physical cores and 12 threads), but I was unable to reproduce this error. My test run completed successfully. Here is a screenshot of htop during execution.

[screenshot]

@PapperYZ (Author)

Thank you for the test. It would be great if you could disable one whole cluster instead of just one core...
Also, could you do a quick reproduction with the simple script I provided here: microsoft/onnxruntime#24221

[screenshot]
