In my Docker setup I already have the image file. However, when I execute the following command, it downloads everything again.
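For context, the kind of local-image check the automation performs (visible later in the log as `docker images -q ...`) can be sketched as below. `image_exists` is a hypothetical helper name for illustration, not part of mlcflow:

```shell
#!/bin/sh
# Hypothetical helper: succeeds (exit 0) when the given image tag is already
# known to the local Docker daemon, so a pull/build could be skipped.
image_exists() {
  # `docker images -q TAG` prints the image ID when the tag exists locally
  # and prints nothing otherwise.
  [ -n "$(docker images -q "$1" 2>/dev/null)" ]
}

# Usage sketch (substitute your own image tag):
# if image_exists "my-image:latest"; then
#   echo "image cached locally, skipping build"
# fi
```

Note that even when this check passes and the image is reused, steps that run *inside* the container (such as `mlc pull repo`) may still fetch data again.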
CSEMA@Masthan-PC MINGW64 /s/A/mlcflow (main) $ mlcr run-mlperf,inference,_find-performance,_full,_r5.0-dev --model=resnet50 --implementation=reference --framework=onnxruntime --category=edge --scenario=Offline --execution_mode=test --device=cpu --docker --quiet --test_query_count=1000
[2025-02-08 14:08:35,218 module.py:560 INFO] - * mlcr run-mlperf,inference,_find-performance,_full,_r5.0-dev
[2025-02-08 14:08:35,227 module.py:560 INFO] - * mlcr get,mlcommons,inference,src
[2025-02-08 14:08:35,231 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-mlperf-inference-src_0d343849\mlc-cached-state.json
[2025-02-08 14:08:35,237 module.py:560 INFO] - * mlcr get,mlperf,inference,results,dir,_version.r5.0-dev
[2025-02-08 14:08:35,240 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-mlperf-inference-results-dir_b14e6733\mlc-cached-state.json
[2025-02-08 14:08:35,243 module.py:560 INFO] - * mlcr install,pip-package,for-mlc-python,_package.tabulate
[2025-02-08 14:08:35,245 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\install-pip-package-for-mlc-python_abab6956\mlc-cached-state.json
[2025-02-08 14:08:35,248 module.py:560 INFO] - * mlcr get,mlperf,inference,utils
[2025-02-08 14:08:35,260 module.py:560 INFO] - * mlcr get,mlperf,inference,src
[2025-02-08 14:08:35,261 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-mlperf-inference-src_0d343849\mlc-cached-state.json
[2025-02-08 14:08:35,265 module.py:5487 INFO] - ! call "postprocess" from C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\get-mlperf-inference-utils\customize.py
Using MLCommons Inference source from C:\Users\CSEMA/MLC/repos\local\cache\get-git-repo_37fd06ba\inference
Running loadgen scenario: Offline and mode: performance [2025-02-08 14:08:35,354 module.py:560 INFO] - * mlcr build,dockerfile [2025-02-08 14:08:35,359 module.py:560 INFO] - * mlcr get,docker [2025-02-08 14:08:35,362 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-docker_9e55fd95\mlc-cached-state.json mlc pull repo && mlcr --tags=app,mlperf,inference,generic,_reference,_resnet50,_onnxruntime,_cpu,_test,_r5.0-dev_default,_offline --quiet=true --env.MLC_QUIET=yes --env.MLC_WINDOWS=yes --env.MLC_MLPERF_IMPLEMENTATION=reference --env.MLC_MLPERF_MODEL=resnet50 --env.MLC_MLPERF_RUN_STYLE=test --env.MLC_MLPERF_SKIP_SUBMISSION_GENERATION=False --env.MLC_DOCKER_PRIVILEGED_MODE=True --env.MLC_MLPERF_SUBMISSION_DIVISION=open --env.MLC_MLPERF_INFERENCE_TP_SIZE=1 --env.MLC_MLPERF_SUBMISSION_SYSTEM_TYPE=edge --env.MLC_MLPERF_DEVICE=cpu --env.MLC_MLPERF_USE_DOCKER=True --env.MLC_MLPERF_BACKEND=onnxruntime --env.MLC_MLPERF_LOADGEN_SCENARIO=Offline --env.MLC_TEST_QUERY_COUNT=1000 --env.MLC_MLPERF_FIND_PERFORMANCE_MODE=yes --env.MLC_MLPERF_LOADGEN_ALL_MODES=no --env.MLC_MLPERF_LOADGEN_MODE=performance --env.MLC_MLPERF_RESULT_PUSH_TO_GITHUB=False --env.MLC_MLPERF_SUBMISSION_GENERATION_STYLE=full --env.MLC_MLPERF_INFERENCE_VERSION=5.0-dev --env.MLC_RUN_MLPERF_INFERENCE_APP_DEFAULTS=r5.0-dev_default --env.MLC_MLPERF_SUBMISSION_CHECKER_VERSION=v5.0 --env.MLC_MLPERF_INFERENCE_SOURCE_VERSION=5.0.15 --env.MLC_MLPERF_LAST_RELEASE=v5.0 --env.+PYTHONPATH,=C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\get-mlperf-inference-utils --env.MLC_MLPERF_INFERENCE_RESULTS_VERSION=r5.0-dev --env.MLC_MODEL=resnet50 --env.MLC_MLPERF_LOADGEN_COMPLIANCE=no --env.MLC_MLPERF_LOADGEN_EXTRA_OPTIONS= --env.MLC_MLPERF_LOADGEN_SCENARIOS,=Offline --env.MLC_MLPERF_LOADGEN_MODES,=performance --env.MLC_OUTPUT_FOLDER_NAME=test_results --add_deps_recursive.coco2014-original.tags=_full --add_deps_recursive.coco2014-preprocessed.tags=_full 
--add_deps_recursive.imagenet-original.tags=_full --add_deps_recursive.imagenet-preprocessed.tags=_full --add_deps_recursive.openimages-original.tags=_full --add_deps_recursive.openimages-preprocessed.tags=_full --add_deps_recursive.openorca-original.tags=_full --add_deps_recursive.openorca-preprocessed.tags=_full --add_deps_recursive.coco2014-dataset.tags=_full --add_deps_recursive.igbh-dataset.tags=_full --add_deps_recursive.get-mlperf-inference-results-dir.tags=_version.r5.0-dev --add_deps_recursive.get-mlperf-inference-submission-dir.tags=_version.r5.0-dev --add_deps_recursive.mlperf-inference-nvidia-scratch-space.tags=_version.r5.0-dev --v=False --print_env=False --print_deps=False --dump_version_info=True --quiet Dockerfile written at C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\app-mlperf-inference\dockerfiles\ubuntu_22.04.Dockerfile [2025-02-08 14:08:35,440 docker.py:191 INFO] - Dockerfile generated at C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\app-mlperf-inference\dockerfiles\ubuntu_22.04.Dockerfile [2025-02-08 14:08:35,508 module.py:560 INFO] - * mlcr get,docker [2025-02-08 14:08:35,510 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-docker_9e55fd95\mlc-cached-state.json [2025-02-08 14:08:35,516 module.py:560 INFO] - * mlcr get,mlperf,inference,submission,dir,local,_version.r5.0-dev [2025-02-08 14:08:35,520 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-mlperf-inference-submission-dir_c73e05d3\mlc-cached-state.json [2025-02-08 14:08:35,525 module.py:560 INFO] - * mlcr run,docker,container [2025-02-08 14:08:35,531 module.py:560 INFO] - * mlcr get,docker [2025-02-08 14:08:35,532 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-docker_9e55fd95\mlc-cached-state.json
Checking existing Docker container:
docker ps --format "{{ .ID }}," --filter "ancestor=localhost/local/mlc-script-app-mlperf-inference-generic--reference--resnet50--onnxruntime--cpu--test--r5.0-dev-default--offline:ubuntu-22.04-latest" 2> nul
Checking Docker images:
docker images -q localhost/local/mlc-script-app-mlperf-inference-generic--reference--resnet50--onnxruntime--cpu--test--r5.0-dev-default--offline:ubuntu-22.04-latest 2> nul
Docker image exists with ID: f30ca6880cbb
[2025-02-08 14:08:36,009 module.py:560 INFO] - * mlcr get,docker
[2025-02-08 14:08:36,011 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-docker_9e55fd95\mlc-cached-state.json
[2025-02-08 14:08:36,013 module.py:5487 INFO] - ! call "postprocess" from C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\run-docker-container\customize.py
Container launch command:
docker run -it --entrypoint "" --shm-size=32gb --cap-add SYS_ADMIN --cap-add SYS_TIME --security-opt apparmor=unconfined --security-opt seccomp=unconfined --dns 8.8.8.8 --dns 8.8.4.4 -v "C:\Users\CSEMA\MLC\repos\local\cache\get-mlperf-inference-results-dir_b14e6733":/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733 -v "C:\Users\CSEMA\MLC\repos\local\cache\get-mlperf-inference-submission-dir_c73e05d3":/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-submission-dir_c73e05d3 localhost/local/mlc-script-app-mlperf-inference-generic--reference--resnet50--onnxruntime--cpu--test--r5.0-dev-default--offline:ubuntu-22.04-latest bash -c "(mlc pull repo && mlcr --tags=app,mlperf,inference,generic,_reference,_resnet50,_onnxruntime,_cpu,_test,_r5.0-dev_default,_offline --quiet=true --env.MLC_QUIET=yes --env.MLC_WINDOWS=yes --env.MLC_MLPERF_IMPLEMENTATION=reference --env.MLC_MLPERF_MODEL=resnet50 --env.MLC_MLPERF_RUN_STYLE=test --env.MLC_MLPERF_SKIP_SUBMISSION_GENERATION=False --env.MLC_DOCKER_PRIVILEGED_MODE=True --env.MLC_MLPERF_SUBMISSION_DIVISION=open --env.MLC_MLPERF_INFERENCE_TP_SIZE=1 --env.MLC_MLPERF_SUBMISSION_SYSTEM_TYPE=edge --env.MLC_MLPERF_DEVICE=cpu --env.MLC_MLPERF_USE_DOCKER=True --env.MLC_MLPERF_BACKEND=onnxruntime --env.MLC_MLPERF_LOADGEN_SCENARIO=Offline --env.MLC_TEST_QUERY_COUNT=1000 --env.MLC_MLPERF_FIND_PERFORMANCE_MODE=yes --env.MLC_MLPERF_LOADGEN_ALL_MODES=no --env.MLC_MLPERF_LOADGEN_MODE=performance --env.MLC_MLPERF_RESULT_PUSH_TO_GITHUB=False --env.MLC_MLPERF_SUBMISSION_GENERATION_STYLE=full --env.MLC_MLPERF_INFERENCE_VERSION=5.0-dev --env.MLC_RUN_MLPERF_INFERENCE_APP_DEFAULTS=r5.0-dev_default --env.MLC_MLPERF_SUBMISSION_CHECKER_VERSION=v5.0 --env.MLC_MLPERF_INFERENCE_SOURCE_VERSION=5.0.15 --env.MLC_MLPERF_LAST_RELEASE=v5.0 --env.+PYTHONPATH,=C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\get-mlperf-inference-utils --env.MLC_MLPERF_INFERENCE_RESULTS_VERSION=r5.0-dev --env.MLC_TMP_CURRENT_PATH=S:\A\mlcflow 
--env.MLC_TMP_CURRENT_SCRIPT_REPO_PATH=C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations --env.MLC_TMP_CURRENT_SCRIPT_REPO_PATH_WITH_PREFIX=C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations --env.MLC_TMP_CURRENT_SCRIPT_PATH=C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\run-mlperf-inference-app --env.MLC_TMP_PIP_VERSION_STRING= --env.MLC_MODEL=resnet50 --env.MLC_MLPERF_LOADGEN_COMPLIANCE=no --env.MLC_MLPERF_LOADGEN_EXTRA_OPTIONS= --env.MLC_MLPERF_LOADGEN_SCENARIOS,=Offline --env.MLC_MLPERF_LOADGEN_MODES,=performance --env.MLC_OUTPUT_FOLDER_NAME=test_results --add_deps_recursive.coco2014-original.tags=_full --add_deps_recursive.coco2014-preprocessed.tags=_full --add_deps_recursive.imagenet-original.tags=_full --add_deps_recursive.imagenet-preprocessed.tags=_full --add_deps_recursive.openimages-original.tags=_full --add_deps_recursive.openimages-preprocessed.tags=_full --add_deps_recursive.openorca-original.tags=_full --add_deps_recursive.openorca-preprocessed.tags=_full --add_deps_recursive.coco2014-dataset.tags=_full --add_deps_recursive.igbh-dataset.tags=_full --add_deps_recursive.get-mlperf-inference-results-dir.tags=_version.r5.0-dev --add_deps_recursive.get-mlperf-inference-submission-dir.tags=_version.r5.0-dev --add_deps_recursive.mlperf-inference-nvidia-scratch-space.tags=_version.r5.0-dev --v=False --print_env=False --print_deps=False --dump_version_info=True --env.MLC_MLPERF_INFERENCE_RESULTS_DIR=/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733 --env.OUTPUT_BASE_DIR=/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733 --env.MLC_MLPERF_INFERENCE_SUBMISSION_DIR=/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-submission-dir_c73e05d3/mlperf-inference-submission && bash ) || bash"
[2025-02-08 08:38:39,292 main.py:1017 INFO] - Repository mlperf-automations already exists at /home/mlcuser/MLC/repos/mlcommons@mlperf-automations. Pulling latest changes... remote: Enumerating objects: 74, done. remote: Counting objects: 100% (31/31), done. remote: Compressing objects: 100% (24/24), done. remote: Total 74 (delta 16), reused 7 (delta 7), pack-reused 43 (from 2) Unpacking objects: 100% (74/74), 114.08 KiB | 957.00 KiB/s, done. From https://github.com/mlcommons/mlperf-automations eae72d8..dce8947 dev -> origin/dev Updating eae72d8..dce8947 Fast-forward .github/workflows/test-mlc-script-features.yml | 4 +++- .github/workflows/test-mlperf-inference-abtf-poc.yml | 1 + .github/workflows/test-mlperf-inference-bert-deepsparse-tf-onnxruntime-pytorch.yml | 1 + .github/workflows/test-mlperf-inference-mlcommons-cpp-resnet50.yml | 1 + .github/workflows/test-mlperf-inference-rgat.yml | 3 ++- .github/workflows/test-mlperf-inference-tvm-resnet50.yml | 1 + .github/workflows/test-mlperf-loadgen-onnx-huggingface-bert-fp32-squad.yml | 2 +- .github/workflows/test-nvidia-mlperf-inference-implementations.yml | 2 +- automation/script/docker_utils.py | 2 +- automation/script/module.py | 18 ++---------------- script/app-mlperf-inference/customize.py | 2 +- script/app-mlperf-inference/meta.yaml | 5 +++-- script/generate-docs-for-all-scripts.cmd | 1 - script/generate-mlperf-inference-submission/meta.yaml | 2 +- script/get-dataset-igbh/customize.py | 2 +- script/get-dataset-igbh/meta.yaml | 3 +++ script/get-platform-details/meta.yaml | 1 + script/install-pip-package-for-cmind-python/README.md | 1 - script/{install-pip-package-for-cmind-python => install-pip-package-for-mlc-python}/COPYRIGHT.md | 0 script/{install-pip-package-for-cmind-python => install-pip-package-for-mlc-python}/customize.py | 0 script/{install-pip-package-for-cmind-python => install-pip-package-for-mlc-python}/meta.yaml | 6 +++--- script/run-docker-container/customize.py | 6 +++++- 
script/run-docker-container/meta.yaml | 1 + script/run-mlperf-automotive-app/meta.yaml | 2 +- script/run-mlperf-inference-app/meta.yaml | 4 +--- script/run-mlperf-inference-submission-checker/customize.py | 2 -- script/run-mlperf-inference-submission-checker/meta.yaml | 10 +++++++--- script/submit-mlperf-results/customize.py | 18 ++++++++++++++---- script/tar-my-folder/customize.py | 1 + script/tar-my-folder/meta.yaml | 2 ++ 30 files changed, 59 insertions(+), 45 deletions(-) delete mode 100644 script/generate-docs-for-all-scripts.cmd delete mode 100644 script/install-pip-package-for-cmind-python/README.md rename script/{install-pip-package-for-cmind-python => install-pip-package-for-mlc-python}/COPYRIGHT.md (100%) rename script/{install-pip-package-for-cmind-python => install-pip-package-for-mlc-python}/customize.py (100%) rename script/{install-pip-package-for-cmind-python => install-pip-package-for-mlc-python}/meta.yaml (71%) [2025-02-08 08:38:46,147 main.py:1025 INFO] - Repository successfully pulled. [2025-02-08 08:38:46,148 main.py:1026 INFO] - Registering the repo in repos.json [2025-02-08 08:38:46,148 main.py:1046 INFO] - No changes made to repos.json. 
[2025-02-08 08:38:48,397 module.py:560 INFO] - * mlcr app,mlperf,inference,generic,_reference,_resnet50,_onnxruntime,_cpu,_test,_r5.0-dev_default,_offline [2025-02-08 08:38:48,397 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,398 module.py:816 DEBUG] - - Found script::app-mlperf-inference, d775cac873ee4231 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/app-mlperf-inference [2025-02-08 08:38:48,398 module.py:2412 DEBUG] - Prepared variations: _reference,_resnet50,_onnxruntime,_cpu,_test,_r5.0-dev_default,_offline,_float32 [2025-02-08 08:38:48,400 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts: [2025-02-08 08:38:48,408 module.py:560 INFO] - * mlcr detect,os [2025-02-08 08:38:48,408 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,408 module.py:816 DEBUG] - - Found script::detect-os, 863735b7db8c44fc in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os [2025-02-08 08:38:48,411 module.py:1759 DEBUG] - - Running preprocess ... [2025-02-08 08:38:48,414 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh" from temporal script "tmp-run.sh" in "/home/mlcuser" ... [2025-02-08 08:38:48,414 module.py:5340 INFO] - ! cd /home/mlcuser [2025-02-08 08:38:48,414 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh [2025-02-08 08:38:48,442 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py [2025-02-08 08:38:48,442 module.py:5550 DEBUG] - - Running postprocess ... [2025-02-08 08:38:48,459 module.py:2192 INFO] - - running time of script "detect-os,detect,os,info": 0.06 sec. 
[2025-02-08 08:38:48,465 module.py:560 INFO] - * mlcr get,sys-utils-cm [2025-02-08 08:38:48,465 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,465 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,sys-utils-cm [2025-02-08 08:38:48,465 module.py:667 DEBUG] - - Number of cached script outputs found: 1 [2025-02-08 08:38:48,465 module.py:816 DEBUG] - - Found script::get-sys-utils-cm, bc90993277e84b8e in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-sys-utils-cm [2025-02-08 08:38:48,466 module.py:4823 DEBUG] - - Checking if script execution is already cached ... [2025-02-08 08:38:48,466 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,sys-utils-cm,sys-utils-mlc [2025-02-08 08:38:48,467 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-sys-utils-cm_edc7d689 [2025-02-08 08:38:48,467 module.py:1228 DEBUG] - - Checking dynamic dependencies on other MLC scripts: [2025-02-08 08:38:48,467 module.py:1238 DEBUG] - - Processing env after dependencies ... [2025-02-08 08:38:48,467 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:48,467 module.py:1261 DEBUG] - - Loading state from cached entry ... [2025-02-08 08:38:48,468 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-sys-utils-cm_edc7d689/mlc-cached-state.json [2025-02-08 08:38:48,468 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts: [2025-02-08 08:38:48,468 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts: [2025-02-08 08:38:48,468 module.py:2192 INFO] - - running time of script "get,sys-utils-cm,sys-utils-mlc": 0.01 sec. 
[2025-02-08 08:38:48,477 module.py:560 INFO] - * mlcr get,python [2025-02-08 08:38:48,477 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,477 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python [2025-02-08 08:38:48,477 module.py:667 DEBUG] - - Number of cached script outputs found: 1 [2025-02-08 08:38:48,477 module.py:816 DEBUG] - - Found script::get-python3, d0b5dd74373f4a62 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-python3 [2025-02-08 08:38:48,479 module.py:4823 DEBUG] - - Checking if script execution is already cached ... [2025-02-08 08:38:48,480 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python,python3,get-python,get-python3 [2025-02-08 08:38:48,480 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c [2025-02-08 08:38:48,480 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:48,480 module.py:1261 DEBUG] - - Loading state from cached entry ... [2025-02-08 08:38:48,481 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c/mlc-cached-state.json [2025-02-08 08:38:48,481 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts: [2025-02-08 08:38:48,481 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts: [2025-02-08 08:38:48,481 module.py:2192 INFO] - - running time of script "get,python,python3,get-python,get-python3": 0.01 sec. 
[2025-02-08 08:38:48,482 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3 [2025-02-08 08:38:48,482 module.py:2220 INFO] - Python version: 3.10.12 [2025-02-08 08:38:48,499 module.py:560 INFO] - * mlcr get,mlcommons,inference,src [2025-02-08 08:38:48,499 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,499 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,mlcommons,inference,src [2025-02-08 08:38:48,499 module.py:667 DEBUG] - - Number of cached script outputs found: 2 [2025-02-08 08:38:48,500 module.py:816 DEBUG] - - Found script::get-mlperf-inference-src, 4b57186581024797 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-src [2025-02-08 08:38:48,500 module.py:2412 DEBUG] - Prepared variations: _short-history [2025-02-08 08:38:48,500 module.py:970 DEBUG] - - Requested version: == r5.0 [2025-02-08 08:38:48,502 module.py:4823 DEBUG] - - Checking if script execution is already cached ... [2025-02-08 08:38:48,502 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,mlcommons,inference,src,source,inference-src,inference-source,mlperf,version-r5.0 [2025-02-08 08:38:48,502 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-src_0d0695b9 [2025-02-08 08:38:48,502 module.py:1228 DEBUG] - - Checking dynamic dependencies on other MLC scripts: [2025-02-08 08:38:48,502 module.py:1238 DEBUG] - - Processing env after dependencies ... [2025-02-08 08:38:48,502 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:48,502 module.py:1261 DEBUG] - - Loading state from cached entry ... [2025-02-08 08:38:48,504 module.py:1274 INFO] - ! 
load /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-src_0d0695b9/mlc-cached-state.json [2025-02-08 08:38:48,504 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts: [2025-02-08 08:38:48,504 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts: [2025-02-08 08:38:48,504 module.py:2192 INFO] - - running time of script "get,src,source,inference,inference-src,inference-source,mlperf,mlcommons": 0.02 sec. [2025-02-08 08:38:48,510 module.py:560 INFO] - * mlcr get,mlperf,inference,utils [2025-02-08 08:38:48,511 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,511 module.py:816 DEBUG] - - Found script::get-mlperf-inference-utils, e341e5f86d8342e5 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-utils [2025-02-08 08:38:48,512 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts: [2025-02-08 08:38:48,529 module.py:560 INFO] - * mlcr get,mlperf,inference,src [2025-02-08 08:38:48,529 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,529 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,mlperf,inference,src [2025-02-08 08:38:48,530 module.py:667 DEBUG] - - Number of cached script outputs found: 2 [2025-02-08 08:38:48,530 module.py:816 DEBUG] - - Found script::get-mlperf-inference-src, 4b57186581024797 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-src [2025-02-08 08:38:48,530 module.py:2412 DEBUG] - Prepared variations: _short-history [2025-02-08 08:38:48,530 module.py:970 DEBUG] - - Requested version: == r5.0 [2025-02-08 08:38:48,531 module.py:4823 DEBUG] - - Checking if script execution is already cached ... 
[2025-02-08 08:38:48,531 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,mlperf,inference,src,source,inference-src,inference-source,mlcommons,version-r5.0 [2025-02-08 08:38:48,532 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-src_0d0695b9 [2025-02-08 08:38:48,532 module.py:1228 DEBUG] - - Checking dynamic dependencies on other MLC scripts: [2025-02-08 08:38:48,532 module.py:1238 DEBUG] - - Processing env after dependencies ... [2025-02-08 08:38:48,533 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:48,533 module.py:1261 DEBUG] - - Loading state from cached entry ... [2025-02-08 08:38:48,533 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-src_0d0695b9/mlc-cached-state.json [2025-02-08 08:38:48,534 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts: [2025-02-08 08:38:48,534 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts: [2025-02-08 08:38:48,534 module.py:2192 INFO] - - running time of script "get,src,source,inference,inference-src,inference-source,mlperf,mlcommons": 0.02 sec. [2025-02-08 08:38:48,535 module.py:1637 DEBUG] - - Processing env after dependencies ... [2025-02-08 08:38:48,536 module.py:1759 DEBUG] - - Running preprocess ... [2025-02-08 08:38:48,540 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-utils/customize.py [2025-02-08 08:38:48,540 module.py:5550 DEBUG] - - Running postprocess ... [2025-02-08 08:38:48,543 module.py:2192 INFO] - - running time of script "get,mlperf,inference,util,utils,functions": 0.04 sec. 
[2025-02-08 08:38:48,552 module.py:560 INFO] - * mlcr get,dataset-aux,imagenet-aux [2025-02-08 08:38:48,552 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,552 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,dataset-aux,imagenet-aux [2025-02-08 08:38:48,552 module.py:667 DEBUG] - - Number of cached script outputs found: 1 [2025-02-08 08:38:48,552 module.py:816 DEBUG] - - Found script::get-dataset-imagenet-aux, bb2c6dd8c8c64217 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-dataset-imagenet-aux [2025-02-08 08:38:48,553 module.py:2412 DEBUG] - Prepared variations: _from.berkeleyvision,_2012 [2025-02-08 08:38:48,553 module.py:4823 DEBUG] - - Checking if script execution is already cached ... [2025-02-08 08:38:48,553 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,dataset-aux,imagenet-aux,aux,image-classification [2025-02-08 08:38:48,554 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-aux_8c5cefe0 [2025-02-08 08:38:48,554 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:48,554 module.py:1261 DEBUG] - - Loading state from cached entry ... [2025-02-08 08:38:48,555 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-aux_8c5cefe0/mlc-cached-state.json [2025-02-08 08:38:48,555 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts: [2025-02-08 08:38:48,555 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts: [2025-02-08 08:38:48,555 module.py:2192 INFO] - - running time of script "get,aux,dataset-aux,image-classification,imagenet-aux": 0.01 sec. [2025-02-08 08:38:48,556 module.py:1637 DEBUG] - - Processing env after dependencies ... [2025-02-08 08:38:48,567 module.py:1759 DEBUG] - - Running preprocess ... 
[2025-02-08 08:38:48,574 module.py:1834 DEBUG] - { "+MLC_HOST_OS_DEFAULT_LIBRARY_PATH": [ "/usr/local/lib/x86_64-linux-gnu", "/lib/x86_64-linux-gnu", "/usr/lib/x86_64-linux-gnu", "/usr/lib/x86_64-linux-gnu64", "/usr/local/lib64", "/lib64", "/usr/lib64", "/usr/local/lib", "/lib", "/usr/lib", "/usr/x86_64-linux-gnu/lib64", "/usr/x86_64-linux-gnu/lib" ], "+PYTHONPATH": [ "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/vision/classification_and_detection/python", "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/tools/submission", "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-utils" ], "MLC_CNNDM_ACCURACY_DTYPE": "int32", "MLC_DATASET_AUX_PATH": "/home/mlcuser/MLC/repos/local/cache/extract-file_5e1e9382", "MLC_DATASET_AUX_VER": "2012", "MLC_DOCKER_PRIVILEGED_MODE": "True", "MLC_ENV_NVMITTEN_DOCKER_WHEEL_PATH": "/opt/nvmitten-0.1.3b0-cp38-cp38-linux_x86_64.whl", "MLC_GET_PLATFORM_DETAILS": true, "MLC_HOST_OS_BITS": "64", "MLC_HOST_OS_FLAVOR": "ubuntu", "MLC_HOST_OS_FLAVOR_LIKE": "debian", "MLC_HOST_OS_GLIBC_VERSION": "2.35", "MLC_HOST_OS_KERNEL_VERSION": "5.15.167.4-microsoft-standard-WSL2", "MLC_HOST_OS_MACHINE": "x86_64", "MLC_HOST_OS_PACKAGE_MANAGER": "apt", "MLC_HOST_OS_PACKAGE_MANAGER_INSTALL_CMD": "DEBIAN_FRONTEND=noninteractive apt-get install -y", "MLC_HOST_OS_PACKAGE_MANAGER_UPDATE_CMD": "apt-get update -y", "MLC_HOST_OS_TYPE": "linux", "MLC_HOST_OS_VERSION": "22.04", "MLC_HOST_PLATFORM_FLAVOR": "x86_64", "MLC_HOST_PYTHON_BITS": "64", "MLC_HOST_SYSTEM_NAME": "e85017639ed6", "MLC_IMAGENET_ACCURACY_DTYPE": "float32", "MLC_LIBRISPEECH_ACCURACY_DTYPE": "float32", "MLC_MLPERF_BACKEND": "onnxruntime", "MLC_MLPERF_DEVICE": "cpu", "MLC_MLPERF_FIND_PERFORMANCE_MODE": "yes", "MLC_MLPERF_IMPLEMENTATION": "mlcommons_python", "MLC_MLPERF_INFERENCE_3DUNET_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/vision/medical_imaging/3d-unet-kits19", 
"MLC_MLPERF_INFERENCE_BERT_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/language/bert", "MLC_MLPERF_INFERENCE_CLASSIFICATION_AND_DETECTION_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/vision/classification_and_detection", "MLC_MLPERF_INFERENCE_CONF_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/mlperf.conf", "MLC_MLPERF_INFERENCE_DLRM_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/recommendation/dlrm", "MLC_MLPERF_INFERENCE_DLRM_V2_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/recommendation/dlrm_v2", "MLC_MLPERF_INFERENCE_GPTJ_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/language/gpt-j", "MLC_MLPERF_INFERENCE_POINTPAINTING_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/automotive/3d-object-detection", "MLC_MLPERF_INFERENCE_RESULTS_DIR": "/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733", "MLC_MLPERF_INFERENCE_RESULTS_VERSION": "r5.0-dev", "MLC_MLPERF_INFERENCE_RGAT_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/graph/R-GAT", "MLC_MLPERF_INFERENCE_RNNT_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/speech_recognition/rnnt", "MLC_MLPERF_INFERENCE_SOURCE": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference", "MLC_MLPERF_INFERENCE_SOURCE_VERSION": "5.0.15", "MLC_MLPERF_INFERENCE_SUBMISSION_DIR": "/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-submission-dir_c73e05d3/mlperf-inference-submission", "MLC_MLPERF_INFERENCE_TP_SIZE": "1", "MLC_MLPERF_INFERENCE_VERSION": "5.0-dev", "MLC_MLPERF_INFERENCE_VISION_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/vision", "MLC_MLPERF_LAST_RELEASE": "v5.0", "MLC_MLPERF_LOADGEN_ALL_MODES": "no", "MLC_MLPERF_LOADGEN_COMPLIANCE": "no", "MLC_MLPERF_LOADGEN_EXTRA_OPTIONS": "", 
"MLC_MLPERF_LOADGEN_MODE": "performance", "MLC_MLPERF_LOADGEN_SCENARIO": "Offline", "MLC_MLPERF_MODEL": "resnet50", "MLC_MLPERF_MODEL_EQUAL_ISSUE_MODE": "no", "MLC_MLPERF_MODEL_PRECISION": "float32", "MLC_MLPERF_PRINT_SUMMARY": "no", "MLC_MLPERF_PYTHON": "yes", "MLC_MLPERF_QUANTIZATION": false, "MLC_MLPERF_RESULT_PUSH_TO_GITHUB": "False", "MLC_MLPERF_RUN_STYLE": "test", "MLC_MLPERF_SKIP_SUBMISSION_GENERATION": "False", "MLC_MLPERF_SUBMISSION_CHECKER_VERSION": "v5.0", "MLC_MLPERF_SUBMISSION_DIVISION": "open", "MLC_MLPERF_SUBMISSION_GENERATION_STYLE": "full", "MLC_MLPERF_SUBMISSION_SYSTEM_TYPE": "edge", "MLC_MLPERF_USE_DOCKER": "True", "MLC_MODEL": "resnet50", "MLC_OPENIMAGES_ACCURACY_DTYPE": "float32", "MLC_OUTPUT_FOLDER_NAME": "test_results", "MLC_PYTHON_BIN": "python3", "MLC_PYTHON_BIN_PATH": "/home/mlcuser/venv/mlc/bin", "MLC_PYTHON_BIN_WITH_PATH": "/home/mlcuser/venv/mlc/bin/python3", "MLC_PYTHON_CACHE_TAGS": "version-3.10.12,non-virtual", "MLC_PYTHON_MAJOR_VERSION": "3", "MLC_PYTHON_MINOR_VERSION": "10", "MLC_PYTHON_PATCH_VERSION": "12", "MLC_PYTHON_VERSION": "3.10.12", "MLC_QUIET": "yes", "MLC_REGENERATE_MEASURE_FILES": "yes", "MLC_RUN_MLPERF_INFERENCE_APP_DEFAULTS": "r5.0-dev_default", "MLC_SKIP_SYS_UTILS": "yes", "MLC_SQUAD_ACCURACY_DTYPE": "float32", "MLC_TEST_QUERY_COUNT": "1000", "MLC_TMP_CURRENT_PATH": "/home/mlcuser", "MLC_TMP_CURRENT_SCRIPT_PATH": "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/app-mlperf-inference", "MLC_TMP_CURRENT_SCRIPT_REPO_PATH": "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations", "MLC_TMP_CURRENT_SCRIPT_REPO_PATH_WITH_PREFIX": "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations", "MLC_TMP_PIP_VERSION_STRING": "", "MLC_VERBOSE": "yes", "MLC_WINDOWS": "yes", "OUTPUT_BASE_DIR": "/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733" } [2025-02-08 08:38:48,574 module.py:1838 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:48,681 module.py:560 INFO] - * mlcr 
app,mlperf,reference,inference,_resnet50,_onnxruntime,_cpu,_offline,_fp32 [2025-02-08 08:38:48,682 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,682 module.py:816 DEBUG] - - Found script::app-mlperf-inference-mlcommons-python, ff149e9781fc4b65 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/app-mlperf-inference-mlcommons-python [2025-02-08 08:38:48,682 module.py:2412 DEBUG] - Prepared variations: _resnet50,_onnxruntime,_cpu,_offline,_fp32,_python [2025-02-08 08:38:48,685 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts: [2025-02-08 08:38:48,691 module.py:560 INFO] - * mlcr detect,os [2025-02-08 08:38:48,692 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,692 module.py:816 DEBUG] - - Found script::detect-os, 863735b7db8c44fc in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os [2025-02-08 08:38:48,693 module.py:1759 DEBUG] - - Running preprocess ... [2025-02-08 08:38:48,696 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh" from temporal script "tmp-run.sh" in "/home/mlcuser" ... [2025-02-08 08:38:48,696 module.py:5340 INFO] - ! cd /home/mlcuser [2025-02-08 08:38:48,696 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh [2025-02-08 08:38:48,712 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py [2025-02-08 08:38:48,712 module.py:5550 DEBUG] - - Running postprocess ... [2025-02-08 08:38:48,718 module.py:2192 INFO] - - running time of script "detect-os,detect,os,info": 0.03 sec. 
[2025-02-08 08:38:48,723 module.py:560 INFO] - * mlcr detect,cpu [2025-02-08 08:38:48,723 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,724 module.py:816 DEBUG] - - Found script::detect-cpu, 586c8a43320142f7 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu [2025-02-08 08:38:48,725 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts: [2025-02-08 08:38:48,730 module.py:560 INFO] - * mlcr detect,os [2025-02-08 08:38:48,731 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,731 module.py:816 DEBUG] - - Found script::detect-os, 863735b7db8c44fc in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os [2025-02-08 08:38:48,732 module.py:1759 DEBUG] - - Running preprocess ... [2025-02-08 08:38:48,735 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh" from temporal script "tmp-run.sh" in "/home/mlcuser" ... [2025-02-08 08:38:48,736 module.py:5340 INFO] - ! cd /home/mlcuser [2025-02-08 08:38:48,736 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh [2025-02-08 08:38:48,750 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py [2025-02-08 08:38:48,750 module.py:5550 DEBUG] - - Running postprocess ... [2025-02-08 08:38:48,757 module.py:2192 INFO] - - running time of script "detect-os,detect,os,info": 0.03 sec. [2025-02-08 08:38:48,758 module.py:1637 DEBUG] - - Processing env after dependencies ... [2025-02-08 08:38:48,758 module.py:1759 DEBUG] - - Running preprocess ... [2025-02-08 08:38:48,762 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh" from temporal script "tmp-run.sh" in "/home/mlcuser" ... [2025-02-08 08:38:48,762 module.py:5340 INFO] - ! 
cd /home/mlcuser [2025-02-08 08:38:48,762 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh [2025-02-08 08:38:48,809 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py [2025-02-08 08:38:48,809 module.py:5550 DEBUG] - - Running postprocess ... [2025-02-08 08:38:48,813 module.py:2192 INFO] - - running time of script "detect,cpu,detect-cpu,info": 0.09 sec. [2025-02-08 08:38:48,820 module.py:560 INFO] - * mlcr get,sys-utils-cm [2025-02-08 08:38:48,820 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,820 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,sys-utils-cm [2025-02-08 08:38:48,821 module.py:667 DEBUG] - - Number of cached script outputs found: 1 [2025-02-08 08:38:48,821 module.py:816 DEBUG] - - Found script::get-sys-utils-cm, bc90993277e84b8e in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-sys-utils-cm [2025-02-08 08:38:48,822 module.py:4823 DEBUG] - - Checking if script execution is already cached ... [2025-02-08 08:38:48,822 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,sys-utils-cm,sys-utils-mlc [2025-02-08 08:38:48,823 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-sys-utils-cm_edc7d689 [2025-02-08 08:38:48,823 module.py:1228 DEBUG] - - Checking dynamic dependencies on other MLC scripts: [2025-02-08 08:38:48,823 module.py:1238 DEBUG] - - Processing env after dependencies ... [2025-02-08 08:38:48,824 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:48,824 module.py:1261 DEBUG] - - Loading state from cached entry ... [2025-02-08 08:38:48,824 module.py:1274 INFO] - ! 
load /home/mlcuser/MLC/repos/local/cache/get-sys-utils-cm_edc7d689/mlc-cached-state.json [2025-02-08 08:38:48,824 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts: [2025-02-08 08:38:48,825 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts: [2025-02-08 08:38:48,825 module.py:2192 INFO] - - running time of script "get,sys-utils-cm,sys-utils-mlc": 0.01 sec. [2025-02-08 08:38:48,834 module.py:560 INFO] - * mlcr get,python [2025-02-08 08:38:48,834 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,834 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python [2025-02-08 08:38:48,834 module.py:667 DEBUG] - - Number of cached script outputs found: 1 [2025-02-08 08:38:48,834 module.py:816 DEBUG] - - Found script::get-python3, d0b5dd74373f4a62 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-python3 [2025-02-08 08:38:48,835 module.py:4823 DEBUG] - - Checking if script execution is already cached ... [2025-02-08 08:38:48,835 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python,python3,get-python,get-python3 [2025-02-08 08:38:48,836 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c [2025-02-08 08:38:48,836 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:48,836 module.py:1261 DEBUG] - - Loading state from cached entry ... [2025-02-08 08:38:48,836 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c/mlc-cached-state.json [2025-02-08 08:38:48,836 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts: [2025-02-08 08:38:48,836 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts: [2025-02-08 08:38:48,837 module.py:2192 INFO] - - running time of script "get,python,python3,get-python,get-python3": 0.01 sec. 
[2025-02-08 08:38:48,837 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3 [2025-02-08 08:38:48,837 module.py:2220 INFO] - Python version: 3.10.12 [2025-02-08 08:38:48,895 module.py:560 INFO] - * mlcr get,generic-python-lib,_onnxruntime [2025-02-08 08:38:48,895 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,895 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,generic-python-lib,_onnxruntime [2025-02-08 08:38:48,895 module.py:667 DEBUG] - - Number of cached script outputs found: 1 [2025-02-08 08:38:48,895 module.py:816 DEBUG] - - Found script::get-generic-python-lib, 94b62a682bc44791 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib [2025-02-08 08:38:48,896 module.py:2412 DEBUG] - Prepared variations: _onnxruntime [2025-02-08 08:38:48,898 module.py:4823 DEBUG] - - Checking if script execution is already cached ... [2025-02-08 08:38:48,899 module.py:4856 DEBUG] - - Prepared explicit variations: _onnxruntime [2025-02-08 08:38:48,899 module.py:4875 DEBUG] - - Prepared variations: _onnxruntime [2025-02-08 08:38:48,899 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,generic-python-lib,install,generic,pip-package,_onnxruntime,deps-python-version-3.10.12,deps-python-non-virtual [2025-02-08 08:38:48,907 module.py:560 INFO] - * mlcr get,python3 [2025-02-08 08:38:48,907 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:48,907 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python3 [2025-02-08 08:38:48,907 module.py:667 DEBUG] - - Number of cached script outputs found: 1 [2025-02-08 08:38:48,907 module.py:816 DEBUG] - - Found script::get-python3, d0b5dd74373f4a62 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-python3 [2025-02-08 08:38:48,908 module.py:4823 DEBUG] - - Checking if script execution is already cached 
... [2025-02-08 08:38:48,908 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python3,python,get-python,get-python3 [2025-02-08 08:38:48,908 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c [2025-02-08 08:38:48,908 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:48,908 module.py:1261 DEBUG] - - Loading state from cached entry ... [2025-02-08 08:38:48,908 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c/mlc-cached-state.json [2025-02-08 08:38:48,909 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts: [2025-02-08 08:38:48,909 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts: [2025-02-08 08:38:48,909 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3 [2025-02-08 08:38:48,909 module.py:2220 INFO] - Python version: 3.10.12 [2025-02-08 08:38:48,909 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh" from temporal script "tmp-run.sh" in "/home/mlcuser" ... [2025-02-08 08:38:48,909 module.py:5340 INFO] - ! cd /home/mlcuser [2025-02-08 08:38:48,909 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh [2025-02-08 08:38:48,988 module.py:5487 INFO] - ! 
call "detect_version" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py Detected version: 1.20.1 [2025-02-08 08:38:48,993 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-generic-python-lib_77cd7fef [2025-02-08 08:38:48,993 module.py:1228 DEBUG] - - Checking dynamic dependencies on other MLC scripts: [2025-02-08 08:38:49,001 module.py:560 INFO] - * mlcr get,python3 [2025-02-08 08:38:49,001 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:49,001 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python3 [2025-02-08 08:38:49,002 module.py:667 DEBUG] - - Number of cached script outputs found: 1 [2025-02-08 08:38:49,002 module.py:816 DEBUG] - - Found script::get-python3, d0b5dd74373f4a62 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-python3 [2025-02-08 08:38:49,004 module.py:4823 DEBUG] - - Checking if script execution is already cached ... [2025-02-08 08:38:49,004 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python3,python,get-python,get-python3 [2025-02-08 08:38:49,005 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c [2025-02-08 08:38:49,005 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:49,005 module.py:1261 DEBUG] - - Loading state from cached entry ... [2025-02-08 08:38:49,005 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c/mlc-cached-state.json [2025-02-08 08:38:49,005 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts: [2025-02-08 08:38:49,005 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts: [2025-02-08 08:38:49,006 module.py:2192 INFO] - - running time of script "get,python,python3,get-python,get-python3": 0.01 sec. 
[2025-02-08 08:38:49,006 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3 [2025-02-08 08:38:49,006 module.py:2220 INFO] - Python version: 3.10.12 [2025-02-08 08:38:49,007 module.py:1238 DEBUG] - - Processing env after dependencies ... [2025-02-08 08:38:49,007 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:49,007 module.py:1261 DEBUG] - - Loading state from cached entry ... [2025-02-08 08:38:49,008 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-generic-python-lib_77cd7fef/mlc-cached-state.json [2025-02-08 08:38:49,008 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts: [2025-02-08 08:38:49,008 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts: [2025-02-08 08:38:49,008 module.py:2192 INFO] - - running time of script "get,install,generic,pip-package,generic-python-lib": 0.17 sec. [2025-02-08 08:38:49,033 module.py:560 INFO] - * mlcr get,ml-model,image-classification,resnet50,raw,_onnx,_fp32 [2025-02-08 08:38:49,033 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:49,033 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,ml-model,image-classification,resnet50,raw,_onnx,_fp32 [2025-02-08 08:38:49,033 module.py:667 DEBUG] - - Number of cached script outputs found: 1 [2025-02-08 08:38:49,033 module.py:816 DEBUG] - - Found script::get-ml-model-resnet50, 56203e4e998b4bc0 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-ml-model-resnet50 [2025-02-08 08:38:49,034 module.py:2412 DEBUG] - Prepared variations: _onnx,_fp32,_opset-11,_argmax [2025-02-08 08:38:49,037 module.py:4823 DEBUG] - - Checking if script execution is already cached ... 
[2025-02-08 08:38:49,037 module.py:4856 DEBUG] - - Prepared explicit variations: _onnx,_fp32 [2025-02-08 08:38:49,037 module.py:4875 DEBUG] - - Prepared variations: _onnx,_fp32 [2025-02-08 08:38:49,037 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,ml-model,image-classification,resnet50,raw,ml-model-resnet50,_onnx,_fp32 [2025-02-08 08:38:49,038 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-ml-model-resnet50_3edcd59c [2025-02-08 08:38:49,038 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:49,039 module.py:1261 DEBUG] - - Loading state from cached entry ... [2025-02-08 08:38:49,040 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-ml-model-resnet50_3edcd59c/mlc-cached-state.json [2025-02-08 08:38:49,040 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts: [2025-02-08 08:38:49,040 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts: [2025-02-08 08:38:49,041 module.py:2192 INFO] - - running time of script "get,raw,ml-model,resnet50,ml-model-resnet50,image-classification": 0.03 sec. 
[2025-02-08 08:38:49,041 module.py:2220 INFO] - Path to the ML model: /home/mlcuser/MLC/repos/local/cache/download-file_e94966f8/resnet50_v1.onnx [2025-02-08 08:38:49,067 module.py:560 INFO] - * mlcr get,dataset,image-classification,imagenet,preprocessed,_NCHW,_full [2025-02-08 08:38:49,067 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:49,067 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,dataset,image-classification,imagenet,preprocessed,_NCHW,_full [2025-02-08 08:38:49,067 module.py:667 DEBUG] - - Number of cached script outputs found: 0 [2025-02-08 08:38:49,067 module.py:816 DEBUG] - - Found script::get-preprocessed-dataset-imagenet, f259d490bbaf45f5 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-preprocessed-dataset-imagenet [2025-02-08 08:38:49,067 module.py:2412 DEBUG] - Prepared variations: _NCHW,_full,_validation,_mlcommons-reference-preprocessor,_resolution.224 [2025-02-08 08:38:49,070 module.py:4823 DEBUG] - - Checking if script execution is already cached ... [2025-02-08 08:38:49,070 module.py:4856 DEBUG] - - Prepared explicit variations: _NCHW,_full [2025-02-08 08:38:49,070 module.py:4875 DEBUG] - - Prepared variations: _NCHW,_full [2025-02-08 08:38:49,070 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,dataset,image-classification,imagenet,preprocessed,ILSVRC,_NCHW,_full [2025-02-08 08:38:49,070 module.py:1369 DEBUG] - - Creating new "cache" script artifact in the MLC local repository ... 
[2025-02-08 08:38:49,070 module.py:1372 DEBUG] - - Tags: tmp,get,dataset,image-classification,imagenet,preprocessed,ILSVRC,_NCHW,_full,script-item-f259d490bbaf45f5 [2025-02-08 08:38:49,071 module.py:1400 DEBUG] - - Changing to /home/mlcuser/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_34b7b905 [2025-02-08 08:38:49,072 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts: [2025-02-08 08:38:49,080 module.py:560 INFO] - * mlcr get,python3 [2025-02-08 08:38:49,080 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:49,080 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python3 [2025-02-08 08:38:49,080 module.py:667 DEBUG] - - Number of cached script outputs found: 1 [2025-02-08 08:38:49,080 module.py:816 DEBUG] - - Found script::get-python3, d0b5dd74373f4a62 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-python3 [2025-02-08 08:38:49,081 module.py:4823 DEBUG] - - Checking if script execution is already cached ... [2025-02-08 08:38:49,081 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python3,python,get-python,get-python3 [2025-02-08 08:38:49,081 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c [2025-02-08 08:38:49,081 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:49,081 module.py:1261 DEBUG] - - Loading state from cached entry ... [2025-02-08 08:38:49,081 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c/mlc-cached-state.json [2025-02-08 08:38:49,081 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts: [2025-02-08 08:38:49,081 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts: [2025-02-08 08:38:49,082 module.py:2192 INFO] - - running time of script "get,python,python3,get-python,get-python3": 0.01 sec. 
[2025-02-08 08:38:49,082 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3 [2025-02-08 08:38:49,082 module.py:2220 INFO] - Python version: 3.10.12 [2025-02-08 08:38:49,093 module.py:560 INFO] - * mlcr get,dataset,image-classification,original,_full [2025-02-08 08:38:49,093 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:49,093 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,dataset,image-classification,original,_full [2025-02-08 08:38:49,093 module.py:667 DEBUG] - - Number of cached script outputs found: 0 [2025-02-08 08:38:49,093 module.py:816 DEBUG] - - Found script::get-dataset-imagenet-val, 7afd58d287fe4f11 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-dataset-imagenet-val [2025-02-08 08:38:49,093 module.py:2412 DEBUG] - Prepared variations: _full,_2012 [2025-02-08 08:38:49,095 module.py:4823 DEBUG] - - Checking if script execution is already cached ... [2025-02-08 08:38:49,095 module.py:4856 DEBUG] - - Prepared explicit variations: _full [2025-02-08 08:38:49,095 module.py:4875 DEBUG] - - Prepared variations: _full [2025-02-08 08:38:49,095 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,dataset,image-classification,original,val,validation,imagenet,ILSVRC,_full [2025-02-08 08:38:49,096 module.py:1369 DEBUG] - - Creating new "cache" script artifact in the MLC local repository ... 
[2025-02-08 08:38:49,096 module.py:1372 DEBUG] - - Tags: tmp,get,dataset,image-classification,original,val,validation,imagenet,ILSVRC,_full,script-item-7afd58d287fe4f11 [2025-02-08 08:38:49,096 module.py:1400 DEBUG] - - Changing to /home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-val_52100728 [2025-02-08 08:38:49,097 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts: [2025-02-08 08:38:49,103 module.py:560 INFO] - * mlcr detect,os [2025-02-08 08:38:49,103 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:49,103 module.py:816 DEBUG] - - Found script::detect-os, 863735b7db8c44fc in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os [2025-02-08 08:38:49,105 module.py:1759 DEBUG] - - Running preprocess ... [2025-02-08 08:38:49,109 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh" from temporal script "tmp-run.sh" in "/home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-val_52100728" ... [2025-02-08 08:38:49,109 module.py:5340 INFO] - ! cd /home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-val_52100728 [2025-02-08 08:38:49,109 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh [2025-02-08 08:38:49,124 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py [2025-02-08 08:38:49,124 module.py:5550 DEBUG] - - Running postprocess ... [2025-02-08 08:38:49,130 module.py:2192 INFO] - - running time of script "detect-os,detect,os,info": 0.03 sec. [2025-02-08 08:38:49,131 module.py:1637 DEBUG] - - Processing env after dependencies ... [2025-02-08 08:38:49,131 module.py:1759 DEBUG] - - Running preprocess ... 
[2025-02-08 08:38:49,135 module.py:1838 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:49,148 module.py:560 INFO] - * mlcr download-and-extract,file,_extract,_url.https://image-net.org/data/ILSVRC/2012/ILSVRC2012_img_val.tar [2025-02-08 08:38:49,148 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:49,149 module.py:816 DEBUG] - - Found script::download-and-extract, c67e81a4ce2649f5 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/download-and-extract [2025-02-08 08:38:49,149 module.py:2412 DEBUG] - Prepared variations: _extract,_url.https://image-net.org/data/ILSVRC/2012/ILSVRC2012_img_val.tar,_cmutil,_keep [2025-02-08 08:38:49,166 module.py:1759 DEBUG] - - Running preprocess ... [2025-02-08 08:38:49,171 module.py:1838 DEBUG] - - Checking prehook dependencies on other MLC scripts: [2025-02-08 08:38:49,183 module.py:560 INFO] - * mlcr download,file,_cmutil,_url.https://image-net.org/data/ILSVRC/2012/ILSVRC2012_img_val.tar [2025-02-08 08:38:49,183 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:49,184 module.py:816 DEBUG] - - Found script::download-file, 9cdc8dc41aae437e in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/download-file [2025-02-08 08:38:49,184 module.py:2412 DEBUG] - Prepared variations: _cmutil,_url.https://image-net.org/data/ILSVRC/2012/ILSVRC2012_img_val.tar [2025-02-08 08:38:49,188 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts: [2025-02-08 08:38:49,195 module.py:560 INFO] - * mlcr detect,os [2025-02-08 08:38:49,195 module.py:592 DEBUG] - - Number of scripts found: 1 [2025-02-08 08:38:49,195 module.py:816 DEBUG] - - Found script::detect-os, 863735b7db8c44fc in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os [2025-02-08 08:38:49,196 module.py:1759 DEBUG] - - Running preprocess ... 
[2025-02-08 08:38:49,201 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh" from temporal script "tmp-run.sh" in "/home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-val_52100728" ... [2025-02-08 08:38:49,201 module.py:5340 INFO] - ! cd /home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-val_52100728 [2025-02-08 08:38:49,201 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh [2025-02-08 08:38:49,219 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py [2025-02-08 08:38:49,219 module.py:5550 DEBUG] - - Running postprocess ... [2025-02-08 08:38:49,228 module.py:2192 INFO] - - running time of script "detect-os,detect,os,info": 0.04 sec. [2025-02-08 08:38:49,229 module.py:1637 DEBUG] - - Processing env after dependencies ... [2025-02-08 08:38:49,230 module.py:1759 DEBUG] - - Running preprocess ...
Downloading from https://image-net.org/data/ILSVRC/2012/ILSVRC2012_img_val.tar Downloading to /home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-val_52100728/ILSVRC2012_img_val.tar
Downloaded: 10%
Can you try using the --download_dataset_to_host=yes option? For ImageNet, this option will be set by default from now on.
--download_dataset_to_host=yes
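For reference, the suggested flag would be appended to the original invocation so the dataset is fetched (and cached) on the host rather than re-downloaded inside the container each run. A sketch, assuming the same command reported in this issue:

```shell
# Same run as reported above, plus --download_dataset_to_host=yes so the
# ImageNet download lands in the host-side MLC cache and is mounted into
# the container instead of being re-downloaded there.
mlcr run-mlperf,inference,_find-performance,_full,_r5.0-dev \
    --model=resnet50 --implementation=reference --framework=onnxruntime \
    --category=edge --scenario=Offline --execution_mode=test --device=cpu \
    --docker --quiet --test_query_count=1000 \
    --download_dataset_to_host=yes
```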
Running loadgen scenario: Offline and mode: performance
[2025-02-08 14:08:35,354 module.py:560 INFO] - * mlcr build,dockerfile
[2025-02-08 14:08:35,359 module.py:560 INFO] - * mlcr get,docker
[2025-02-08 14:08:35,362 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-docker_9e55fd95\mlc-cached-state.json
mlc pull repo && mlcr --tags=app,mlperf,inference,generic,_reference,_resnet50,_onnxruntime,_cpu,_test,_r5.0-dev_default,_offline --quiet=true --env.MLC_QUIET=yes --env.MLC_WINDOWS=yes --env.MLC_MLPERF_IMPLEMENTATION=reference --env.MLC_MLPERF_MODEL=resnet50 --env.MLC_MLPERF_RUN_STYLE=test --env.MLC_MLPERF_SKIP_SUBMISSION_GENERATION=False --env.MLC_DOCKER_PRIVILEGED_MODE=True --env.MLC_MLPERF_SUBMISSION_DIVISION=open --env.MLC_MLPERF_INFERENCE_TP_SIZE=1 --env.MLC_MLPERF_SUBMISSION_SYSTEM_TYPE=edge --env.MLC_MLPERF_DEVICE=cpu --env.MLC_MLPERF_USE_DOCKER=True --env.MLC_MLPERF_BACKEND=onnxruntime --env.MLC_MLPERF_LOADGEN_SCENARIO=Offline --env.MLC_TEST_QUERY_COUNT=1000 --env.MLC_MLPERF_FIND_PERFORMANCE_MODE=yes --env.MLC_MLPERF_LOADGEN_ALL_MODES=no --env.MLC_MLPERF_LOADGEN_MODE=performance --env.MLC_MLPERF_RESULT_PUSH_TO_GITHUB=False --env.MLC_MLPERF_SUBMISSION_GENERATION_STYLE=full --env.MLC_MLPERF_INFERENCE_VERSION=5.0-dev --env.MLC_RUN_MLPERF_INFERENCE_APP_DEFAULTS=r5.0-dev_default --env.MLC_MLPERF_SUBMISSION_CHECKER_VERSION=v5.0 --env.MLC_MLPERF_INFERENCE_SOURCE_VERSION=5.0.15 --env.MLC_MLPERF_LAST_RELEASE=v5.0 --env.+PYTHONPATH,=C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\get-mlperf-inference-utils --env.MLC_MLPERF_INFERENCE_RESULTS_VERSION=r5.0-dev --env.MLC_MODEL=resnet50 --env.MLC_MLPERF_LOADGEN_COMPLIANCE=no --env.MLC_MLPERF_LOADGEN_EXTRA_OPTIONS= --env.MLC_MLPERF_LOADGEN_SCENARIOS,=Offline --env.MLC_MLPERF_LOADGEN_MODES,=performance --env.MLC_OUTPUT_FOLDER_NAME=test_results --add_deps_recursive.coco2014-original.tags=_full --add_deps_recursive.coco2014-preprocessed.tags=_full --add_deps_recursive.imagenet-original.tags=_full --add_deps_recursive.imagenet-preprocessed.tags=_full --add_deps_recursive.openimages-original.tags=_full --add_deps_recursive.openimages-preprocessed.tags=_full --add_deps_recursive.openorca-original.tags=_full --add_deps_recursive.openorca-preprocessed.tags=_full --add_deps_recursive.coco2014-dataset.tags=_full 
--add_deps_recursive.igbh-dataset.tags=_full --add_deps_recursive.get-mlperf-inference-results-dir.tags=_version.r5.0-dev --add_deps_recursive.get-mlperf-inference-submission-dir.tags=_version.r5.0-dev --add_deps_recursive.mlperf-inference-nvidia-scratch-space.tags=_version.r5.0-dev --v=False --print_env=False --print_deps=False --dump_version_info=True --quiet
Dockerfile written at C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\app-mlperf-inference\dockerfiles\ubuntu_22.04.Dockerfile
[2025-02-08 14:08:35,440 docker.py:191 INFO] - Dockerfile generated at C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\app-mlperf-inference\dockerfiles\ubuntu_22.04.Dockerfile
[2025-02-08 14:08:35,508 module.py:560 INFO] - * mlcr get,docker
[2025-02-08 14:08:35,510 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-docker_9e55fd95\mlc-cached-state.json
[2025-02-08 14:08:35,516 module.py:560 INFO] - * mlcr get,mlperf,inference,submission,dir,local,_version.r5.0-dev
[2025-02-08 14:08:35,520 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-mlperf-inference-submission-dir_c73e05d3\mlc-cached-state.json
[2025-02-08 14:08:35,525 module.py:560 INFO] - * mlcr run,docker,container
[2025-02-08 14:08:35,531 module.py:560 INFO] - * mlcr get,docker
[2025-02-08 14:08:35,532 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-docker_9e55fd95\mlc-cached-state.json
Checking existing Docker container:
docker ps --format "{{ .ID }}," --filter "ancestor=localhost/local/mlc-script-app-mlperf-inference-generic--reference--resnet50--onnxruntime--cpu--test--r5.0-dev-default--offline:ubuntu-22.04-latest" 2> nul
Checking Docker images:
docker images -q localhost/local/mlc-script-app-mlperf-inference-generic--reference--resnet50--onnxruntime--cpu--test--r5.0-dev-default--offline:ubuntu-22.04-latest 2> nul
Docker image exists with ID: f30ca6880cbb
[2025-02-08 14:08:36,009 module.py:560 INFO] - * mlcr get,docker
[2025-02-08 14:08:36,011 module.py:1274 INFO] - ! load C:\Users\CSEMA\MLC\repos\local\cache\get-docker_9e55fd95\mlc-cached-state.json
[2025-02-08 14:08:36,013 module.py:5487 INFO] - ! call "postprocess" from C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\run-docker-container\customize.py
Container launch command:
docker run -it --entrypoint "" --shm-size=32gb --cap-add SYS_ADMIN --cap-add SYS_TIME --security-opt apparmor=unconfined --security-opt seccomp=unconfined --dns 8.8.8.8 --dns 8.8.4.4 -v "C:\Users\CSEMA\MLC\repos\local\cache\get-mlperf-inference-results-dir_b14e6733":/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733 -v "C:\Users\CSEMA\MLC\repos\local\cache\get-mlperf-inference-submission-dir_c73e05d3":/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-submission-dir_c73e05d3 localhost/local/mlc-script-app-mlperf-inference-generic--reference--resnet50--onnxruntime--cpu--test--r5.0-dev-default--offline:ubuntu-22.04-latest bash -c "(mlc pull repo && mlcr --tags=app,mlperf,inference,generic,_reference,_resnet50,_onnxruntime,_cpu,_test,_r5.0-dev_default,_offline --quiet=true --env.MLC_QUIET=yes --env.MLC_WINDOWS=yes --env.MLC_MLPERF_IMPLEMENTATION=reference --env.MLC_MLPERF_MODEL=resnet50 --env.MLC_MLPERF_RUN_STYLE=test --env.MLC_MLPERF_SKIP_SUBMISSION_GENERATION=False --env.MLC_DOCKER_PRIVILEGED_MODE=True --env.MLC_MLPERF_SUBMISSION_DIVISION=open --env.MLC_MLPERF_INFERENCE_TP_SIZE=1 --env.MLC_MLPERF_SUBMISSION_SYSTEM_TYPE=edge --env.MLC_MLPERF_DEVICE=cpu --env.MLC_MLPERF_USE_DOCKER=True --env.MLC_MLPERF_BACKEND=onnxruntime --env.MLC_MLPERF_LOADGEN_SCENARIO=Offline --env.MLC_TEST_QUERY_COUNT=1000 --env.MLC_MLPERF_FIND_PERFORMANCE_MODE=yes --env.MLC_MLPERF_LOADGEN_ALL_MODES=no --env.MLC_MLPERF_LOADGEN_MODE=performance --env.MLC_MLPERF_RESULT_PUSH_TO_GITHUB=False --env.MLC_MLPERF_SUBMISSION_GENERATION_STYLE=full --env.MLC_MLPERF_INFERENCE_VERSION=5.0-dev --env.MLC_RUN_MLPERF_INFERENCE_APP_DEFAULTS=r5.0-dev_default --env.MLC_MLPERF_SUBMISSION_CHECKER_VERSION=v5.0 --env.MLC_MLPERF_INFERENCE_SOURCE_VERSION=5.0.15 --env.MLC_MLPERF_LAST_RELEASE=v5.0 --env.+PYTHONPATH,=C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\get-mlperf-inference-utils --env.MLC_MLPERF_INFERENCE_RESULTS_VERSION=r5.0-dev --env.MLC_TMP_CURRENT_PATH=S:\A\mlcflow 
--env.MLC_TMP_CURRENT_SCRIPT_REPO_PATH=C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations --env.MLC_TMP_CURRENT_SCRIPT_REPO_PATH_WITH_PREFIX=C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations --env.MLC_TMP_CURRENT_SCRIPT_PATH=C:\Users\CSEMA/MLC/repos\mlcommons@mlperf-automations\script\run-mlperf-inference-app --env.MLC_TMP_PIP_VERSION_STRING= --env.MLC_MODEL=resnet50 --env.MLC_MLPERF_LOADGEN_COMPLIANCE=no --env.MLC_MLPERF_LOADGEN_EXTRA_OPTIONS= --env.MLC_MLPERF_LOADGEN_SCENARIOS,=Offline --env.MLC_MLPERF_LOADGEN_MODES,=performance --env.MLC_OUTPUT_FOLDER_NAME=test_results --add_deps_recursive.coco2014-original.tags=_full --add_deps_recursive.coco2014-preprocessed.tags=_full --add_deps_recursive.imagenet-original.tags=_full --add_deps_recursive.imagenet-preprocessed.tags=_full --add_deps_recursive.openimages-original.tags=_full --add_deps_recursive.openimages-preprocessed.tags=_full --add_deps_recursive.openorca-original.tags=_full --add_deps_recursive.openorca-preprocessed.tags=_full --add_deps_recursive.coco2014-dataset.tags=_full --add_deps_recursive.igbh-dataset.tags=_full --add_deps_recursive.get-mlperf-inference-results-dir.tags=_version.r5.0-dev --add_deps_recursive.get-mlperf-inference-submission-dir.tags=_version.r5.0-dev --add_deps_recursive.mlperf-inference-nvidia-scratch-space.tags=_version.r5.0-dev --v=False --print_env=False --print_deps=False --dump_version_info=True --env.MLC_MLPERF_INFERENCE_RESULTS_DIR=/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733 --env.OUTPUT_BASE_DIR=/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733 --env.MLC_MLPERF_INFERENCE_SUBMISSION_DIR=/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-submission-dir_c73e05d3/mlperf-inference-submission && bash ) || bash"
[2025-02-08 08:38:39,292 main.py:1017 INFO] - Repository mlperf-automations already exists at /home/mlcuser/MLC/repos/mlcommons@mlperf-automations. Pulling latest changes...
remote: Enumerating objects: 74, done.
remote: Counting objects: 100% (31/31), done.
remote: Compressing objects: 100% (24/24), done.
remote: Total 74 (delta 16), reused 7 (delta 7), pack-reused 43 (from 2)
Unpacking objects: 100% (74/74), 114.08 KiB | 957.00 KiB/s, done.
From https://github.com/mlcommons/mlperf-automations
eae72d8..dce8947 dev -> origin/dev
Updating eae72d8..dce8947
Fast-forward
.github/workflows/test-mlc-script-features.yml | 4 +++-
.github/workflows/test-mlperf-inference-abtf-poc.yml | 1 +
.github/workflows/test-mlperf-inference-bert-deepsparse-tf-onnxruntime-pytorch.yml | 1 +
.github/workflows/test-mlperf-inference-mlcommons-cpp-resnet50.yml | 1 +
.github/workflows/test-mlperf-inference-rgat.yml | 3 ++-
.github/workflows/test-mlperf-inference-tvm-resnet50.yml | 1 +
.github/workflows/test-mlperf-loadgen-onnx-huggingface-bert-fp32-squad.yml | 2 +-
.github/workflows/test-nvidia-mlperf-inference-implementations.yml | 2 +-
automation/script/docker_utils.py | 2 +-
automation/script/module.py | 18 ++----------------
script/app-mlperf-inference/customize.py | 2 +-
script/app-mlperf-inference/meta.yaml | 5 +++--
script/generate-docs-for-all-scripts.cmd | 1 -
script/generate-mlperf-inference-submission/meta.yaml | 2 +-
script/get-dataset-igbh/customize.py | 2 +-
script/get-dataset-igbh/meta.yaml | 3 +++
script/get-platform-details/meta.yaml | 1 +
script/install-pip-package-for-cmind-python/README.md | 1 -
script/{install-pip-package-for-cmind-python => install-pip-package-for-mlc-python}/COPYRIGHT.md | 0
script/{install-pip-package-for-cmind-python => install-pip-package-for-mlc-python}/customize.py | 0
script/{install-pip-package-for-cmind-python => install-pip-package-for-mlc-python}/meta.yaml | 6 +++---
script/run-docker-container/customize.py | 6 +++++-
script/run-docker-container/meta.yaml | 1 +
script/run-mlperf-automotive-app/meta.yaml | 2 +-
script/run-mlperf-inference-app/meta.yaml | 4 +---
script/run-mlperf-inference-submission-checker/customize.py | 2 --
script/run-mlperf-inference-submission-checker/meta.yaml | 10 +++++++---
script/submit-mlperf-results/customize.py | 18 ++++++++++++++----
script/tar-my-folder/customize.py | 1 +
script/tar-my-folder/meta.yaml | 2 ++
30 files changed, 59 insertions(+), 45 deletions(-)
delete mode 100644 script/generate-docs-for-all-scripts.cmd
delete mode 100644 script/install-pip-package-for-cmind-python/README.md
rename script/{install-pip-package-for-cmind-python => install-pip-package-for-mlc-python}/COPYRIGHT.md (100%)
rename script/{install-pip-package-for-cmind-python => install-pip-package-for-mlc-python}/customize.py (100%)
rename script/{install-pip-package-for-cmind-python => install-pip-package-for-mlc-python}/meta.yaml (71%)
[2025-02-08 08:38:46,147 main.py:1025 INFO] - Repository successfully pulled.
[2025-02-08 08:38:46,148 main.py:1026 INFO] - Registering the repo in repos.json
[2025-02-08 08:38:46,148 main.py:1046 INFO] - No changes made to repos.json.
[2025-02-08 08:38:48,397 module.py:560 INFO] - * mlcr app,mlperf,inference,generic,_reference,_resnet50,_onnxruntime,_cpu,_test,_r5.0-dev_default,_offline
[2025-02-08 08:38:48,397 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,398 module.py:816 DEBUG] - - Found script::app-mlperf-inference, d775cac873ee4231 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/app-mlperf-inference
[2025-02-08 08:38:48,398 module.py:2412 DEBUG] - Prepared variations: _reference,_resnet50,_onnxruntime,_cpu,_test,_r5.0-dev_default,_offline,_float32
[2025-02-08 08:38:48,400 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts:
[2025-02-08 08:38:48,408 module.py:560 INFO] - * mlcr detect,os
[2025-02-08 08:38:48,408 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,408 module.py:816 DEBUG] - - Found script::detect-os, 863735b7db8c44fc in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os
[2025-02-08 08:38:48,411 module.py:1759 DEBUG] - - Running preprocess ...
[2025-02-08 08:38:48,414 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh" from temporal script "tmp-run.sh" in "/home/mlcuser" ...
[2025-02-08 08:38:48,414 module.py:5340 INFO] - ! cd /home/mlcuser
[2025-02-08 08:38:48,414 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-02-08 08:38:48,442 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-02-08 08:38:48,442 module.py:5550 DEBUG] - - Running postprocess ...
[2025-02-08 08:38:48,459 module.py:2192 INFO] - - running time of script "detect-os,detect,os,info": 0.06 sec.
[2025-02-08 08:38:48,465 module.py:560 INFO] - * mlcr get,sys-utils-cm
[2025-02-08 08:38:48,465 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,465 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,sys-utils-cm
[2025-02-08 08:38:48,465 module.py:667 DEBUG] - - Number of cached script outputs found: 1
[2025-02-08 08:38:48,465 module.py:816 DEBUG] - - Found script::get-sys-utils-cm, bc90993277e84b8e in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-sys-utils-cm
[2025-02-08 08:38:48,466 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:48,466 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,sys-utils-cm,sys-utils-mlc
[2025-02-08 08:38:48,467 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-sys-utils-cm_edc7d689
[2025-02-08 08:38:48,467 module.py:1228 DEBUG] - - Checking dynamic dependencies on other MLC scripts:
[2025-02-08 08:38:48,467 module.py:1238 DEBUG] - - Processing env after dependencies ...
[2025-02-08 08:38:48,467 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:48,467 module.py:1261 DEBUG] - - Loading state from cached entry ...
[2025-02-08 08:38:48,468 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-sys-utils-cm_edc7d689/mlc-cached-state.json
[2025-02-08 08:38:48,468 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts:
[2025-02-08 08:38:48,468 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts:
[2025-02-08 08:38:48,468 module.py:2192 INFO] - - running time of script "get,sys-utils-cm,sys-utils-mlc": 0.01 sec.
[2025-02-08 08:38:48,477 module.py:560 INFO] - * mlcr get,python
[2025-02-08 08:38:48,477 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,477 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python
[2025-02-08 08:38:48,477 module.py:667 DEBUG] - - Number of cached script outputs found: 1
[2025-02-08 08:38:48,477 module.py:816 DEBUG] - - Found script::get-python3, d0b5dd74373f4a62 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-python3
[2025-02-08 08:38:48,479 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:48,480 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python,python3,get-python,get-python3
[2025-02-08 08:38:48,480 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c
[2025-02-08 08:38:48,480 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:48,480 module.py:1261 DEBUG] - - Loading state from cached entry ...
[2025-02-08 08:38:48,481 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c/mlc-cached-state.json
[2025-02-08 08:38:48,481 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts:
[2025-02-08 08:38:48,481 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts:
[2025-02-08 08:38:48,481 module.py:2192 INFO] - - running time of script "get,python,python3,get-python,get-python3": 0.01 sec.
[2025-02-08 08:38:48,482 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3
[2025-02-08 08:38:48,482 module.py:2220 INFO] - Python version: 3.10.12
[2025-02-08 08:38:48,499 module.py:560 INFO] - * mlcr get,mlcommons,inference,src
[2025-02-08 08:38:48,499 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,499 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,mlcommons,inference,src
[2025-02-08 08:38:48,499 module.py:667 DEBUG] - - Number of cached script outputs found: 2
[2025-02-08 08:38:48,500 module.py:816 DEBUG] - - Found script::get-mlperf-inference-src, 4b57186581024797 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-src
[2025-02-08 08:38:48,500 module.py:2412 DEBUG] - Prepared variations: _short-history
[2025-02-08 08:38:48,500 module.py:970 DEBUG] - - Requested version: == r5.0
[2025-02-08 08:38:48,502 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:48,502 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,mlcommons,inference,src,source,inference-src,inference-source,mlperf,version-r5.0
[2025-02-08 08:38:48,502 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-src_0d0695b9
[2025-02-08 08:38:48,502 module.py:1228 DEBUG] - - Checking dynamic dependencies on other MLC scripts:
[2025-02-08 08:38:48,502 module.py:1238 DEBUG] - - Processing env after dependencies ...
[2025-02-08 08:38:48,502 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:48,502 module.py:1261 DEBUG] - - Loading state from cached entry ...
[2025-02-08 08:38:48,504 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-src_0d0695b9/mlc-cached-state.json
[2025-02-08 08:38:48,504 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts:
[2025-02-08 08:38:48,504 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts:
[2025-02-08 08:38:48,504 module.py:2192 INFO] - - running time of script "get,src,source,inference,inference-src,inference-source,mlperf,mlcommons": 0.02 sec.
[2025-02-08 08:38:48,510 module.py:560 INFO] - * mlcr get,mlperf,inference,utils
[2025-02-08 08:38:48,511 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,511 module.py:816 DEBUG] - - Found script::get-mlperf-inference-utils, e341e5f86d8342e5 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-utils
[2025-02-08 08:38:48,512 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts:
[2025-02-08 08:38:48,529 module.py:560 INFO] - * mlcr get,mlperf,inference,src
[2025-02-08 08:38:48,529 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,529 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,mlperf,inference,src
[2025-02-08 08:38:48,530 module.py:667 DEBUG] - - Number of cached script outputs found: 2
[2025-02-08 08:38:48,530 module.py:816 DEBUG] - - Found script::get-mlperf-inference-src, 4b57186581024797 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-src
[2025-02-08 08:38:48,530 module.py:2412 DEBUG] - Prepared variations: _short-history
[2025-02-08 08:38:48,530 module.py:970 DEBUG] - - Requested version: == r5.0
[2025-02-08 08:38:48,531 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:48,531 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,mlperf,inference,src,source,inference-src,inference-source,mlcommons,version-r5.0
[2025-02-08 08:38:48,532 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-src_0d0695b9
[2025-02-08 08:38:48,532 module.py:1228 DEBUG] - - Checking dynamic dependencies on other MLC scripts:
[2025-02-08 08:38:48,532 module.py:1238 DEBUG] - - Processing env after dependencies ...
[2025-02-08 08:38:48,533 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:48,533 module.py:1261 DEBUG] - - Loading state from cached entry ...
[2025-02-08 08:38:48,533 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-src_0d0695b9/mlc-cached-state.json
[2025-02-08 08:38:48,534 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts:
[2025-02-08 08:38:48,534 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts:
[2025-02-08 08:38:48,534 module.py:2192 INFO] - - running time of script "get,src,source,inference,inference-src,inference-source,mlperf,mlcommons": 0.02 sec.
[2025-02-08 08:38:48,535 module.py:1637 DEBUG] - - Processing env after dependencies ...
[2025-02-08 08:38:48,536 module.py:1759 DEBUG] - - Running preprocess ...
[2025-02-08 08:38:48,540 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-utils/customize.py
[2025-02-08 08:38:48,540 module.py:5550 DEBUG] - - Running postprocess ...
[2025-02-08 08:38:48,543 module.py:2192 INFO] - - running time of script "get,mlperf,inference,util,utils,functions": 0.04 sec.
[2025-02-08 08:38:48,552 module.py:560 INFO] - * mlcr get,dataset-aux,imagenet-aux
[2025-02-08 08:38:48,552 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,552 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,dataset-aux,imagenet-aux
[2025-02-08 08:38:48,552 module.py:667 DEBUG] - - Number of cached script outputs found: 1
[2025-02-08 08:38:48,552 module.py:816 DEBUG] - - Found script::get-dataset-imagenet-aux, bb2c6dd8c8c64217 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-dataset-imagenet-aux
[2025-02-08 08:38:48,553 module.py:2412 DEBUG] - Prepared variations: _from.berkeleyvision,_2012
[2025-02-08 08:38:48,553 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:48,553 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,dataset-aux,imagenet-aux,aux,image-classification
[2025-02-08 08:38:48,554 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-aux_8c5cefe0
[2025-02-08 08:38:48,554 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:48,554 module.py:1261 DEBUG] - - Loading state from cached entry ...
[2025-02-08 08:38:48,555 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-aux_8c5cefe0/mlc-cached-state.json
[2025-02-08 08:38:48,555 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts:
[2025-02-08 08:38:48,555 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts:
[2025-02-08 08:38:48,555 module.py:2192 INFO] - - running time of script "get,aux,dataset-aux,image-classification,imagenet-aux": 0.01 sec.
[2025-02-08 08:38:48,556 module.py:1637 DEBUG] - - Processing env after dependencies ...
[2025-02-08 08:38:48,567 module.py:1759 DEBUG] - - Running preprocess ...
[2025-02-08 08:38:48,574 module.py:1834 DEBUG] - {
"+MLC_HOST_OS_DEFAULT_LIBRARY_PATH": [
"/usr/local/lib/x86_64-linux-gnu",
"/lib/x86_64-linux-gnu",
"/usr/lib/x86_64-linux-gnu",
"/usr/lib/x86_64-linux-gnu64",
"/usr/local/lib64",
"/lib64",
"/usr/lib64",
"/usr/local/lib",
"/lib",
"/usr/lib",
"/usr/x86_64-linux-gnu/lib64",
"/usr/x86_64-linux-gnu/lib"
],
"+PYTHONPATH": [
"/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/vision/classification_and_detection/python",
"/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/tools/submission",
"/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-mlperf-inference-utils"
],
"MLC_CNNDM_ACCURACY_DTYPE": "int32",
"MLC_DATASET_AUX_PATH": "/home/mlcuser/MLC/repos/local/cache/extract-file_5e1e9382",
"MLC_DATASET_AUX_VER": "2012",
"MLC_DOCKER_PRIVILEGED_MODE": "True",
"MLC_ENV_NVMITTEN_DOCKER_WHEEL_PATH": "/opt/nvmitten-0.1.3b0-cp38-cp38-linux_x86_64.whl",
"MLC_GET_PLATFORM_DETAILS": true,
"MLC_HOST_OS_BITS": "64",
"MLC_HOST_OS_FLAVOR": "ubuntu",
"MLC_HOST_OS_FLAVOR_LIKE": "debian",
"MLC_HOST_OS_GLIBC_VERSION": "2.35",
"MLC_HOST_OS_KERNEL_VERSION": "5.15.167.4-microsoft-standard-WSL2",
"MLC_HOST_OS_MACHINE": "x86_64",
"MLC_HOST_OS_PACKAGE_MANAGER": "apt",
"MLC_HOST_OS_PACKAGE_MANAGER_INSTALL_CMD": "DEBIAN_FRONTEND=noninteractive apt-get install -y",
"MLC_HOST_OS_PACKAGE_MANAGER_UPDATE_CMD": "apt-get update -y",
"MLC_HOST_OS_TYPE": "linux",
"MLC_HOST_OS_VERSION": "22.04",
"MLC_HOST_PLATFORM_FLAVOR": "x86_64",
"MLC_HOST_PYTHON_BITS": "64",
"MLC_HOST_SYSTEM_NAME": "e85017639ed6",
"MLC_IMAGENET_ACCURACY_DTYPE": "float32",
"MLC_LIBRISPEECH_ACCURACY_DTYPE": "float32",
"MLC_MLPERF_BACKEND": "onnxruntime",
"MLC_MLPERF_DEVICE": "cpu",
"MLC_MLPERF_FIND_PERFORMANCE_MODE": "yes",
"MLC_MLPERF_IMPLEMENTATION": "mlcommons_python",
"MLC_MLPERF_INFERENCE_3DUNET_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/vision/medical_imaging/3d-unet-kits19",
"MLC_MLPERF_INFERENCE_BERT_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/language/bert",
"MLC_MLPERF_INFERENCE_CLASSIFICATION_AND_DETECTION_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/vision/classification_and_detection",
"MLC_MLPERF_INFERENCE_CONF_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/mlperf.conf",
"MLC_MLPERF_INFERENCE_DLRM_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/recommendation/dlrm",
"MLC_MLPERF_INFERENCE_DLRM_V2_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/recommendation/dlrm_v2",
"MLC_MLPERF_INFERENCE_GPTJ_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/language/gpt-j",
"MLC_MLPERF_INFERENCE_POINTPAINTING_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/automotive/3d-object-detection",
"MLC_MLPERF_INFERENCE_RESULTS_DIR": "/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733",
"MLC_MLPERF_INFERENCE_RESULTS_VERSION": "r5.0-dev",
"MLC_MLPERF_INFERENCE_RGAT_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/graph/R-GAT",
"MLC_MLPERF_INFERENCE_RNNT_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/speech_recognition/rnnt",
"MLC_MLPERF_INFERENCE_SOURCE": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference",
"MLC_MLPERF_INFERENCE_SOURCE_VERSION": "5.0.15",
"MLC_MLPERF_INFERENCE_SUBMISSION_DIR": "/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-submission-dir_c73e05d3/mlperf-inference-submission",
"MLC_MLPERF_INFERENCE_TP_SIZE": "1",
"MLC_MLPERF_INFERENCE_VERSION": "5.0-dev",
"MLC_MLPERF_INFERENCE_VISION_PATH": "/home/mlcuser/MLC/repos/local/cache/get-git-repo_a010f5ba/inference/vision",
"MLC_MLPERF_LAST_RELEASE": "v5.0",
"MLC_MLPERF_LOADGEN_ALL_MODES": "no",
"MLC_MLPERF_LOADGEN_COMPLIANCE": "no",
"MLC_MLPERF_LOADGEN_EXTRA_OPTIONS": "",
"MLC_MLPERF_LOADGEN_MODE": "performance",
"MLC_MLPERF_LOADGEN_SCENARIO": "Offline",
"MLC_MLPERF_MODEL": "resnet50",
"MLC_MLPERF_MODEL_EQUAL_ISSUE_MODE": "no",
"MLC_MLPERF_MODEL_PRECISION": "float32",
"MLC_MLPERF_PRINT_SUMMARY": "no",
"MLC_MLPERF_PYTHON": "yes",
"MLC_MLPERF_QUANTIZATION": false,
"MLC_MLPERF_RESULT_PUSH_TO_GITHUB": "False",
"MLC_MLPERF_RUN_STYLE": "test",
"MLC_MLPERF_SKIP_SUBMISSION_GENERATION": "False",
"MLC_MLPERF_SUBMISSION_CHECKER_VERSION": "v5.0",
"MLC_MLPERF_SUBMISSION_DIVISION": "open",
"MLC_MLPERF_SUBMISSION_GENERATION_STYLE": "full",
"MLC_MLPERF_SUBMISSION_SYSTEM_TYPE": "edge",
"MLC_MLPERF_USE_DOCKER": "True",
"MLC_MODEL": "resnet50",
"MLC_OPENIMAGES_ACCURACY_DTYPE": "float32",
"MLC_OUTPUT_FOLDER_NAME": "test_results",
"MLC_PYTHON_BIN": "python3",
"MLC_PYTHON_BIN_PATH": "/home/mlcuser/venv/mlc/bin",
"MLC_PYTHON_BIN_WITH_PATH": "/home/mlcuser/venv/mlc/bin/python3",
"MLC_PYTHON_CACHE_TAGS": "version-3.10.12,non-virtual",
"MLC_PYTHON_MAJOR_VERSION": "3",
"MLC_PYTHON_MINOR_VERSION": "10",
"MLC_PYTHON_PATCH_VERSION": "12",
"MLC_PYTHON_VERSION": "3.10.12",
"MLC_QUIET": "yes",
"MLC_REGENERATE_MEASURE_FILES": "yes",
"MLC_RUN_MLPERF_INFERENCE_APP_DEFAULTS": "r5.0-dev_default",
"MLC_SKIP_SYS_UTILS": "yes",
"MLC_SQUAD_ACCURACY_DTYPE": "float32",
"MLC_TEST_QUERY_COUNT": "1000",
"MLC_TMP_CURRENT_PATH": "/home/mlcuser",
"MLC_TMP_CURRENT_SCRIPT_PATH": "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/app-mlperf-inference",
"MLC_TMP_CURRENT_SCRIPT_REPO_PATH": "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations",
"MLC_TMP_CURRENT_SCRIPT_REPO_PATH_WITH_PREFIX": "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations",
"MLC_TMP_PIP_VERSION_STRING": "",
"MLC_VERBOSE": "yes",
"MLC_WINDOWS": "yes",
"OUTPUT_BASE_DIR": "/home/mlcuser/MLC/repos/local/cache/get-mlperf-inference-results-dir_b14e6733"
}
[2025-02-08 08:38:48,574 module.py:1838 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:48,681 module.py:560 INFO] - * mlcr app,mlperf,reference,inference,_resnet50,_onnxruntime,_cpu,_offline,_fp32
[2025-02-08 08:38:48,682 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,682 module.py:816 DEBUG] - - Found script::app-mlperf-inference-mlcommons-python, ff149e9781fc4b65 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/app-mlperf-inference-mlcommons-python
[2025-02-08 08:38:48,682 module.py:2412 DEBUG] - Prepared variations: _resnet50,_onnxruntime,_cpu,_offline,_fp32,_python
[2025-02-08 08:38:48,685 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts:
[2025-02-08 08:38:48,691 module.py:560 INFO] - * mlcr detect,os
[2025-02-08 08:38:48,692 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,692 module.py:816 DEBUG] - - Found script::detect-os, 863735b7db8c44fc in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os
[2025-02-08 08:38:48,693 module.py:1759 DEBUG] - - Running preprocess ...
[2025-02-08 08:38:48,696 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh" from temporal script "tmp-run.sh" in "/home/mlcuser" ...
[2025-02-08 08:38:48,696 module.py:5340 INFO] - ! cd /home/mlcuser
[2025-02-08 08:38:48,696 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-02-08 08:38:48,712 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-02-08 08:38:48,712 module.py:5550 DEBUG] - - Running postprocess ...
[2025-02-08 08:38:48,718 module.py:2192 INFO] - - running time of script "detect-os,detect,os,info": 0.03 sec.
[2025-02-08 08:38:48,723 module.py:560 INFO] - * mlcr detect,cpu
[2025-02-08 08:38:48,723 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,724 module.py:816 DEBUG] - - Found script::detect-cpu, 586c8a43320142f7 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu
[2025-02-08 08:38:48,725 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts:
[2025-02-08 08:38:48,730 module.py:560 INFO] - * mlcr detect,os
[2025-02-08 08:38:48,731 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,731 module.py:816 DEBUG] - - Found script::detect-os, 863735b7db8c44fc in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os
[2025-02-08 08:38:48,732 module.py:1759 DEBUG] - - Running preprocess ...
[2025-02-08 08:38:48,735 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh" from temporal script "tmp-run.sh" in "/home/mlcuser" ...
[2025-02-08 08:38:48,736 module.py:5340 INFO] - ! cd /home/mlcuser
[2025-02-08 08:38:48,736 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-02-08 08:38:48,750 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-02-08 08:38:48,750 module.py:5550 DEBUG] - - Running postprocess ...
[2025-02-08 08:38:48,757 module.py:2192 INFO] - - running time of script "detect-os,detect,os,info": 0.03 sec.
[2025-02-08 08:38:48,758 module.py:1637 DEBUG] - - Processing env after dependencies ...
[2025-02-08 08:38:48,758 module.py:1759 DEBUG] - - Running preprocess ...
[2025-02-08 08:38:48,762 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh" from temporal script "tmp-run.sh" in "/home/mlcuser" ...
[2025-02-08 08:38:48,762 module.py:5340 INFO] - ! cd /home/mlcuser
[2025-02-08 08:38:48,762 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/run.sh from tmp-run.sh
[2025-02-08 08:38:48,809 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-cpu/customize.py
[2025-02-08 08:38:48,809 module.py:5550 DEBUG] - - Running postprocess ...
[2025-02-08 08:38:48,813 module.py:2192 INFO] - - running time of script "detect,cpu,detect-cpu,info": 0.09 sec.
[2025-02-08 08:38:48,820 module.py:560 INFO] - * mlcr get,sys-utils-cm
[2025-02-08 08:38:48,820 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,820 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,sys-utils-cm
[2025-02-08 08:38:48,821 module.py:667 DEBUG] - - Number of cached script outputs found: 1
[2025-02-08 08:38:48,821 module.py:816 DEBUG] - - Found script::get-sys-utils-cm, bc90993277e84b8e in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-sys-utils-cm
[2025-02-08 08:38:48,822 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:48,822 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,sys-utils-cm,sys-utils-mlc
[2025-02-08 08:38:48,823 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-sys-utils-cm_edc7d689
[2025-02-08 08:38:48,823 module.py:1228 DEBUG] - - Checking dynamic dependencies on other MLC scripts:
[2025-02-08 08:38:48,823 module.py:1238 DEBUG] - - Processing env after dependencies ...
[2025-02-08 08:38:48,824 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:48,824 module.py:1261 DEBUG] - - Loading state from cached entry ...
[2025-02-08 08:38:48,824 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-sys-utils-cm_edc7d689/mlc-cached-state.json
[2025-02-08 08:38:48,824 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts:
[2025-02-08 08:38:48,825 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts:
[2025-02-08 08:38:48,825 module.py:2192 INFO] - - running time of script "get,sys-utils-cm,sys-utils-mlc": 0.01 sec.
[2025-02-08 08:38:48,834 module.py:560 INFO] - * mlcr get,python
[2025-02-08 08:38:48,834 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,834 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python
[2025-02-08 08:38:48,834 module.py:667 DEBUG] - - Number of cached script outputs found: 1
[2025-02-08 08:38:48,834 module.py:816 DEBUG] - - Found script::get-python3, d0b5dd74373f4a62 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-python3
[2025-02-08 08:38:48,835 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:48,835 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python,python3,get-python,get-python3
[2025-02-08 08:38:48,836 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c
[2025-02-08 08:38:48,836 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:48,836 module.py:1261 DEBUG] - - Loading state from cached entry ...
[2025-02-08 08:38:48,836 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c/mlc-cached-state.json
[2025-02-08 08:38:48,836 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts:
[2025-02-08 08:38:48,836 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts:
[2025-02-08 08:38:48,837 module.py:2192 INFO] - - running time of script "get,python,python3,get-python,get-python3": 0.01 sec.
[2025-02-08 08:38:48,837 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3
[2025-02-08 08:38:48,837 module.py:2220 INFO] - Python version: 3.10.12
[2025-02-08 08:38:48,895 module.py:560 INFO] - * mlcr get,generic-python-lib,_onnxruntime
[2025-02-08 08:38:48,895 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,895 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,generic-python-lib,_onnxruntime
[2025-02-08 08:38:48,895 module.py:667 DEBUG] - - Number of cached script outputs found: 1
[2025-02-08 08:38:48,895 module.py:816 DEBUG] - - Found script::get-generic-python-lib, 94b62a682bc44791 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib
[2025-02-08 08:38:48,896 module.py:2412 DEBUG] - Prepared variations: _onnxruntime
[2025-02-08 08:38:48,898 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:48,899 module.py:4856 DEBUG] - - Prepared explicit variations: _onnxruntime
[2025-02-08 08:38:48,899 module.py:4875 DEBUG] - - Prepared variations: _onnxruntime
[2025-02-08 08:38:48,899 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,generic-python-lib,install,generic,pip-package,_onnxruntime,deps-python-version-3.10.12,deps-python-non-virtual
[2025-02-08 08:38:48,907 module.py:560 INFO] - * mlcr get,python3
[2025-02-08 08:38:48,907 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:48,907 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python3
[2025-02-08 08:38:48,907 module.py:667 DEBUG] - - Number of cached script outputs found: 1
[2025-02-08 08:38:48,907 module.py:816 DEBUG] - - Found script::get-python3, d0b5dd74373f4a62 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-python3
[2025-02-08 08:38:48,908 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:48,908 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python3,python,get-python,get-python3
[2025-02-08 08:38:48,908 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c
[2025-02-08 08:38:48,908 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:48,908 module.py:1261 DEBUG] - - Loading state from cached entry ...
[2025-02-08 08:38:48,908 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c/mlc-cached-state.json
[2025-02-08 08:38:48,909 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts:
[2025-02-08 08:38:48,909 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts:
[2025-02-08 08:38:48,909 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3
[2025-02-08 08:38:48,909 module.py:2220 INFO] - Python version: 3.10.12
[2025-02-08 08:38:48,909 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh" from temporal script "tmp-run.sh" in "/home/mlcuser" ...
[2025-02-08 08:38:48,909 module.py:5340 INFO] - ! cd /home/mlcuser
[2025-02-08 08:38:48,909 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/validate_cache.sh from tmp-run.sh
[2025-02-08 08:38:48,988 module.py:5487 INFO] - ! call "detect_version" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-generic-python-lib/customize.py
Detected version: 1.20.1
[2025-02-08 08:38:48,993 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-generic-python-lib_77cd7fef
[2025-02-08 08:38:48,993 module.py:1228 DEBUG] - - Checking dynamic dependencies on other MLC scripts:
[2025-02-08 08:38:49,001 module.py:560 INFO] - * mlcr get,python3
[2025-02-08 08:38:49,001 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:49,001 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python3
[2025-02-08 08:38:49,002 module.py:667 DEBUG] - - Number of cached script outputs found: 1
[2025-02-08 08:38:49,002 module.py:816 DEBUG] - - Found script::get-python3, d0b5dd74373f4a62 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-python3
[2025-02-08 08:38:49,004 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:49,004 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python3,python,get-python,get-python3
[2025-02-08 08:38:49,005 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c
[2025-02-08 08:38:49,005 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:49,005 module.py:1261 DEBUG] - - Loading state from cached entry ...
[2025-02-08 08:38:49,005 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c/mlc-cached-state.json
[2025-02-08 08:38:49,005 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts:
[2025-02-08 08:38:49,005 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts:
[2025-02-08 08:38:49,006 module.py:2192 INFO] - - running time of script "get,python,python3,get-python,get-python3": 0.01 sec.
[2025-02-08 08:38:49,006 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3
[2025-02-08 08:38:49,006 module.py:2220 INFO] - Python version: 3.10.12
[2025-02-08 08:38:49,007 module.py:1238 DEBUG] - - Processing env after dependencies ...
[2025-02-08 08:38:49,007 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:49,007 module.py:1261 DEBUG] - - Loading state from cached entry ...
[2025-02-08 08:38:49,008 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-generic-python-lib_77cd7fef/mlc-cached-state.json
[2025-02-08 08:38:49,008 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts:
[2025-02-08 08:38:49,008 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts:
[2025-02-08 08:38:49,008 module.py:2192 INFO] - - running time of script "get,install,generic,pip-package,generic-python-lib": 0.17 sec.
[2025-02-08 08:38:49,033 module.py:560 INFO] - * mlcr get,ml-model,image-classification,resnet50,raw,_onnx,_fp32
[2025-02-08 08:38:49,033 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:49,033 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,ml-model,image-classification,resnet50,raw,_onnx,_fp32
[2025-02-08 08:38:49,033 module.py:667 DEBUG] - - Number of cached script outputs found: 1
[2025-02-08 08:38:49,033 module.py:816 DEBUG] - - Found script::get-ml-model-resnet50, 56203e4e998b4bc0 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-ml-model-resnet50
[2025-02-08 08:38:49,034 module.py:2412 DEBUG] - Prepared variations: _onnx,_fp32,_opset-11,_argmax
[2025-02-08 08:38:49,037 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:49,037 module.py:4856 DEBUG] - - Prepared explicit variations: _onnx,_fp32
[2025-02-08 08:38:49,037 module.py:4875 DEBUG] - - Prepared variations: _onnx,_fp32
[2025-02-08 08:38:49,037 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,ml-model,image-classification,resnet50,raw,ml-model-resnet50,_onnx,_fp32
[2025-02-08 08:38:49,038 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-ml-model-resnet50_3edcd59c
[2025-02-08 08:38:49,038 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:49,039 module.py:1261 DEBUG] - - Loading state from cached entry ...
[2025-02-08 08:38:49,040 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-ml-model-resnet50_3edcd59c/mlc-cached-state.json
[2025-02-08 08:38:49,040 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts:
[2025-02-08 08:38:49,040 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts:
[2025-02-08 08:38:49,041 module.py:2192 INFO] - - running time of script "get,raw,ml-model,resnet50,ml-model-resnet50,image-classification": 0.03 sec.
[2025-02-08 08:38:49,041 module.py:2220 INFO] - Path to the ML model: /home/mlcuser/MLC/repos/local/cache/download-file_e94966f8/resnet50_v1.onnx
[2025-02-08 08:38:49,067 module.py:560 INFO] - * mlcr get,dataset,image-classification,imagenet,preprocessed,_NCHW,_full
[2025-02-08 08:38:49,067 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:49,067 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,dataset,image-classification,imagenet,preprocessed,_NCHW,_full
[2025-02-08 08:38:49,067 module.py:667 DEBUG] - - Number of cached script outputs found: 0
[2025-02-08 08:38:49,067 module.py:816 DEBUG] - - Found script::get-preprocessed-dataset-imagenet, f259d490bbaf45f5 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-preprocessed-dataset-imagenet
[2025-02-08 08:38:49,067 module.py:2412 DEBUG] - Prepared variations: _NCHW,_full,_validation,_mlcommons-reference-preprocessor,_resolution.224
[2025-02-08 08:38:49,070 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:49,070 module.py:4856 DEBUG] - - Prepared explicit variations: _NCHW,_full
[2025-02-08 08:38:49,070 module.py:4875 DEBUG] - - Prepared variations: _NCHW,_full
[2025-02-08 08:38:49,070 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,dataset,image-classification,imagenet,preprocessed,ILSVRC,_NCHW,_full
[2025-02-08 08:38:49,070 module.py:1369 DEBUG] - - Creating new "cache" script artifact in the MLC local repository ...
[2025-02-08 08:38:49,070 module.py:1372 DEBUG] - - Tags: tmp,get,dataset,image-classification,imagenet,preprocessed,ILSVRC,_NCHW,_full,script-item-f259d490bbaf45f5
[2025-02-08 08:38:49,071 module.py:1400 DEBUG] - - Changing to /home/mlcuser/MLC/repos/local/cache/get-preprocessed-dataset-imagenet_34b7b905
[2025-02-08 08:38:49,072 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts:
[2025-02-08 08:38:49,080 module.py:560 INFO] - * mlcr get,python3
[2025-02-08 08:38:49,080 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:49,080 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python3
[2025-02-08 08:38:49,080 module.py:667 DEBUG] - - Number of cached script outputs found: 1
[2025-02-08 08:38:49,080 module.py:816 DEBUG] - - Found script::get-python3, d0b5dd74373f4a62 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-python3
[2025-02-08 08:38:49,081 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:49,081 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,python3,python,get-python,get-python3
[2025-02-08 08:38:49,081 module.py:1218 DEBUG] - - Found cached script output: /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c
[2025-02-08 08:38:49,081 module.py:1248 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:49,081 module.py:1261 DEBUG] - - Loading state from cached entry ...
[2025-02-08 08:38:49,081 module.py:1274 INFO] - ! load /home/mlcuser/MLC/repos/local/cache/get-python3_e43c688c/mlc-cached-state.json
[2025-02-08 08:38:49,081 module.py:1309 DEBUG] - - Checking posthook dependencies on other MLC scripts:
[2025-02-08 08:38:49,081 module.py:1322 DEBUG] - - Checking post dependencies on other MLC scripts:
[2025-02-08 08:38:49,082 module.py:2192 INFO] - - running time of script "get,python,python3,get-python,get-python3": 0.01 sec.
[2025-02-08 08:38:49,082 module.py:2220 INFO] - Path to Python: /home/mlcuser/venv/mlc/bin/python3
[2025-02-08 08:38:49,082 module.py:2220 INFO] - Python version: 3.10.12
[2025-02-08 08:38:49,093 module.py:560 INFO] - * mlcr get,dataset,image-classification,original,_full
[2025-02-08 08:38:49,093 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:49,093 module.py:654 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,dataset,image-classification,original,_full
[2025-02-08 08:38:49,093 module.py:667 DEBUG] - - Number of cached script outputs found: 0
[2025-02-08 08:38:49,093 module.py:816 DEBUG] - - Found script::get-dataset-imagenet-val, 7afd58d287fe4f11 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/get-dataset-imagenet-val
[2025-02-08 08:38:49,093 module.py:2412 DEBUG] - Prepared variations: _full,_2012
[2025-02-08 08:38:49,095 module.py:4823 DEBUG] - - Checking if script execution is already cached ...
[2025-02-08 08:38:49,095 module.py:4856 DEBUG] - - Prepared explicit variations: _full
[2025-02-08 08:38:49,095 module.py:4875 DEBUG] - - Prepared variations: _full
[2025-02-08 08:38:49,095 module.py:4914 DEBUG] - - Searching for cached script outputs with the following tags: -tmp,get,dataset,image-classification,original,val,validation,imagenet,ILSVRC,_full
[2025-02-08 08:38:49,096 module.py:1369 DEBUG] - - Creating new "cache" script artifact in the MLC local repository ...
[2025-02-08 08:38:49,096 module.py:1372 DEBUG] - - Tags: tmp,get,dataset,image-classification,original,val,validation,imagenet,ILSVRC,_full,script-item-7afd58d287fe4f11
[2025-02-08 08:38:49,096 module.py:1400 DEBUG] - - Changing to /home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-val_52100728
[2025-02-08 08:38:49,097 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts:
[2025-02-08 08:38:49,103 module.py:560 INFO] - * mlcr detect,os
[2025-02-08 08:38:49,103 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:49,103 module.py:816 DEBUG] - - Found script::detect-os, 863735b7db8c44fc in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os
[2025-02-08 08:38:49,105 module.py:1759 DEBUG] - - Running preprocess ...
[2025-02-08 08:38:49,109 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh" from temporal script "tmp-run.sh" in "/home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-val_52100728" ...
[2025-02-08 08:38:49,109 module.py:5340 INFO] - ! cd /home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-val_52100728
[2025-02-08 08:38:49,109 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-02-08 08:38:49,124 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-02-08 08:38:49,124 module.py:5550 DEBUG] - - Running postprocess ...
[2025-02-08 08:38:49,130 module.py:2192 INFO] - - running time of script "detect-os,detect,os,info": 0.03 sec.
[2025-02-08 08:38:49,131 module.py:1637 DEBUG] - - Processing env after dependencies ...
[2025-02-08 08:38:49,131 module.py:1759 DEBUG] - - Running preprocess ...
[2025-02-08 08:38:49,135 module.py:1838 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:49,148 module.py:560 INFO] - * mlcr download-and-extract,file,_extract,_url.https://image-net.org/data/ILSVRC/2012/ILSVRC2012_img_val.tar
[2025-02-08 08:38:49,148 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:49,149 module.py:816 DEBUG] - - Found script::download-and-extract, c67e81a4ce2649f5 in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/download-and-extract
[2025-02-08 08:38:49,149 module.py:2412 DEBUG] - Prepared variations: _extract,_url.https://image-net.org/data/ILSVRC/2012/ILSVRC2012_img_val.tar,_cmutil,_keep
[2025-02-08 08:38:49,166 module.py:1759 DEBUG] - - Running preprocess ...
[2025-02-08 08:38:49,171 module.py:1838 DEBUG] - - Checking prehook dependencies on other MLC scripts:
[2025-02-08 08:38:49,183 module.py:560 INFO] - * mlcr download,file,_cmutil,_url.https://image-net.org/data/ILSVRC/2012/ILSVRC2012_img_val.tar
[2025-02-08 08:38:49,183 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:49,184 module.py:816 DEBUG] - - Found script::download-file, 9cdc8dc41aae437e in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/download-file
[2025-02-08 08:38:49,184 module.py:2412 DEBUG] - Prepared variations: _cmutil,_url.https://image-net.org/data/ILSVRC/2012/ILSVRC2012_img_val.tar
[2025-02-08 08:38:49,188 module.py:1628 DEBUG] - - Checking dependencies on other MLC scripts:
[2025-02-08 08:38:49,195 module.py:560 INFO] - * mlcr detect,os
[2025-02-08 08:38:49,195 module.py:592 DEBUG] - - Number of scripts found: 1
[2025-02-08 08:38:49,195 module.py:816 DEBUG] - - Found script::detect-os, 863735b7db8c44fc in /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os
[2025-02-08 08:38:49,196 module.py:1759 DEBUG] - - Running preprocess ...
[2025-02-08 08:38:49,201 module.py:5333 DEBUG] - - Running native script "/home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh" from temporal script "tmp-run.sh" in "/home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-val_52100728" ...
[2025-02-08 08:38:49,201 module.py:5340 INFO] - ! cd /home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-val_52100728
[2025-02-08 08:38:49,201 module.py:5341 INFO] - ! call /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
[2025-02-08 08:38:49,219 module.py:5487 INFO] - ! call "postprocess" from /home/mlcuser/MLC/repos/mlcommons@mlperf-automations/script/detect-os/customize.py
[2025-02-08 08:38:49,219 module.py:5550 DEBUG] - - Running postprocess ...
[2025-02-08 08:38:49,228 module.py:2192 INFO] - - running time of script "detect-os,detect,os,info": 0.04 sec.
[2025-02-08 08:38:49,229 module.py:1637 DEBUG] - - Processing env after dependencies ...
[2025-02-08 08:38:49,230 module.py:1759 DEBUG] - - Running preprocess ...
Downloading from https://image-net.org/data/ILSVRC/2012/ILSVRC2012_img_val.tar
Downloading to /home/mlcuser/MLC/repos/local/cache/get-dataset-imagenet-val_52100728/ILSVRC2012_img_val.tar
Downloaded: 10%