
Script execution failure during submission of results of offline scenario for ResNet50 #206

Open
KathpaliaChirag opened this issue Feb 10, 2025 · 3 comments
Labels
bug Something isn't working mlc-on-windows

Comments

@KathpaliaChirag

Steps I followed:
I ran `mlcr run-mlperf,inference,_r5.0-dev --model=resnet50 --implementation=reference --framework=onnxruntime --category=edge --scenario=Offline --execution_mode=valid --device=cpu --quiet --use_dataset_from_user=yes --env.MLC_GET_PLATFORM_DETAILS=no --adr.loadgen.tags=_from-pip --pip_loadgen=yes` and got results.

[screenshot: run results]

But now, when I try to generate the submission, I get an error.

Steps I followed for the submission:

  1. Forked this repository: https://github.com/gateOverflow/mlperf_inference_submissions_v5.0/

  2. Ran the submission generation command (this is where the error occurs):

mlc run script --tags=generate,inference,submission --clean --preprocess_submission=yes --run-checker=yes --submitter=GATEOverflow --division=closed --env.CM_DETERMINE_MEMORY_CONFIGURATION=yes --quiet --hw_name="LENOVO IDEAPAD GAMING"

This gave an error. I then retried with the extra tags I had added during the original run:

mlc run script --tags=generate,inference,submission --clean --preprocess_submission=yes --run-checker=yes --submitter=GATEOverflow --division=closed --env.CM_DETERMINE_MEMORY_CONFIGURATION=yes --quiet --hw_name="LENOVO IDEAPAD GAMING" --use_dataset_from_user=yes --env.MLC_GET_PLATFORM_DETAILS=no --adr.loadgen.tags=_from-pip --pip_loadgen=yes

Both gave the same error; the log is attached below.

chira@Sassy MINGW64 ~/OneDrive/Desktop/mlcnew (master)
$ mlc run script --tags=generate,inference,submission --clean --preprocess_submission=yes --run-checker=yes --submitter=GATEOverflow --division=closed --env.CM_DETERMINE_MEMORY_CONFIGURATION=yes --quiet --hw_name="LENOVO IDEAPAD GAMING"
[2025-02-10 10:29:51,823 module.py:560 INFO] - * mlcr generate,inference,submission
[2025-02-10 10:29:51,827 module.py:560 INFO] - * mlcr get,python3
[2025-02-10 10:29:51,828 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-python3_1e633417\mlc-cached-state.json
[2025-02-10 10:29:51,830 module.py:2220 INFO] - Path to Python: C:\Users\chira\AppData\Local\Programs\Python\Python310\python.exe
[2025-02-10 10:29:51,830 module.py:2220 INFO] - Python version: 3.10.0
[2025-02-10 10:29:51,841 module.py:560 INFO] - * mlcr mlcommons,inference,src
[2025-02-10 10:29:51,844 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-mlperf-inference-src_a6b8600f\mlc-cached-state.json
[2025-02-10 10:29:51,849 module.py:560 INFO] - * mlcr get,sut,system-description
[2025-02-10 10:29:51,853 module.py:560 INFO] - * mlcr detect,os
[2025-02-10 10:29:51,857 module.py:5340 INFO] - ! cd C:\Users\chira\OneDrive\Desktop\mlcnew
[2025-02-10 10:29:51,857 module.py:5341 INFO] - ! call C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\detect-os\run.bat from tmp-run.bat
[2025-02-10 10:29:51,889 module.py:5487 INFO] - ! call "postprocess" from C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\detect-os\customize.py
[2025-02-10 10:29:51,895 module.py:560 INFO] - * mlcr get,sys-utils-min
[2025-02-10 10:29:51,896 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-sys-utils-min_1d963fb9\mlc-cached-state.json
[2025-02-10 10:29:51,901 module.py:560 INFO] - * mlcr detect,cpu
[2025-02-10 10:29:51,906 module.py:560 INFO] - * mlcr detect,os
[2025-02-10 10:29:51,913 module.py:5340 INFO] - ! cd C:\Users\chira\OneDrive\Desktop\mlcnew
[2025-02-10 10:29:51,913 module.py:5341 INFO] - ! call C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\detect-os\run.bat from tmp-run.bat
[2025-02-10 10:29:51,942 module.py:5487 INFO] - ! call "postprocess" from C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\detect-os\customize.py
[2025-02-10 10:29:51,949 module.py:560 INFO] - * mlcr get,sys-utils-min
[2025-02-10 10:29:51,950 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-sys-utils-min_1d963fb9\mlc-cached-state.json
[2025-02-10 10:29:51,954 module.py:5340 INFO] - ! cd C:\Users\chira\OneDrive\Desktop\mlcnew
[2025-02-10 10:29:51,954 module.py:5341 INFO] - ! call C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\detect-cpu\run.bat from tmp-run.bat
[2025-02-10 10:29:53,222 module.py:5487 INFO] - ! call "postprocess" from C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\detect-cpu\customize.py
WARNING: tmp-systeminfo.csv file was not generated!
[2025-02-10 10:29:53,227 customize.py:90 WARNING] - WARNING: problem processing file tmp-wmic-cpu.csv (list index out of range)!
[2025-02-10 10:29:53,227 customize.py:98 WARNING] - WARNING: need to unify system and cpu output on Windows
[2025-02-10 10:29:53,233 module.py:560 INFO] - * mlcr get,python3
[2025-02-10 10:29:53,235 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-python3_1e633417\mlc-cached-state.json
[2025-02-10 10:29:53,235 module.py:2220 INFO] - Path to Python: C:\Users\chira\AppData\Local\Programs\Python\Python310\python.exe
[2025-02-10 10:29:53,235 module.py:2220 INFO] - Python version: 3.10.0
[2025-02-10 10:29:53,242 module.py:560 INFO] - * mlcr get,compiler
[2025-02-10 10:29:53,380 module.py:4107 INFO] - * C:\Program Files\LLVM\bin\clang.exe
[2025-02-10 10:29:53,382 module.py:5340 INFO] - ! cd C:\Users\chira\MLC\repos\local\cache\get-llvm_f3265103
[2025-02-10 10:29:53,382 module.py:5341 INFO] - ! call C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\get-llvm\run.bat from tmp-run.bat
'C:\Program' is not recognized as an internal or external command,
operable program or batch file.
[2025-02-10 10:29:53,426 module.py:560 INFO] - * mlcr install,llvm
[2025-02-10 10:29:53,428 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\install-llvm-prebuilt_cd49173d\mlc-cached-state.json
[2025-02-10 10:29:53,429 module.py:5340 INFO] - ! cd C:\Users\chira\MLC\repos\local\cache\get-llvm_f3265103
[2025-02-10 10:29:53,429 module.py:5341 INFO] - ! call C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\get-llvm\run.bat from tmp-run.bat
The system cannot find the path specified.
Traceback (most recent call last):
File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 86, in _run_code
exec(code, run_globals)
File "C:\Users\chira\AppData\Local\Programs\Python\Python310\Scripts\mlc.exe\__main__.py", line 7, in <module>
File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\site-packages\mlc\main.py", line 1487, in main
res = method(run_args)
File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\site-packages\mlc\main.py", line 1250, in run
return self.call_script_module_function("run", run_args)
File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\site-packages\mlc\main.py", line 1230, in call_script_module_function
result = automation_instance.run(run_args) # Pass args to the run method
File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\automation\script\module.py", line 225, in run
r = self._run(i)
File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\automation\script\module.py", line 1631, in _run
r = self._call_run_deps(deps, self.local_env_keys, local_env_keys_from_meta, env, state, const, const_state, add_deps_recursive,
File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\automation\script\module.py", line 3538, in _call_run_deps
r = script._run_deps(deps, local_env_keys, env, state, const, const_state, add_deps_recursive, recursion_spaces,
File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\automation\script\module.py", line 3708, in _run_deps
r = self.action_object.access(ii)
File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\site-packages\mlc\main.py", line 96, in access
result = method(self, options)
File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\site-packages\mlc\main.py", line 1250, in run
return self.call_script_module_function("run", run_args)
File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\site-packages\mlc\main.py", line 1230, in call_script_module_function
result = automation_instance.run(run_args) # Pass args to the run method
File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\automation\script\module.py", line 225, in run
r = self._run(i)
File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\automation\script\module.py", line 1631, in _run
r = self._call_run_deps(deps, self.local_env_keys, local_env_keys_from_meta, env, state, const, const_state, add_deps_recursive,
File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\automation\script\module.py", line 3538, in _call_run_deps
r = script._run_deps(deps, local_env_keys, env, state, const, const_state, add_deps_recursive, recursion_spaces,
File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\automation\script\module.py", line 3708, in _run_deps
r = self.action_object.access(ii)
File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\site-packages\mlc\main.py", line 96, in access
result = method(self, options)
File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\site-packages\mlc\main.py", line 1250, in run
return self.call_script_module_function("run", run_args)
File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\site-packages\mlc\main.py", line 1240, in call_script_module_function
raise ScriptExecutionError(f"Script {function_name} execution failed. Error : {error}")
mlc.main.ScriptExecutionError: Script run execution failed. Error : MLC script failed (name = get-llvm, return code = 3)

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Please file an issue at https://github.com/mlcommons/mlperf-automations/issues along with the full MLC command being run and the relevant
or full console log.

chira@Sassy MINGW64 ~/OneDrive/Desktop/mlcnew (master)
$


@KathpaliaChirag KathpaliaChirag marked this as a duplicate of #207 Feb 10, 2025
@arjunsuresh
Collaborator

This issue is coming from the get-llvm script. The script doesn't seem to support spaces in the file path, so the "Program Files" folder is causing the failure. Can you please check whether the change below fixes it? (You can run mlc pull repo and then rerun the submission generation command.)
https://github.com/mlcommons/mlperf-automations/pull/209/files

If not, you can add this option --env.MLC_MLPERF_INFERENCE_LOADGEN_INSTALL_FROM_PIP=yes to the submission generation command.
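For reference, the `'C:\Program' is not recognized` line in the log is the classic symptom of an unquoted Windows path containing a space: cmd.exe splits the command at the first space and tries to run just `C:\Program`. A small illustration (the clang path is an example, not verified against this machine):

```python
import subprocess

# Example path with a space, mirroring the one in the log above.
clang = r"C:\Program Files\LLVM\bin\clang.exe"

# What an unquoted batch line effectively executes: the command is split
# at whitespace, so only the first token is treated as the program.
naive_tokens = (clang + " --version").split()
print(naive_tokens[0])   # C:\Program  <- not a valid command

# Quoting the executable keeps the path intact; list2cmdline applies the
# Windows quoting rules for arguments that contain spaces.
quoted = subprocess.list2cmdline([clang, "--version"])
print(quoted)            # "C:\Program Files\LLVM\bin\clang.exe" --version
```

This is why the linked fix (quoting the path inside run.bat) resolves the error.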

@KathpaliaChirag
Author

Thank you, the above issue is resolved after mlc pull repo, but there is a new issue now:

FileNotFoundError: [Errno 2] No such file or directory: 'C:\Users\chira\MLC\repos\local\cache\get-mlperf-inference-submission-dir_4a367a07\mlperf-inference-submission\closed\GATEOverflow\measurements\LENOVO IDEAPAD GAMING-reference-cpu-onnxruntime_v1.20.1-default_config\resnet50\offline\LENOVO IDEAPAD GAMING-reference-cpu-onnxruntime_v1.20.1-default_config.json'

I believe it is a naming issue. When I tracked down the folder where the file was not found, it contained only a file named user.conf.

Here is the path I tracked: C:\Users\chira\MLC\repos\local\cache\get-mlperf-inference-submission-dir_4a367a07\mlperf-inference-submission\closed\GATEOverflow\measurements\LENOVO IDEAPAD GAMING-reference-cpu-onnxruntime_v1.20.1-default_config\resnet50\offline

Screenshot of what is in that folder:

[screenshot: folder contents]

Here is the command I tried: mlc run script --tags=generate,inference,submission --clean --preprocess_submission=yes --run-checker=yes --submitter=GATEOverflow --division=closed --env.CM_DETERMINE_MEMORY_CONFIGURATION=yes --quiet --hw_name="LENOVO IDEAPAD GAMING" --use_dataset_from_user=yes --env.MLC_GET_PLATFORM_DETAILS=no --adr.loadgen.tags=_from-pip --pip_loadgen=yes --env.MLC_MLPERF_INFERENCE_LOADGEN_INSTALL_FROM_PIP=yes

It gives the same error both with and without --env.MLC_MLPERF_INFERENCE_LOADGEN_INSTALL_FROM_PIP=yes.

Here is the error log:
chira@Sassy MINGW64 ~/OneDrive/Desktop/mlcnew (master)
$ mlc run script --tags=generate,inference,submission --clean --preprocess_submission=yes --run-checker=yes --submitter=GATEOverflow --division=closed --env.CM_DETERMINE_MEMORY_CONFIGURATION=yes --quiet --hw_name="LENOVO IDEAPAD GAMING" --use_dataset_from_user=yes --env.MLC_GET_PLATFORM_DETAILS=no --adr.loadgen.tags=_from-pip --pip_loadgen=yes --env.MLC_MLPERF_INFERENCE_LOADGEN_INSTALL_FROM_PIP=yes
[2025-02-11 14:22:29,039 module.py:560 INFO] - * mlcr generate,inference,submission
[2025-02-11 14:22:29,046 module.py:560 INFO] - * mlcr get,python3
[2025-02-11 14:22:29,048 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-python3_1e633417\mlc-cached-state.json
[2025-02-11 14:22:29,048 module.py:2220 INFO] - Path to Python: C:\Users\chira\AppData\Local\Programs\Python\Python310\python.exe
[2025-02-11 14:22:29,049 module.py:2220 INFO] - Python version: 3.10.0
[2025-02-11 14:22:29,063 module.py:560 INFO] - * mlcr mlcommons,inference,src
[2025-02-11 14:22:29,066 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-mlperf-inference-src_14844856\mlc-cached-state.json
[2025-02-11 14:22:29,073 module.py:560 INFO] - * mlcr get,sut,system-description
[2025-02-11 14:22:29,078 module.py:560 INFO] - * mlcr detect,os
[2025-02-11 14:22:29,083 module.py:5334 INFO] - ! cd C:\Users\chira\OneDrive\Desktop\mlcnew
[2025-02-11 14:22:29,084 module.py:5335 INFO] - ! call C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\detect-os\run.bat from tmp-run.bat
[2025-02-11 14:22:29,131 module.py:5481 INFO] - ! call "postprocess" from C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\detect-os\customize.py
[2025-02-11 14:22:29,139 module.py:560 INFO] - * mlcr get,sys-utils-min
[2025-02-11 14:22:29,141 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-sys-utils-min_1d963fb9\mlc-cached-state.json
[2025-02-11 14:22:29,149 module.py:560 INFO] - * mlcr detect,cpu
[2025-02-11 14:22:29,156 module.py:560 INFO] - * mlcr detect,os
[2025-02-11 14:22:29,167 module.py:5334 INFO] - ! cd C:\Users\chira\OneDrive\Desktop\mlcnew
[2025-02-11 14:22:29,167 module.py:5335 INFO] - ! call C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\detect-os\run.bat from tmp-run.bat
[2025-02-11 14:22:29,202 module.py:5481 INFO] - ! call "postprocess" from C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\detect-os\customize.py
[2025-02-11 14:22:29,211 module.py:560 INFO] - * mlcr get,sys-utils-min
[2025-02-11 14:22:29,212 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-sys-utils-min_1d963fb9\mlc-cached-state.json
[2025-02-11 14:22:29,218 module.py:5334 INFO] - ! cd C:\Users\chira\OneDrive\Desktop\mlcnew
[2025-02-11 14:22:29,218 module.py:5335 INFO] - ! call C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\detect-cpu\run.bat from tmp-run.bat
[2025-02-11 14:22:30,531 module.py:5481 INFO] - ! call "postprocess" from C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\detect-cpu\customize.py
WARNING: tmp-systeminfo.csv file was not generated!
[2025-02-11 14:22:30,544 customize.py:90 WARNING] - WARNING: problem processing file tmp-wmic-cpu.csv (list index out of range)!
[2025-02-11 14:22:30,544 customize.py:98 WARNING] - WARNING: need to unify system and cpu output on Windows
[2025-02-11 14:22:30,552 module.py:560 INFO] - * mlcr get,python3
[2025-02-11 14:22:30,553 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-python3_1e633417\mlc-cached-state.json
[2025-02-11 14:22:30,553 module.py:2220 INFO] - Path to Python: C:\Users\chira\AppData\Local\Programs\Python\Python310\python.exe
[2025-02-11 14:22:30,555 module.py:2220 INFO] - Python version: 3.10.0
[2025-02-11 14:22:30,614 module.py:560 INFO] - * mlcr get,generic-python-lib,_package.dmiparser
[2025-02-11 14:22:30,622 module.py:560 INFO] - * mlcr get,python3
[2025-02-11 14:22:30,623 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-python3_1e633417\mlc-cached-state.json
[2025-02-11 14:22:30,623 module.py:2220 INFO] - Path to Python: C:\Users\chira\AppData\Local\Programs\Python\Python310\python.exe
[2025-02-11 14:22:30,623 module.py:2220 INFO] - Python version: 3.10.0
[2025-02-11 14:22:30,623 module.py:5334 INFO] - ! cd C:\Users\chira\OneDrive\Desktop\mlcnew
[2025-02-11 14:22:30,623 module.py:5335 INFO] - ! call C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\get-generic-python-lib\validate_cache.bat from tmp-run.bat
[2025-02-11 14:22:30,775 module.py:5481 INFO] - ! call "detect_version" from C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\get-generic-python-lib\customize.py
Detected version: 5.1
[2025-02-11 14:22:30,786 module.py:560 INFO] - * mlcr get,python3
[2025-02-11 14:22:30,786 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-python3_1e633417\mlc-cached-state.json
[2025-02-11 14:22:30,786 module.py:2220 INFO] - Path to Python: C:\Users\chira\AppData\Local\Programs\Python\Python310\python.exe
[2025-02-11 14:22:30,786 module.py:2220 INFO] - Python version: 3.10.0
[2025-02-11 14:22:30,786 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-generic-python-lib_965a3639\mlc-cached-state.json
[2025-02-11 14:22:30,794 module.py:560 INFO] - * mlcr get,cache,dir,_name.mlperf-inference-sut-descriptions
[2025-02-11 14:22:30,794 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-cache-dir_f3dfe89a\mlc-cached-state.json
Generating SUT description file for LENOVO IDEAPAD GAMING
[2025-02-11 14:22:30,810 module.py:5481 INFO] - ! call "postprocess" from C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\get-mlperf-inference-sut-description\customize.py
[2025-02-11 14:22:30,818 module.py:560 INFO] - * mlcr install,pip-package,for-mlc-python,_package.tabulate
[2025-02-11 14:22:30,818 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\install-pip-package-for-mlc-python_18da3771\mlc-cached-state.json
[2025-02-11 14:22:30,826 module.py:560 INFO] - * mlcr get,mlperf,inference,utils
[2025-02-11 14:22:30,843 module.py:560 INFO] - * mlcr get,mlperf,inference,src
[2025-02-11 14:22:30,843 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-mlperf-inference-src_14844856\mlc-cached-state.json
[2025-02-11 14:22:30,851 module.py:5481 INFO] - ! call "postprocess" from C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\get-mlperf-inference-utils\customize.py
[2025-02-11 14:22:30,860 module.py:560 INFO] - * mlcr get,mlperf,results,dir,local
[2025-02-11 14:22:30,860 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-mlperf-inference-results-dir_e5877e03\mlc-cached-state.json
[2025-02-11 14:22:30,866 module.py:560 INFO] - * mlcr get,mlperf,submission,dir
[2025-02-11 14:22:30,866 module.py:1274 INFO] - ! load C:\Users\chira\MLC\repos\local\cache\get-mlperf-inference-submission-dir_4a367a07\mlc-cached-state.json
[2025-02-11 14:22:30,893 module.py:5481 INFO] - ! call "postprocess" from C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\generate-mlperf-inference-submission\customize.py

Cleaning C:\Users\chira\MLC\repos\local\cache\get-mlperf-inference-submission-dir_4a367a07\mlperf-inference-submission ...

  • MLPerf inference submission dir: C:\Users\chira\MLC\repos\local\cache\get-mlperf-inference-submission-dir_4a367a07\mlperf-inference-submission
  • MLPerf inference results dir: C:\Users\chira\MLC\repos\local\cache\get-mlperf-inference-results-dir_e5877e03\valid_results
  • MLPerf inference division: closed
  • MLPerf inference submitter: GATEOverflow
    sut info completely filled from C:\Users\chira\MLC\repos\local\cache\get-mlperf-inference-results-dir_e5877e03\valid_results\Sassy-reference-cpu-onnxruntime-v1.20.1-default_config\mlc-sut-info.json!
    The SUT folder name for submission generation is: LENOVO IDEAPAD GAMING-reference-cpu-onnxruntime_v1.20.1-default_config
  • MLPerf inference model: resnet50
    Traceback (most recent call last):
    File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
    File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
    File "C:\Users\chira\AppData\Local\Programs\Python\Python310\Scripts\mlc.exe\__main__.py", line 7, in <module>
    File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\site-packages\mlc\main.py", line 1487, in main
    res = method(run_args)
    File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\site-packages\mlc\main.py", line 1250, in run
    return self.call_script_module_function("run", run_args)
    File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\site-packages\mlc\main.py", line 1230, in call_script_module_function
    result = automation_instance.run(run_args) # Pass args to the run method
    File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\automation\script\module.py", line 225, in run
    r = self._run(i)
    File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\automation\script\module.py", line 1858, in _run
    r = prepare_and_run_script_with_postprocessing(
    File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\automation\script\module.py", line 5495, in prepare_and_run_script_with_postprocessing
    rr = run_postprocess(customize_code, customize_common_input, recursion_spaces, env, state, const,
    File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\automation\script\module.py", line 5562, in run_postprocess
    r = customize_code.postprocess(ii)
    File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\generate-mlperf-inference-submission\customize.py", line 742, in postprocess
    r = generate_submission(env, state, inp, submission_division)
    File "C:\Users\chira\MLC\repos\mlcommons@mlperf-automations\script\generate-mlperf-inference-submission\customize.py", line 537, in generate_submission
    shutil.copy(
    File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\shutil.py", line 417, in copy
    copyfile(src, dst, follow_symlinks=follow_symlinks)
    File "C:\Users\chira\AppData\Local\Programs\Python\Python310\lib\shutil.py", line 256, in copyfile
    with open(dst, 'wb') as fdst:
    FileNotFoundError: [Errno 2] No such file or directory: 'C:\Users\chira\MLC\repos\local\cache\get-mlperf-inference-submission-dir_4a367a07\mlperf-inference-submission\closed\GATEOverflow\measurements\LENOVO IDEAPAD GAMING-reference-cpu-onnxruntime_v1.20.1-default_config\resnet50\offline\LENOVO IDEAPAD GAMING-reference-cpu-onnxruntime_v1.20.1-default_config.json'
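For what it's worth, the traceback matches how shutil.copy behaves when the destination's parent directories do not exist: it opens the destination for writing without creating intermediate folders, so a missing directory surfaces as FileNotFoundError. A minimal, self-contained sketch (the temporary paths and file names are placeholders for the real submission tree):

```python
import os
import shutil
import tempfile

# Reproduce the failure mode: shutil.copy raises FileNotFoundError when
# the destination *directory* is missing; it never creates parent folders.
base = tempfile.mkdtemp()
src = os.path.join(base, "sut-config.json")   # placeholder source file
with open(src, "w") as f:
    f.write("{}")

# Destination nested under directories that do not exist yet:
dst = os.path.join(base, "measurements", "resnet50", "offline", "sut-config.json")
try:
    shutil.copy(src, dst)                     # raises FileNotFoundError
except FileNotFoundError:
    os.makedirs(os.path.dirname(dst), exist_ok=True)  # create parents first
    shutil.copy(src, dst)                     # succeeds once the dirs exist

print(os.path.exists(dst))                    # True
```

So the error is consistent with the generation script expecting a measurements folder (or JSON file) that was never created, rather than with the copy call itself being wrong.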

@arjunsuresh
Collaborator

Not exactly sure why this would have happened, but can you please do

mlc pull repo

and retry the submission generation step?

@arjunsuresh arjunsuresh added the bug Something isn't working label Feb 11, 2025