This repository was archived by the owner on Sep 23, 2025. It is now read-only.

Commit 7b16ced

fixed benchmark error after removing HF token from build log (#259)
Signed-off-by: Jiafu Zhang <[email protected]>
1 parent cd99682 · commit 7b16ced

File tree

1 file changed: +0 -18 lines changed

.github/workflows/workflow_test_benchmark.yml

Lines changed: 0 additions & 18 deletions
@@ -92,24 +92,6 @@ jobs:
           TARGET=${{steps.target.outputs.target}}
           # Additional libraries required for pytest
           docker exec "${TARGET}" bash -c "pip install -r tests/requirements.txt"
-          CMD=$(cat << EOF
-          import yaml
-          conf_path = "llm_on_ray/inference/models/llama-2-7b-chat-hf.yaml"
-          with open(conf_path, encoding="utf-8") as reader:
-              result = yaml.load(reader, Loader=yaml.FullLoader)
-          result['model_description']["config"]["use_auth_token"] = "${{ env.HF_ACCESS_TOKEN }}"
-          with open(conf_path, 'w') as output:
-              yaml.dump(result, output, sort_keys=False)
-          conf_path = "llm_on_ray/inference/models/vllm/llama-2-7b-chat-hf-vllm.yaml"
-          with open(conf_path, encoding="utf-8") as reader:
-              result = yaml.load(reader, Loader=yaml.FullLoader)
-          result['model_description']["config"]["use_auth_token"] = "${{ env.HF_ACCESS_TOKEN }}"
-          with open(conf_path, 'w') as output:
-              yaml.dump(result, output, sort_keys=False)
-          EOF
-          )
-          docker exec "${TARGET}" python -c "$CMD"
-          docker exec "${TARGET}" bash -c "huggingface-cli login --token ${{ env.HF_ACCESS_TOKEN }}"
           docker exec "${TARGET}" bash -c "./tests/run-tests-benchmark.sh"
       - name: Stop Ray
         run: |
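
For readability, the deleted heredoc is restated below as a standalone Python sketch. It only illustrates what the removed step did: load each model YAML, set model_description.config.use_auth_token, and write the file back. The file paths and keys come from the deleted lines; reading HF_ACCESS_TOKEN from an environment variable (rather than interpolating ${{ env.HF_ACCESS_TOKEN }} into the command) is an assumption made here so the token value does not appear on the command line, and is not part of the original workflow.

# Hypothetical standalone restatement of the deleted inline script.
# Paths and YAML keys come from the removed heredoc; reading the token from
# the HF_ACCESS_TOKEN environment variable is an assumption, not the original
# workflow's behavior (it interpolated the token directly into the script).
import os
import yaml

conf_paths = [
    "llm_on_ray/inference/models/llama-2-7b-chat-hf.yaml",
    "llm_on_ray/inference/models/vllm/llama-2-7b-chat-hf-vllm.yaml",
]
token = os.environ["HF_ACCESS_TOKEN"]

for conf_path in conf_paths:
    with open(conf_path, encoding="utf-8") as reader:
        result = yaml.load(reader, Loader=yaml.FullLoader)
    # Inject the Hugging Face token so gated Llama 2 checkpoints can be pulled.
    result["model_description"]["config"]["use_auth_token"] = token
    with open(conf_path, "w") as output:
        yaml.dump(result, output, sort_keys=False)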
