2 files changed: +1 −1 lines changed

@@ -408,4 +408,3 @@ accuracy/test_llm_api_autodeploy.py::TestLlama3_1_8B::test_auto_dtype[False-4] S
 accuracy/test_llm_api_autodeploy.py::TestLlama3_1_8B::test_auto_dtype[False-2] SKIP (https://nvbugs/5680312, https://nvbugs/5636912)
 unittest/_torch/auto_deploy/unit/multigpu/test_ad_build_small_multi.py::test_build_ad[meta-llama/Meta-Llama-3.1-8B-Instruct-llm_extra_args0-2] SKIP (https://nvbugs/5680755)
 examples/test_ray.py::test_ray_disaggregated_serving[tp2] SKIP (https://nvbugs/5683039)
-full:H100_PCIe/unittest/llmapi/test_llm_pytorch.py::test_llama_7b_multi_lora_evict_and_reload_lora_gpu_cache SKIP (https://nvbugs/5682551)
@@ -360,6 +360,7 @@ def _check_llama_7b_multi_lora_evict_load_new_adapters(
 
 
 @skip_gpu_memory_less_than_40gb
+@skip_ray  # https://nvbugs/5682551
 def test_llama_7b_multi_lora_evict_and_reload_lora_gpu_cache():
     """Test eviction and re-loading a previously evicted adapter from the LoRA GPU cache, within a single
     llm.generate call, that's repeated twice.
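The `@skip_ray` decorator added in the hunk above follows the usual conditional-skip pattern for test suites. A minimal stdlib sketch of that pattern is below; the environment variable name `TLLM_RAY_EXECUTOR` and the helper's exact trigger condition are assumptions for illustration, not taken from the repository (in a pytest suite, `pytest.mark.skipif` plays the same role).

```python
import os
import unittest

# Hypothetical conditional-skip decorator: skip the test when the suite
# runs under the Ray orchestrator. The TLLM_RAY_EXECUTOR flag name is an
# assumption; the real skip_ray helper may key off something else.
skip_ray = unittest.skipIf(
    os.environ.get("TLLM_RAY_EXECUTOR", "0") == "1",
    "Skipped under Ray orchestrator (https://nvbugs/5682551)",
)


@skip_ray
def test_llama_7b_multi_lora_example():
    # When the flag is unset, the decorator is a no-op and the test
    # body runs normally; when set, the runner records a skip instead.
    return "ran"
```

Waiving a test this way keeps the skip next to the test definition (with the tracking bug in the reason string), instead of carrying it in a separate hardware-specific waive list like the `full:H100_PCIe/...` entry removed in the first hunk.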