@ppanchad-amd It seems VLLM_MOE_SHUFFLE has been removed from ROCm/vllm main.
Setting VLLM_MOE_PADDING to 1 still fails in ROCm/vllm.
It seems AMD has fixed the padding issue upstream in https://github.com/vllm-project/vllm/pull/14454/files.
The issue seems to come from these files in ROCm/vllm:
vllm/model_executor/layers/fused_moe/layer.py
vllm/model_executor/layers/fused_moe/fused_moe.py
Syncing these files up with the vLLM upstream should fix the padding issue.
However, it would still require input from your end as to whether https://github.com/ROCm/vllm/blob/main/vllm/model_executor/layers/fused_moe/fused_moe.py is still relevant.
If the vLLM upstream is the correct implementation, can this be fixed in ROCm/vllm main as well?
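For anyone following along, the general idea behind VLLM_MOE_PADDING is roughly this: pad the contiguous (last) dimension of the expert weights and keep the original logical shape as a strided view, so the fused MoE GEMM reads each row at a padded offset. The snippet below is only a toy sketch; the pad size, shapes, and helper name are illustrative assumptions, not the actual ROCm/vllm code.

```python
import torch
import torch.nn.functional as F

# Hypothetical pad size, chosen only for illustration.
PAD = 128

def pad_expert_weight(w: torch.Tensor, pad: int = PAD) -> torch.Tensor:
    """Pad the contiguous (last) dim of an expert weight, then slice it back.

    The returned tensor has the original logical shape but the row stride of
    the padded buffer, so each row starts at a padded offset; the intent of
    this kind of padding is friendlier memory alignment for the MoE GEMMs.
    """
    padded = F.pad(w, (0, pad), mode="constant", value=0)
    return padded[..., :-pad]  # same shape as `w`, stride of the padded buffer

# Toy shapes: [num_experts, intermediate, hidden]
w13 = torch.randn(4, 64, 256, dtype=torch.float16)
w13_view = pad_expert_weight(w13)
print(w13.shape, w13.stride(), "->", w13_view.shape, w13_view.stride())
```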
Your current environment
The output of `python collect_env.py`
Model Input Dumps
No response
🐛 Describe the bug
When running the following command with VLLM_MOE_PADDING=1 and VLLM_MOE_SHUFFLING=1 enabled,
NCCL_MIN_NCHANNELS=112 RAY_EXPERIMENTAL_NOSET_ROCR_VISIBLE_DEVICES=1 TRITON_HIP_USE_NEW_STREAM_PIPELINE=1 VLLM_MOE_PADDING=1 VLLM_MOE_SHUFFLING=1 HIP_FORCE_DEV_KERNARG=1 TORCH_BLAS_PREFER_HIPBLASLT=1 VLLM_SCHED_PREFILL_COUNT=0 VLLM_USE_ROCM_CUSTOM_PAGED_ATTN=1 VLLM_USE_TRITON_FLASH_ATTN=0 HIP_VISIBLE_DEVICES=7 HF_TOKEN= vllm serve mistralai/Mixtral-8x7B-Instruct-v0.1 -tp 1 --quantization fp8 --kv_cache_dtype fp8_e4m3
it throws the following error trace (only a partial snippet of the error trace is included).
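In case it helps with reproducing outside the API server, below is a minimal offline sketch of an equivalent setup, with the MoE env vars copied from the command above and set before importing vLLM; whether this hits exactly the same code path as `vllm serve` is an assumption on my part.

```python
import os

# Same toggles as the serve command above; set before importing vLLM so that
# vllm.envs picks them up (assumption: the offline path honours them too).
os.environ["VLLM_MOE_PADDING"] = "1"
os.environ["VLLM_MOE_SHUFFLING"] = "1"
os.environ["HIP_VISIBLE_DEVICES"] = "7"

from vllm import LLM, SamplingParams  # noqa: E402

# Offline repro of the same model / quantization configuration.
llm = LLM(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    tensor_parallel_size=1,
    quantization="fp8",
    kv_cache_dtype="fp8_e4m3",
)
print(llm.generate(["Hello"], SamplingParams(max_tokens=8)))
```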
Before submitting a new issue...
Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.