Commit: polish code
feifeibear committed Dec 26, 2024
1 parent 45af87e commit d7b4a69
Showing 1 changed file with 0 additions and 3 deletions.
xfuser/core/long_ctx_attention/ring/ring_flash_attn.py
@@ -11,10 +11,7 @@
 except ImportError:
     flash_attn = None
     _flash_attn_forward = None
-<<<<<<< HEAD
 from yunchang.kernels.attention import pytorch_attn_forward
-=======
->>>>>>> main
 
 def xdit_ring_flash_attn_forward(
     process_group,
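The hunk's context lines show the guarded-import pattern this file relies on: `flash_attn` is imported inside a try/except, its names are set to `None` on `ImportError`, and a pure-PyTorch fallback (`pytorch_attn_forward` from yunchang) is used instead. A minimal sketch of that pattern, with a hypothetical module and fallback (these names are illustrative, not from xfuser):

```python
# Optional-dependency guard: prefer the fast kernel when installed,
# otherwise leave the name as None so callers can branch at runtime.
try:
    from some_fast_kernels import fast_forward  # hypothetical optional dependency
except ImportError:
    fast_forward = None

def forward(xs):
    """Use the fast path when available, else a plain-Python fallback."""
    if fast_forward is not None:
        return fast_forward(xs)
    # Stand-in fallback computation for illustration.
    return [x * 2 for x in xs]

print(forward([1, 2, 3]))  # prints [2, 4, 6] when the fast kernel is absent
```

Keeping the fallback behind a single `None` check is what makes the commit's cleanup safe: resolving the conflict only had to delete the stray markers, not touch the import logic itself.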
