
Conversation

@fgh1999 commented Aug 21, 2024

  • `self.offset` should not be counted into `self.length`.
  • If the `q` passed to `self_attention` is not reshaped again inside the per-layer loop of `forward`, its shape should be `(seq, n_kv_h * n_groups, dqkv)`.

Fix the shape comment of `q` in `self_attention`.
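
For reference, a minimal sketch of the shape relationship described above, assuming a Rust implementation and illustrative dimension values (in the real code these come from the model configuration); `seq`, `n_kv_h`, `n_groups`, and `dqkv` follow the names used in the comment:

```rust
// Illustrative values only; the actual dimensions come from the model config.
fn main() {
    let seq = 4;        // new tokens processed in this forward pass
    let n_kv_h = 2;     // number of key/value heads
    let n_groups = 4;   // query heads per key/value head
    let dqkv = 16;      // per-head dimension

    // If q is not reshaped again inside the per-layer loop of `forward`,
    // the shape comment on `self_attention` should read
    // (seq, n_kv_h * n_groups, dqkv) rather than a flattened 2-D shape.
    let q_shape = [seq, n_kv_h * n_groups, dqkv];

    // Element count still matches the output of the q projection for this step.
    assert_eq!(q_shape.iter().product::<usize>(), seq * n_kv_h * n_groups * dqkv);
    println!("expected q shape: {:?}", q_shape);
}
```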