Within MultiHeadedAttention, the ass_mask is not passed into the attention method here and appears to be unused. IIUC, the attention mask is necessary to prevent look-ahead bias in the attention mechanism and should mask off future values when attention is calculated.
If this mask is unused, what was its intent? Where is attention being masked, and how should the mask be applied?
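For context, this is the standard way a causal (subsequent) mask prevents look-ahead bias in scaled dot-product attention. The repository's actual code is not shown in this thread, so the function and variable names below are illustrative, not the project's API; a minimal NumPy sketch:

```python
import numpy as np

def attention(query, key, value, mask=None):
    """Scaled dot-product attention with an optional boolean mask.

    mask: True where attending is allowed, False where it is forbidden.
    Returns (output, attention_weights).
    """
    d_k = query.shape[-1]
    scores = query @ key.swapaxes(-2, -1) / np.sqrt(d_k)
    if mask is not None:
        # Forbidden positions get a large negative score so that
        # softmax assigns them effectively zero weight.
        scores = np.where(mask, scores, -1e9)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ value, weights

def subsequent_mask(size):
    # Lower-triangular mask: position i may attend only to positions <= i,
    # which is what prevents look-ahead bias in a decoder.
    return np.tril(np.ones((size, size), dtype=bool))

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
out, weights = attention(x, x, x, mask=subsequent_mask(4))
```

With the mask applied, each row of `weights` puts zero mass on future positions, so the output at step *i* depends only on steps up to *i*.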
This mask is indeed unused, since we make a portfolio decision 15 minutes ahead; this is not a long-sequence prediction task like translation. The mask was an immature attempt at supporting a longer-term strategy, so you can simply ignore it.