Added support of subfunction for VLMs #699
base: main
Conversation
Signed-off-by: Abhishek Kumar Singh <[email protected]>
    - update the hidden_states, and fix for onnx model
    """

    def get_repeated_layer_class(self) -> Type[nn.Module]:
Rename this to get_submodules_for_export.
Okay, will do.
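For illustration, a minimal sketch of what the renamed hook could look like, assuming it keeps the signature of get_repeated_layer_class from the diff above; the mixin name ExampleVLMMixin and the returned layer class are placeholders, not part of the PR.

```python
from typing import Type

from torch import nn


class ExampleVLMMixin:
    # Hypothetical mixin, for illustration only; the real class lives in the PR.
    def get_submodules_for_export(self) -> Type[nn.Module]:
        # Same contract as get_repeated_layer_class in the diff above: return the
        # repeated decoder-layer class whose subgraph is exported as a subfunction.
        return nn.TransformerEncoderLayer  # placeholder; the actual class depends on the model
```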
cos = torch.cat([cos[0, ..., 0:32], cos[0, ..., 32:80], cos[0, ..., 80:128]], dim=-1).unsqueeze(0)
sin = torch.cat([sin[0, ..., 0:32], sin[0, ..., 32:80], sin[0, ..., 80:128]], dim=-1).unsqueeze(0)
Why are these slice indices hardcoded?
To work around the "Split: Non-constant split tensor not supported" issue. Some tests are still pending; I will update here.
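For illustration, a minimal sketch of one way to avoid the hardcoded literals while still keeping the split boundaries constant for ONNX export, assuming a section-size list such as (32, 48, 48) is available from the model config; the function name slice_rope_sections and the tensor shapes in the usage lines are assumptions, not taken from the PR.

```python
import torch


def slice_rope_sections(x: torch.Tensor, sections=(32, 48, 48)) -> torch.Tensor:
    # Compute slice boundaries in plain Python so they are baked into the traced
    # graph as constants (0:32, 32:80, 80:128 for the default sections), rather
    # than coming from a runtime tensor that ONNX's Split op cannot handle.
    bounds = [0]
    for size in sections:
        bounds.append(bounds[-1] + size)
    pieces = [x[0, ..., bounds[i]:bounds[i + 1]] for i in range(len(sections))]
    return torch.cat(pieces, dim=-1).unsqueeze(0)


# Usage mirroring the diff above (shapes are assumed, not taken from the PR):
cos = torch.randn(3, 1, 16, 128)
sin = torch.randn(3, 1, 16, 128)
cos = slice_rope_sections(cos)
sin = slice_rope_sections(sin)
```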
try:
    _C._jit_pass_onnx_track_scope_attributes(graph, onnx_attrs)
except Exception as e:
    logger.warning(f"Failed to track ONNX scope attributes: {e}. Skipping this step.")
Why only a warning? We should fail here if _C._jit_pass_onnx_track_scope_attributes(graph, onnx_attrs) fails.
This is a hacky workaround for the issue. If you remember, we previously used to pass an empty onnx_attrs; for VLMs there are additional issues now. We are skipping the failing step, though it does not impact model execution.
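For illustration, a minimal sketch of how both behaviours discussed above could coexist behind a flag; the wrapper name track_scope_attributes and the strict parameter are hypothetical and not part of the PR, which simply logs the warning and continues.

```python
import logging

from torch import _C

logger = logging.getLogger(__name__)


def track_scope_attributes(graph, onnx_attrs, strict: bool = False) -> None:
    # Hypothetical wrapper; `strict` only illustrates the trade-off in the review.
    try:
        _C._jit_pass_onnx_track_scope_attributes(graph, onnx_attrs)
    except Exception as e:
        if strict:
            # Fail fast, as the review suggests.
            raise
        # Current PR behaviour: log and continue, since the pass is best-effort
        # and skipping it does not affect model execution.
        logger.warning(f"Failed to track ONNX scope attributes: {e}. Skipping this step.")
```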
No description provided.