Conversation

zsxkib (Collaborator) commented Aug 20, 2025

  • Added a safetensor_utils.py module with a rename_lora_keys_for_pruna function (a sketch follows this list)
  • Integrated the conversion into train.py's create_output_archive function
  • Renames the 'diffusion_model' prefix to 'transformer' in LoRA keys
  • Removes the 10s conversion delay on Pruna's side
  • Handles errors gracefully: archiving continues even if the conversion fails
  • Preserves all tensor data and validates checksums during the conversion
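
A minimal sketch of what rename_lora_keys_for_pruna could look like, assuming a torch-backed .safetensors file; the helper name comes from the PR description, but the key pattern and the simple integrity check here are illustrative, not the actual implementation:

```python
# Sketch only: assumes the LoRA weights are stored as a torch-backed .safetensors file.
from safetensors.torch import load_file, save_file


def rename_lora_keys_for_pruna(lora_path: str) -> bool:
    """Rename 'diffusion_model.*' LoRA keys to 'transformer.*' in place.

    Returns True on success, False on any failure so the caller can keep
    the original file and continue building the output archive.
    """
    try:
        tensors = load_file(lora_path)  # dict[str, torch.Tensor]

        renamed = {}
        for key, tensor in tensors.items():
            if key.startswith("diffusion_model."):
                key = "transformer." + key[len("diffusion_model."):]
            renamed[key] = tensor

        # Basic integrity check: every tensor survived the rename untouched.
        assert len(renamed) == len(tensors)

        save_file(renamed, lora_path)
        return True
    except Exception as exc:
        print(f"Skipping LoRA key rename ({exc}); original file left as-is")
        return False
```

Inside create_output_archive the call would then be guarded by its boolean return, so a failed rename never aborts packaging of the training output.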

zsxkib added 3 commits August 20, 2025 04:33
- Changed the width and height parameters from required int to Optional[int]
- This lets the predictor fall back to the aspect_ratio/image_size presets when no custom dimensions are provided (see the sketch after this list)
- Maintains backward compatibility while fixing the type hints
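
A sketch of the fallback logic this change enables, assuming a preset table keyed by aspect ratio; the preset names, sizes, and the resolve_dimensions helper are illustrative, not the predictor's actual code:

```python
from typing import Optional, Tuple

# Hypothetical preset table; the real names and sizes live in the predictor.
ASPECT_RATIO_PRESETS = {
    "1:1": (1024, 1024),
    "16:9": (1344, 768),
    "9:16": (768, 1344),
}


def resolve_dimensions(
    width: Optional[int],
    height: Optional[int],
    aspect_ratio: str,
) -> Tuple[int, int]:
    """Prefer explicit width/height; otherwise fall back to the preset."""
    if width is not None and height is not None:
        return width, height
    return ASPECT_RATIO_PRESETS[aspect_ratio]


# Example: resolve_dimensions(None, None, "16:9") -> (1344, 768)
```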
zsxkib force-pushed the add-safetensor-conversion branch from b08fa55 to 3914498 on August 20, 2025 19:54