Description
Is your feature request related to a problem? Please describe.
Pruna is an open-source AI model optimisation framework.
As discussed with @SunMarc, this was proposed for transformers, but we could do something similar for diffusers as well: a `*Pipeline.from_pretrained`
interface as an alternative to the `PrunaModel` interface.
Currently, the code looks as follows.
```python
from pruna import PrunaModel

loaded_model = PrunaModel.from_hub(
    "PrunaAI/FLUX.1-dev-smashed"
)
```
We could go for something like this instead:
```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained("PrunaAI/FLUX.1-dev-smashed")
```
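To make the proposal concrete, here is a minimal sketch of the dispatch such an integration could perform when loading a repo. Everything here is hypothetical: `smash_config.json` as a marker file, the repo-listing input, and the string stand-ins for the real loaders are assumptions for illustration; the actual mechanism would depend on how Pruna serializes smashed checkpoints.

```python
# Hypothetical marker file that identifies a Pruna-smashed repo on the Hub.
PRUNA_MARKER = "smash_config.json"


def is_pruna_smashed(repo_files):
    """Return True if the repo file listing contains the hypothetical Pruna marker."""
    return PRUNA_MARKER in repo_files


def load_pipeline(repo_id, repo_files):
    """Dispatch to a Pruna-aware loader or the plain diffusers path.

    The returned strings are stand-ins: in a real integration the first
    branch would defer to Pruna's loading logic (e.g. PrunaModel.from_hub),
    and the second would fall through to the default from_pretrained path.
    """
    if is_pruna_smashed(repo_files):
        return f"pruna:{repo_id}"
    return f"diffusers:{repo_id}"


# Example repo listings (illustrative only)
smashed = ["model_index.json", "smash_config.json"]
plain = ["model_index.json"]
```

This keeps the user-facing call unchanged (`FluxPipeline.from_pretrained(...)`) while routing smashed checkpoints through Pruna's own deserialization behind the scenes.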
Describe the solution you'd like.
A native integration so that Pruna-smashed checkpoints can be loaded directly through the standard `*Pipeline.from_pretrained` interface, without requiring users to go through the separate `PrunaModel` API.
Describe alternatives you've considered.
We could also add the library as an explicit tab selector within the Hub, similar to llama-cpp/unsloth and other frameworks.
Additional context.
NA