Submitted by alkibijad t3_10a6whe in MachineLearning
LetterRip t1_j43v3yi wrote
This group did such a distillation but didn't share the weights; they got it down to 24 MB.
LAION, stability.ai, or huggingface might be willing to provide free compute to distill one of the openCLIP models.
Come to think of it, stability.ai should be releasing the distilled Stable Diffusion later this month (a week or two?), and it will presumably include a distilled CLIP.
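For context on what "distilling" CLIP means here: a common approach is to train a small student encoder to reproduce the large teacher's embeddings. A minimal sketch of one such objective (MSE between normalized embeddings, on toy data; the linked work's actual method and loss may differ):

```python
import numpy as np

def distill_loss(teacher_emb, student_emb):
    """MSE between L2-normalized teacher and student embeddings --
    one common embedding-distillation objective (illustrative only)."""
    t = teacher_emb / np.linalg.norm(teacher_emb, axis=-1, keepdims=True)
    s = student_emb / np.linalg.norm(student_emb, axis=-1, keepdims=True)
    return float(np.mean((t - s) ** 2))

# Toy batch: 4 samples with 512-dim embeddings (the CLIP ViT-B/32 width).
rng = np.random.default_rng(0)
teacher = rng.standard_normal((4, 512))
student = teacher + 0.1 * rng.standard_normal((4, 512))  # imperfect student

print(distill_loss(teacher, teacher))  # 0.0 when the student matches exactly
print(distill_loss(teacher, student))  # small positive value otherwise
```

The student can be far smaller than the teacher because it only has to match the teacher's output space, not rediscover it from raw image-text pairs.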
alkibijad OP t1_j462o4r wrote
Cool, I wasn't aware of the distilled diffusion! That could be useful, thanks for sharing!
LetterRip t1_j47qjhj wrote
I don't know for certain that the CLIP was also distilled; that's an assumption on my part. Also, Emad has been vague about exactly when the release would be.