
LetterRip t1_j43v3yi wrote

This group did such a distillation (they got it down to 24 MB), but they didn't share the weights.

https://www.reddit.com/r/MachineLearning/comments/p1o2bd/research_we_distilled_clip_model_vit_only_from/

LAION, stability.ai, or Hugging Face might be willing to provide free compute to distill one of the OpenCLIP models.
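For context, the usual recipe in such distillations is embedding-level matching: a small student network is trained so its image embeddings line up with the frozen teacher CLIP's embeddings. A minimal sketch of that loss (all names, shapes, and the plain MSE objective here are illustrative assumptions, not the linked group's actual setup):

```python
import numpy as np

def distill_loss(teacher_emb: np.ndarray, student_emb: np.ndarray) -> float:
    """Mean-squared error between L2-normalized embeddings.

    CLIP embeddings live on the unit hypersphere, so both teacher and
    student outputs are normalized before comparison.
    """
    t = teacher_emb / np.linalg.norm(teacher_emb, axis=-1, keepdims=True)
    s = student_emb / np.linalg.norm(student_emb, axis=-1, keepdims=True)
    return float(np.mean((t - s) ** 2))

# Toy check: a student that exactly reproduces the teacher has zero loss.
emb = np.random.randn(4, 512)
print(distill_loss(emb, emb))  # 0.0
```

Real distillation setups often add a contrastive or cosine term on top of (or instead of) MSE, but the matching-embeddings idea is the core.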

Come to think of it, stability.ai should be releasing the distilled Stable Diffusion later this month (a week or two?), and it will presumably include a distilled CLIP.

5

alkibijad OP t1_j462o4r wrote

Cool, I wasn't aware of the distilled diffusion! That could be useful, thanks for sharing!

3

LetterRip t1_j47qjhj wrote

I don't know for certain that the CLIP was also distilled; that's an assumption on my part. Also, Emad has been fuzzy about exactly when the release will be.

2