wally1002 t1_jcx6232 wrote on March 20, 2023 at 6:30 AM
Reply to How noticeable is the difference training a model 4080 vs 4090 by Numerous_Talk7940
For deep learning, more VRAM is almost always preferable: 12/16 GB caps the size of model you can train or even run inference on. With LLMs getting democratised, it's better to be future-proof.
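To put rough numbers on that, here's a back-of-the-envelope sketch. The 2-bytes-per-parameter (fp16) and 4x training-overhead figures are my own assumptions, not measurements, and real usage is higher once you count activations and KV cache:

```python
# Rough VRAM estimate for a transformer-style model.
# Assumptions: fp16 weights (2 bytes/param); Adam-style training needs
# weights + gradients + two optimizer states (~4x weights, before activations).

def vram_gb(params_billions: float, bytes_per_param: int = 2) -> dict:
    """Estimate VRAM in GB for weights alone (inference) and for training."""
    weights = params_billions * 1e9 * bytes_per_param / 1024**3
    training = weights * 4  # conservative floor, excludes activations
    return {"inference_gb": round(weights, 1), "training_gb": round(training, 1)}

for b in (7, 13):
    print(f"{b}B params:", vram_gb(b))
# 7B params: {'inference_gb': 13.0, 'training_gb': 52.2}
# 13B params: {'inference_gb': 24.2, 'training_gb': 96.9}
```

So a 7B model in fp16 already overflows a 12 GB card for inference alone, and the 4090's 24 GB is what keeps 13B-class models in reach.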