Submitted by Rishh3112 t3_120gvgw in deeplearning
I made a model for handwritten text recognition. The model trains fine on CPU, but when I switch to the GPU I get a CUDA out of memory error during the validation step. Can someone please tell me why this is happening?
Old-Chemistry-7050 t1_jdhbvrc wrote
Either the model is too big for your GPU, or there's a memory issue somewhere in your code.
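A common cause of an OOM that shows up only in validation is running inference with autograd still enabled, or accumulating loss tensors that keep the computation graph alive. Below is a minimal sketch of a validation loop that avoids both, assuming a PyTorch setup; `val_loader`, `criterion`, and the tensor names are placeholders, not details from the original post.

```python
import torch

def validate(model, val_loader, criterion, device):
    """Memory-safe validation loop (sketch, assumes a PyTorch model/DataLoader)."""
    model.eval()                       # disable dropout / batchnorm updates
    total_loss = 0.0
    with torch.no_grad():              # skip building the autograd graph -> much lower GPU memory
        for images, targets in val_loader:
            images = images.to(device)
            targets = targets.to(device)
            outputs = model(images)
            loss = criterion(outputs, targets)
            total_loss += loss.item()  # .item() returns a Python float, so the loss tensor can be freed
    return total_loss / len(val_loader)
```

If memory is still tight after this, a smaller validation batch size is another easy knob, since validation batches are often set larger than training batches.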