puppet_pals t1_j701uqt wrote
Reply to comment by netw0rkf10w in [D] ImageNet normalization vs [-1, 1] normalization by netw0rkf10w
>I think normalization will be here to stay (maybe not the ImageNet one though), as it usually speeds up training.
The reality is that you are tied to the normalization scheme of whatever model you are transfer learning from (assuming you are transfer learning). Framework authors and people publishing weights should make normalization as easy as possible, typically via a simple x/255.0 rescaling operation (or x/127.5 - 1; I'm indifferent, though I personally opt for x/255).
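
For concreteness, here's a minimal sketch of the three schemes being discussed (the function names and the channels-last layout are my own assumptions, not from the thread; the ImageNet mean/std values are the standard ones used by e.g. torchvision):

```python
import numpy as np

# Standard ImageNet per-channel statistics (RGB), as used by e.g. torchvision
IMAGENET_MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
IMAGENET_STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def rescale_zero_one(images: np.ndarray) -> np.ndarray:
    """Map uint8 pixels in [0, 255] to floats in [0, 1]."""
    return images.astype(np.float32) / 255.0

def rescale_minus_one_one(images: np.ndarray) -> np.ndarray:
    """Map uint8 pixels in [0, 255] to floats in [-1, 1]."""
    return images.astype(np.float32) / 127.5 - 1.0

def imagenet_normalize(images: np.ndarray) -> np.ndarray:
    """Rescale to [0, 1], then standardize with ImageNet channel stats.

    Assumes channels-last layout, i.e. shape (..., H, W, 3).
    """
    return (rescale_zero_one(images) - IMAGENET_MEAN) / IMAGENET_STD

# Example: a dummy batch of uint8 images, channels-last
batch = np.random.randint(0, 256, size=(4, 224, 224, 3), dtype=np.uint8)
print(rescale_zero_one(batch).min(), rescale_zero_one(batch).max())
print(rescale_minus_one_one(batch).min(), rescale_minus_one_one(batch).max())
print(imagenet_normalize(batch).shape)
```

Whichever scheme the pretrained weights were trained with is the one you have to reproduce at fine-tuning and inference time, which is why baking it into the published model (or documenting it prominently) matters so much.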