Submitted by zxkj in MachineLearning
Wondering if there’s a term for this.
I’m training NNs for a scenario that works best with a small batch size, so each epoch contains many batches.
A few particular samples are VERY important. Let’s say 3 important samples out of the thousands I train on.
I found the end application works best when I include these important samples, repeated, in every batch. This is as opposed to simply giving the samples a large loss weight, because the effect of a large weight washes out over the many batches in an epoch.
So the NN learns the other less important stuff while being forced to remain in good agreement with the important samples.
Does this technique have a name?
EDIT: In case anyone is curious, these are physics-informed NNs, and the important samples are equilibrium mechanical structures. The NN therefore learns what equilibrium is, with everything else being small deviations from equilibrium.
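EDIT 2: For concreteness, here is a minimal sketch of the batching trick (PyTorch assumed; the toy data and names are made up for illustration, not my actual setup):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy data: thousands of ordinary samples plus 3 critical "anchor" samples.
X, y = torch.randn(5000, 16), torch.randn(5000, 1)
anchor_x, anchor_y = X[:3], y[:3]   # stand-ins for the equilibrium structures

loader = DataLoader(TensorDataset(X[3:], y[3:]), batch_size=8, shuffle=True)
model = torch.nn.Linear(16, 1)      # placeholder for the real network
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

for xb, yb in loader:
    # Append the anchors to every batch so each gradient step is
    # constrained to stay consistent with them.
    xb = torch.cat([xb, anchor_x], dim=0)
    yb = torch.cat([yb, anchor_y], dim=0)
    opt.zero_grad()
    loss_fn(model(xb), yb).backward()
    opt.step()
```

Because the anchors contribute to every single gradient step, their influence doesn't wash out the way a static loss weight does over an epoch.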
jrkirby wrote
https://en.wikipedia.org/wiki/Importance_sampling