olmec-akeru OP t1_iy8ajq0 wrote
Reply to comment by new_name_who_dis_ in [D] What method is state of the art dimensionality reduction by olmec-akeru
> the beauty of the PCA reduction was that one dimension was responsible for the size of the nose
You posit that a single eigenvector will represent the nose even when there are meaningful variations in scale, rotation, and position?
This is very different from saying that all of the variance is explained across the full set of eigenvectors (which is certainly true).
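A minimal sketch of the distinction, assuming sklearn's Olivetti faces dataset and PCA (my own illustration, not something posted in the thread): the explained-variance ratios across all components sum to (nearly) one, yet traversing any single eigenvector typically changes several correlated attributes at once rather than isolating one feature like "nose size".

```python
import numpy as np
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA

# Each row is a flattened 64x64 face image.
faces = fetch_olivetti_faces().data

pca = PCA(n_components=50).fit(faces)

# Point 1: the full set of components accounts for (nearly) all variance.
print("variance explained by 50 components:",
      pca.explained_variance_ratio_.sum())

# Point 2: no single eigenvector cleanly isolates one semantic feature.
# Traversing one component while holding the rest fixed tends to change
# lighting, pose, and overall face shape together.
mean_face = pca.mean_
component = pca.components_[3]  # an arbitrary eigenvector, for illustration
for alpha in (-4.0, 0.0, 4.0):
    face = mean_face + alpha * np.sqrt(pca.explained_variance_[3]) * component
    image = face.reshape(64, 64)  # reshape and plot to inspect this direction
```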
new_name_who_dis_ t1_iy8b0jr wrote
It was just an example. Sure, not all nose sizes fall along a single eigenvector.