So basically, if you're estimating three or more means at the same time, you can do better than just using each sample average on its own. Sort of like how when computing the standard deviation of a sample you divide by n-1 instead of n. Anyway, that's the James-Stein estimator, and the authors apply it to PCA dimensionality reduction: they de-bias and improve the leading eigenvector with their technique.
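For the curious, here's a minimal sketch of the classic positive-part James-Stein estimator that shrinks each group mean toward the grand mean (the `sigma2`/`n` parameterization is my own simplification, assuming equal-sized groups with known noise variance — not the paper's PCA version):

```python
import numpy as np

def james_stein(means, sigma2, n):
    """Shrink p group averages toward their grand mean.

    means  : p sample averages, each computed from n observations
    sigma2 : variance of a single observation (assumed known)
    """
    means = np.asarray(means, dtype=float)
    p = means.size
    grand = means.mean()
    dev = means - grand
    ss = np.sum(dev ** 2)
    # Shrinkage factor; each mean has variance sigma2 / n.
    # (p - 3) appears because one degree of freedom goes to the grand mean;
    # max(0, ...) is the "positive-part" fix so we never over-shrink.
    shrink = max(0.0, 1.0 - (p - 3) * (sigma2 / n) / ss)
    return grand + shrink * dev
```

Each shrunk estimate lands between the raw average and the grand mean; the noisier the individual averages relative to their spread, the harder they get pulled in. That's the same flavor of bias/variance trade the batting-average example in the paper illustrates.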
Ctrl-f for "fig. 3" to see their results in action.
This looks like it might affect a wide number of fields. I appreciate the concrete examples (batting averages, finance / Markowitz) as well.
PCA is an iterative algo. Once you build the leading vector, you subtract its component from your data and then build the next.
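For anyone who hasn't seen the deflation scheme written out, a minimal sketch (plain PCA via repeated SVD + deflation, not the paper's corrected estimator — you'd normally just take the top-k singular vectors in one shot, but this mirrors the loop described above):

```python
import numpy as np

def leading_pcs(X, k):
    """Top-k principal directions by deflation.

    Each round: find the leading right singular vector of the
    (deflated) data, then subtract that component out.
    """
    Xd = X - X.mean(axis=0)            # center the data
    vs = []
    for _ in range(k):
        _, _, vt = np.linalg.svd(Xd, full_matrices=False)
        v = vt[0]                      # leading direction of what's left
        vs.append(v)
        Xd = Xd - np.outer(Xd @ v, v)  # deflate: remove that component
    return np.array(vs)
```

The question of recursing the paper's correction is exactly about this loop: after the first (corrected) vector is subtracted, it's not obvious the residual data still satisfies the assumptions the correction needs.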
Do you know, by any chance, why you can't apply this new method recursively to build the whole SVD basis? (I haven't read the paper carefully yet; it's a bit at the limit of my understanding.)