Of course, the author, being human, could not avoid emphasizing the techniques with which he is most familiar. This is healthy, since writing a book is a means of sharing the author's expertise and point of view with readers. This is why I strongly believe that a new book does not come to replace previous ones, but to complement previously published points of view.
For a fixed number of training points, N, in the data sets D, trying to minimize the variance term results in an increase of the bias term, and vice versa. This is because, in order to reduce the bias term, one has to increase the complexity (more free parameters) of the adopted estimator f(·; D). This, in turn, results in higher variance as we change the training sets. This is a manifestation of the overfitting issue that we have already discussed. The only way to reduce both terms simultaneously is to increase the number of training data points, N, and at the same time increase the complexity of the model carefully, so as to achieve the aforementioned goal. If one increases the number of training points and at the same time increases the model complexity excessively, the overall MSE...