During the past decade there has been an explosion in computation and information technology. With it has come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting--the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.
Authors of The Elements of Statistical Learning
Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie co-developed much of the statistical modeling software and environment in R/S-PLUS and invented principal curves and surfaces. Tibshirani proposed the lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, projection pursuit and gradient boosting.
LAR uses least squares directions in the active set of variables.
Lasso uses least squares directions; if a variable's coefficient crosses zero, it is removed from the active set.
Boosting uses non-negative least squares directions in the active set.
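To see the LAR/lasso correspondence concretely, here is a minimal sketch (not from the book) that traces both coefficient paths with scikit-learn's lars_path; the diabetes dataset and the comparison printed at the end are illustrative assumptions, not the authors' code.

```python
# Sketch: compare LAR and lasso coefficient paths (illustrative only).
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import lars_path

X, y = load_diabetes(return_X_y=True)

# Least angle regression: moves along least squares directions in the
# active set; variables are never dropped once they enter.
alphas_lar, active_lar, coefs_lar = lars_path(X, y, method="lar")

# Lasso modification: same directions, but a variable whose coefficient
# crosses zero is removed from the active set.
alphas_lasso, active_lasso, coefs_lasso = lars_path(X, y, method="lasso")

print("LAR steps:", len(alphas_lar), "lasso steps:", len(alphas_lasso))
```

Because the lasso modification can drop a variable and let it re-enter later, its path generally has at least as many steps as the plain LAR path on the same data.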
The Elements of Statistical Learning is one of the most important textbooks on algorithm analysis in the field of machine learning. The authors of this book, Trevor Hastie, Robert Tibshirani and Jerome Friedman, are pioneers in the area and have done really b...
1 useful 一枚酸奶 2014-04-26 00:39:04
1. Not basic at all; it crushed me. 2. Beginners absolutely should not start with it. 3. You have to read it several times. = =
1 useful 欧阳大爷 2016-06-14 10:30:12
Known as the god of reference books; it crushed me, I couldn't make sense of it at all.
0 useful nEw2 2013-02-16 06:53:38
Foreign experts just love giving extremely difficult books simple titles. The style is terse and understated, never long-winded; readers with the right background will be thrilled, while those without will feel like they're reading gibberish. No answers can be found for the exercises in the later chapters.
1 useful 原来是这样 2019-02-13 03:34:16
Conquered it in half a year!
0 useful kkkc 2009-10-21 15:13:08
Worth studying over and over again.