The book's treatment of SVMs and kernel functions is far too shallow.
For the linearly inseparable case, the book never even discusses SVMs and kernel functions together. I really don't know how it was written: as if merely adding a penalty term were enough to find the optimal hyperplane when the data is not linearly separable... What is actually called for is a mapping into a higher-dimensional feature space. I'd rather go back to Vapnik's original book.
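The point above can be made concrete with a small sketch (my own illustration, not from the book): data on two concentric circles admits no separating line in the original plane, yet the quadratic feature map phi(x) = (x1², √2·x1·x2, x2²), whose inner product equals the polynomial kernel (x·z)², makes it separable by a single hyperplane. The specific radii and the weight vector below are choices made for the example.

```python
import math

def phi(x):
    # Explicit feature map of the homogeneous quadratic kernel:
    # phi(x1, x2) = (x1^2, sqrt(2)*x1*x2, x2^2)
    x1, x2 = x
    return (x1 * x1, math.sqrt(2) * x1 * x2, x2 * x2)

def poly_kernel(x, z):
    # The kernel computes the same inner product without ever
    # constructing phi: (x . z)^2 = phi(x) . phi(z)
    return (x[0] * z[0] + x[1] * z[1]) ** 2

def dot(a, b):
    return sum(u * v for u, v in zip(a, b))

# Kernel identity check on two arbitrary points.
x, z = (1.0, 2.0), (3.0, -1.0)
assert abs(dot(phi(x), phi(z)) - poly_kernel(x, z)) < 1e-9

# Two concentric circles (radii 0.5 and 1.5): not linearly separable
# in R^2, but in feature space dot(w, phi(p)) = x1^2 + x2^2 for
# w = (1, 0, 1), so the hyperplane with bias b = -1 splits them.
inner = [(0.5 * math.cos(t / 10), 0.5 * math.sin(t / 10)) for t in range(63)]
outer = [(1.5 * math.cos(t / 10), 1.5 * math.sin(t / 10)) for t in range(63)]
w, b = (1.0, 0.0, 1.0), -1.0
assert all(dot(w, phi(p)) + b < 0 for p in inner)  # class -1
assert all(dot(w, phi(p)) + b > 0 for p in outer)  # class +1
print("separable after the quadratic feature map")
```

A soft-margin penalty alone cannot fix this example: no line in the original plane does better than chance on concentric circles, which is exactly why the kernel mapping belongs in the same discussion.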
The following is from the preface of Vapnik's "The Nature of Statistical Learning Theory":
Between 1960 and 1980 a revolution in statistics occurred: Fisher's paradigm, introduced in the 1920s and 1930s, was replaced by a new one. This paradigm reflects a new answer to the fundamental question:
What must one know a priori about an unknown functional dependency in order to estimate it on the basis of observations?
In Fisher's paradigm the answer was very restrictive--one must know almost everything.
Namely, one must know the desired dependency up to the values of a finite number of parameters.
Estimating the values of these parameters was considered to be the problem of dependency estimation. The new paradigm overcame the restriction of the old one.
It was shown that in order to estimate dependency from the data, it is sufficient to know some general properties of the set of functions to which the unknown dependency belongs.
Determining general conditions under which estimating the unknown dependency is possible, describing the (inductive) principles that allow one to find the best approximation to the unknown dependency, and finally developing effective algorithms for implementing these principles are the subjects of the new theory.