Author | Vapnik, Vladimir N. |

Title | The Nature of Statistical Learning Theory [electronic resource] / by Vladimir N. Vapnik |

Imprint | New York, NY : Springer New York : Imprint: Springer, 1995 |

Connect to | http://dx.doi.org/10.1007/978-1-4757-2440-0 |

Descript | XV, 188 p. 18 illus. online resource |

SUMMARY

The aim of this book is to discuss the fundamental ideas behind the statistical theory of learning and generalization. It considers learning from the general point of view of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on the main results of learning theory and their connections to fundamental problems in statistics. These include:

- the general setting of learning problems and the general model of minimizing the risk functional from empirical data
- a comprehensive analysis of the empirical risk minimization principle, showing how it yields necessary and sufficient conditions for consistency
- non-asymptotic bounds on the risk achieved using the empirical risk minimization principle
- principles for controlling the generalization ability of learning machines trained on small sample sizes
- a new type of universal learning machine that controls the generalization ability
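The risk-minimization setting mentioned above can be sketched in the book's standard notation (a loss function $L$, a parametrized family of functions $f(x, \alpha)$, and $\ell$ observed samples); this is a summary of the formulation, not a substitute for the book's precise statement:

```latex
% Expected (actual) risk over the unknown distribution P(x, y):
R(\alpha) = \int L\bigl(y, f(x, \alpha)\bigr)\, dP(x, y)

% Empirical risk over the observed sample (x_1, y_1), \dots, (x_\ell, y_\ell):
R_{\mathrm{emp}}(\alpha) = \frac{1}{\ell} \sum_{i=1}^{\ell} L\bigl(y_i, f(x_i, \alpha)\bigr)

% The empirical risk minimization (ERM) principle selects
% \alpha_\ell = \arg\min_{\alpha} R_{\mathrm{emp}}(\alpha),
% and the theory asks when R(\alpha_\ell) converges to \inf_\alpha R(\alpha).
```

The consistency conditions and non-asymptotic bounds listed above concern how closely $R_{\mathrm{emp}}(\alpha)$ tracks $R(\alpha)$ uniformly over the function class.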

CONTENT

Introduction: Four Periods in the Research of the Learning Problem -- 1 Setting of the Learning Problem -- 2 Consistency of Learning Processes -- 3 Bounds on the Rate of Convergence of Learning Processes -- 4 Controlling the Generalization Ability of Learning Processes -- 5 Constructing Learning Algorithms -- Conclusion: What is Important in Learning Theory? -- References -- Remarks on References -- References

Mathematics
Artificial intelligence
Probabilities
Statistics
Mathematics
Probability Theory and Stochastic Processes
Statistics, general
Artificial Intelligence (incl. Robotics)