Description

Book Synopsis
This book focuses on Least Squares Support Vector Machines (LS-SVMs), which are reformulations of standard SVMs. LS-SVMs are closely related to regularization networks and Gaussian processes, but additionally emphasize and exploit primal-dual interpretations from optimization theory. The authors explain the natural links between LS-SVM classifiers and kernel Fisher discriminant analysis. Bayesian inference of LS-SVM models is discussed, together with methods for imposing sparseness and employing robust statistics. The framework is further extended towards unsupervised learning by considering principal component analysis (PCA) and its kernel version as a one-class modelling problem. This leads to new primal-dual support vector machine formulations for kernel PCA and kernel canonical correlation analysis (CCA). Furthermore, LS-SVM formulations are given for recurrent networks and control. In general, support vector machines may pose heavy computational challenges for large data sets. For this purpose, a fixed-size LS-SVM method is proposed, in which the estimation is done in the primal space in relation to a Nyström sampling with active selection of support vectors. The methods are illustrated with several examples.
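
As noted in the synopsis, the LS-SVM reformulation replaces the standard SVM's inequality constraints and hinge loss with equality constraints and a squared error term, so that training a classifier amounts to solving a single linear (KKT) system rather than a quadratic program. Below is a minimal NumPy sketch along those lines; the function names, the RBF kernel choice and the toy data are illustrative assumptions, not code from the book.

    # Minimal LS-SVM classifier sketch (illustrative only; not the book's code).
    import numpy as np

    def rbf_kernel(A, B, sigma=1.0):
        # Gaussian (RBF) kernel matrix between the rows of A and B.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def lssvm_train(X, y, gamma=10.0, sigma=1.0):
        # Solve the LS-SVM classifier KKT system
        #   [ 0      y^T             ] [b    ]   [0]
        #   [ y      Omega + I/gamma ] [alpha] = [1]
        # with Omega_kl = y_k * y_l * K(x_k, x_l).
        n = len(y)
        Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = y
        A[1:, 0] = y
        A[1:, 1:] = Omega + np.eye(n) / gamma
        rhs = np.concatenate(([0.0], np.ones(n)))
        sol = np.linalg.solve(A, rhs)
        return sol[1:], sol[0]  # alpha, bias b

    def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
        # Decision function: sign( sum_k alpha_k * y_k * K(x, x_k) + b ).
        K = rbf_kernel(X_test, X_train, sigma)
        return np.sign(K @ (alpha * y_train) + b)

    # Toy usage on two Gaussian blobs with labels in {-1, +1}.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1.0, 0.5, (20, 2)), rng.normal(1.0, 0.5, (20, 2))])
    y = np.concatenate([-np.ones(20), np.ones(20)])
    alpha, b = lssvm_train(X, y)
    print(lssvm_predict(X, y, alpha, b, X[:5]))  # expected: mostly -1 for the first blob

For N training points this solves a dense (N+1) by (N+1) system, which is exactly the large-scale bottleneck that the fixed-size LS-SVM approach with Nyström sampling, mentioned in the synopsis, is designed to address by estimating in the primal space.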

Table of Contents
Support Vector Machines; Basic Methods of Least Squares Support Vector Machines; Bayesian Inference for LS-SVM Models; Robustness; Large Scale Problems; LS-SVM for Unsupervised Learning; LS-SVM for Recurrent Networks and Control.

Least Squares Support Vector Machines

£90.00

Includes FREE delivery

RRP £100.00 – you save £10.00 (10%)

A Hardback by Johan A K Suykens, Tony Van Gestel, Joseph De Brabanter

Out of stock

Publisher: World Scientific Publishing Co Pte Ltd
Publication Date: 14/11/2002
ISBN13: 9789812381514
ISBN10: 9812381511
Also in: Machine learning
