Support Vector Machines are learning machines that can perform binary classification (pattern recognition) and real-valued function approximation (regression estimation) tasks. A Support Vector Machine non-linearly maps its n-dimensional input space into a high-dimensional feature space, in which a linear classifier is constructed. For further information see our publications list.
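The idea of a non-linear map followed by a linear classifier can be illustrated with a minimal sketch. The quadratic feature map and the hand-picked weights below are illustrative assumptions, not part of the package: a circular decision boundary in the 2-dimensional input space becomes a linear (hyperplane) boundary in the 3-dimensional feature space.

```python
import math

def feature_map(x):
    """Map a 2-D input (x1, x2) to the 3-D feature space (x1^2, sqrt(2)*x1*x2, x2^2).
    Under this map the circle x1^2 + x2^2 = 1 becomes a hyperplane."""
    x1, x2 = x
    return (x1 * x1, math.sqrt(2) * x1 * x2, x2 * x2)

def linear_classifier(x, w, b):
    """A linear classifier in feature space: sign(w . phi(x) + b)."""
    phi = feature_map(x)
    score = sum(wi * fi for wi, fi in zip(w, phi)) + b
    return 1 if score > 0 else -1

# Hypothetical weights: w . phi(x) + b = x1^2 + x2^2 - 1, so the classifier
# labels points inside the unit circle -1 and points outside it +1.
w, b = (1.0, 0.0, 1.0), -1.0
print(linear_classifier((0.2, 0.3), w, b))   # inside the circle -> -1
print(linear_classifier((1.5, -1.0), w, b))  # outside the circle -> +1
```

In a real SVM the weight vector is not hand-picked but found by maximising the margin over the training data, and the feature space is usually accessed only implicitly through a kernel function.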
Please note that the Support Vector Machine is under password protection; please contact the CLRC Director to obtain the necessary username and password.
The SVM may be used for NON-COMMERCIAL purposes only. Arrangements for any other use may be made by contacting the CLRC Director.
We would also appreciate it if you cited the reference below in your research, teaching and publications.
A new technique for "hedging" predictions was recently presented and discussed by Alexander Gammerman and Vladimir Vovk at a special meeting of the British Computer Society. The method can be applied to many algorithms, including Support Vector Machines, Kernel Ridge Regression, Kernel Nearest Neighbours and other state-of-the-art methods. The hedged predictions include confidence measures that are provably valid, making it possible to control the number of errors by selecting a suitable confidence level.
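A minimal sketch of the hedging idea, using a 1-nearest-neighbour nonconformity score on one-dimensional data (an illustrative choice of score and data, not the exact construction in the paper): for each candidate label the algorithm computes a p-value and outputs every label whose p-value exceeds the chosen significance level, so the prediction set can be made as cautious as the confidence level demands.

```python
def nonconformity(i, xs, ys):
    """1-NN nonconformity score: distance to the nearest example with the
    same label divided by distance to the nearest example with a different
    label. Large values mean example i looks strange given its label."""
    same = min(abs(xs[i] - xs[j]) for j in range(len(xs))
               if j != i and ys[j] == ys[i])
    other = min(abs(xs[i] - xs[j]) for j in range(len(xs))
                if ys[j] != ys[i])
    return same / other

def prediction_set(x_new, xs, ys, significance):
    """Hedged prediction: keep every label whose p-value, computed by
    provisionally adding (x_new, label) to the data, exceeds the
    significance level (1 - confidence)."""
    region = set()
    for label in set(ys):
        xs2, ys2 = xs + [x_new], ys + [label]
        scores = [nonconformity(i, xs2, ys2) for i in range(len(xs2))]
        # p-value: fraction of examples at least as nonconforming as x_new
        p = sum(1 for s in scores if s >= scores[-1]) / len(scores)
        if p > significance:
            region.add(label)
    return region

# Hypothetical toy data: two well-separated clusters with labels 0 and 1.
xs = [0.0, 0.1, 0.2, 1.0, 1.1, 1.2]
ys = [0, 0, 0, 1, 1, 1]
print(prediction_set(0.15, xs, ys, significance=0.2))
```

At a given significance level the prediction set may contain one label (a confident prediction), several labels (an honestly uncertain one), or none; the validity guarantee is that, in the long run, the true label falls outside the set no more often than the significance level allows.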
The discussants of the technique included Vladimir Vapnik, Alexey Chervonenkis, Glenn Shafer, Zhiyuan Luo and many others. The paper and the discussion can be found here.