This is a beta version of a MATLAB toolbox implementing Vapnik's support vector machine, as described in [1]. Training is performed using the SMO algorithm due to Platt [2], implemented as a MEX file for speed. Before you use the toolbox you need to run the compilemex script to build the MEX files (if there are problems running this script, make sure your MEX compiler is set up correctly - you may need to ask your sys-admin to do this).

At the moment this is the only documentation for the toolbox, but the file demo.m provides a simple demonstration that ought to be enough to get you started. For a good introduction to support vector machines, see the excellent book by Cristianini and Shawe-Taylor [3]. Key features of this toolbox:

• C++ MEX implementation of the SMO training algorithm, with caching of kernel evaluations for efficiency.
• Support for multi-class support vector classification using the max-wins, pairwise [4] and DAG-SVM [5] algorithms.
• A model selection criterion (the xi-alpha bound [6,7] on the leave-one-out cross-validation error).
• Object-oriented design. Currently this just means that you can supply bespoke kernel functions for particular applications, but future releases will also support a range of training algorithms, model selection criteria etc.
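The multi-class strategies listed above combine the outputs of one-vs-one binary SVMs in different ways. The sketch below (in Python rather than MATLAB, and not the toolbox's actual API - the function names are illustrative) assumes the pairwise decisions have already been produced by trained binary classifiers, and shows max-wins voting alongside the DAG-SVM elimination scheme, which evaluates only n - 1 of the pairwise classifiers per test point:

```python
def max_wins(pairwise_decisions, n_classes):
    """Max-wins voting: each one-vs-one classifier casts a vote
    for the class it prefers; the class with most votes wins."""
    votes = [0] * n_classes
    for winner in pairwise_decisions.values():
        votes[winner] += 1
    return max(range(n_classes), key=votes.__getitem__)

def dag_svm(pairwise_decisions, n_classes):
    """DAG-SVM: keep a candidate list; each pairwise test eliminates
    one class, so only n_classes - 1 classifiers are evaluated."""
    candidates = list(range(n_classes))
    while len(candidates) > 1:
        i, j = candidates[0], candidates[-1]
        winner = pairwise_decisions[(min(i, j), max(i, j))]
        if winner == i:
            candidates.pop()       # class j is eliminated
        else:
            candidates.pop(0)      # class i is eliminated
    return candidates[0]

# Toy example: 3 classes -> 3 pairwise classifiers, keyed (i, j) with i < j;
# the value is the class each classifier votes for on one test point.
decisions = {(0, 1): 0, (0, 2): 2, (1, 2): 2}
print(max_wins(decisions, 3))  # -> 2 (two votes)
print(dag_svm(decisions, 3))   # -> 2 (classes 0 then 1 eliminated)
```

Both schemes agree here; they can differ on ambiguous points, which is part of the motivation for the DAG formulation in [5].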
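The xi-alpha bound used for model selection counts the training points for which 2*alpha_i*R^2 + xi_i >= 1; that count, divided by the number of training points, upper-bounds the leave-one-out error [6,7]. A minimal Python sketch (not the toolbox's implementation), assuming the Lagrange multipliers alpha, the slack variables xi and a bound R2 on the squared radius of the data in feature space are already available from a trained SVM:

```python
def xi_alpha_bound(alpha, xi, R2):
    """Joachims' xi-alpha estimate of the leave-one-out error:
    the fraction of training points with 2*alpha_i*R^2 + xi_i >= 1."""
    d = sum(1 for a, x in zip(alpha, xi) if 2.0 * a * R2 + x >= 1.0)
    return d / len(alpha)

# Toy values: only the third point satisfies the condition (2.1 >= 1),
# so the estimated leave-one-out error is 1/3.
print(xi_alpha_bound([0.0, 0.3, 0.8], [0.0, 0.0, 0.5], 1.0))
```

Because it is computed from a single trained model, the bound is cheap enough to evaluate inside a model-selection loop over kernel and regularisation parameters.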
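A bespoke kernel only needs to be a valid (positive semi-definite) function of two input vectors. As a language-agnostic illustration of the idea - in Python, and not the toolbox's MATLAB kernel interface - here is the familiar Gaussian RBF kernel:

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    """Gaussian RBF kernel k(x, z) = exp(-gamma * ||x - z||^2).
    Any positive semi-definite function of (x, z) can play this role."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

print(rbf_kernel([0.0, 0.0], [0.0, 0.0]))  # -> 1.0 (identical inputs)
```

In the toolbox itself, see demo.m for how a kernel object is actually supplied to the training routine.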

### Licensing Arrangements

The toolbox is provided free for non-commercial use under the terms of the GNU General Public License (license.txt). However, I would be grateful if:

• you let me know about any bugs you find;
• you send suggestions for improving the toolbox (e.g. references to other training algorithms);
• you reference the toolbox web page in any publication describing research performed using the toolbox, or software derived from it. A suitable BibTeX entry would look something like this:

@misc{Cawley2000,
  author       = "Cawley, G. C.",
  title        = "{MATLAB} Support Vector Machine Toolbox (v0.55$\beta$) $[$
                  \texttt{http://theoval.sys.uea.ac.uk/\~{}gcc/svm/toolbox}$]$",
  howpublished = "University of East Anglia, School of Information Systems,
                  Norwich, Norfolk, U.K. NR4 7TJ",
  year         = 2000
}

• Current version: v0.55beta (zip, 138 KB)

### "To Do" List

• Find time to write a proper list of things to do!
• Documentation.
• Automated model selection.
• Sparse matrix support.
• Re-implement the SMO algorithm in C for ease of compilation.

### References

1. V.N. Vapnik, The Nature of Statistical Learning Theory, Springer-Verlag, New York, ISBN 0-387-94559-8, 1995.
2. J. C. Platt, Fast training of support vector machines using sequential minimal optimization, in Advances in Kernel Methods - Support Vector Learning, (Eds) B. Scholkopf, C. Burges, and A. J. Smola, MIT Press, Cambridge, Massachusetts, chapter 12, pp 185-208, 1999.
3. N. Cristianini and J. Shawe-Taylor, Support Vector Machines and other kernel-based learning methods, Cambridge University Press, ISBN 0-521-78019-5, 2000.
4. U. Kressel, Pairwise Classification and Support Vector Machines, in Advances in Kernel Methods - Support Vector Learning, (Eds) B. Scholkopf, C. Burges, and A. J. Smola, MIT Press, Cambridge, Massachusetts, chapter 15, 1999.
5. J. Platt, N. Cristianini and J. Shawe-Taylor, Large Margin DAGs for Multiclass Classification, in Advances in Neural Information Processing Systems 12, pp. 547-553, MIT Press, 2000.
6. T. Joachims, Estimating the Generalization Performance of a SVM Efficiently, LS VIII Report 25, University of Dortmund, 1999.
7. T. Joachims, Estimating the Generalization Performance of a SVM Efficiently, in Proceedings of the International Conference on Machine Learning, Morgan Kaufman, 2000.