Keynote/Invited Speakers:

Alexander Gorban

Chair in Applied Mathematics
University of Leicester, UK
Chief Scientist, Institute of Computational Modeling, Krasnoyarsk, Russia

Tom Heskes

(Editor-in-Chief of Neurocomputing)
Professor in Artificial Intelligence
Head of Machine Learning Group, Intelligent Systems
Institute for Computing and Information Sciences (iCIS)
Radboud University Nijmegen

Richard Harvey

Senior Lecturer in Computer Lip Reading
School of Computing Sciences
University of East Anglia, UK

Gavin Cawley

Senior Lecturer in Machine Learning
School of Computing Sciences
University of East Anglia, UK

Iead Rezek

Computational, Cognitive, and Clinical Neuroscience (C3NL) Laboratory
Centre for Neuroscience
Department of Medicine
Imperial College London

More speakers will be announced in due course.

Keynote/Invited Plenary Talks: Title, Abstract and Biosketch of the Speakers

1. Geometry and Topology of Data Spaces

Speaker: Professor Alexander Gorban (University of Leicester, Leicester, UK)

Abstract: Revealing geometry and topology in a finite dataset is an intriguing problem. We present several methods of non-linear data modelling and construction of principal manifolds and principal graphs. These methods are based on the metaphor of elasticity (the elastic principal graph approach). The elastic energy functionals are quadratic and, hence, the computational procedures are not very expensive. The simplest algorithms have the classical expectation/maximization (or splitting) structure.
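
To make the splitting scheme concrete, below is a minimal sketch for the simplest case, a principal curve represented as a chain graph. The three quadratic terms (data approximation, edge stretching, 2-star bending) follow the elastic-map idea described in the abstract, but the function and parameter names are illustrative assumptions, not the speaker's implementation.

```python
import numpy as np

def fit_elastic_chain(X, n_nodes=20, lam=0.01, mu=0.1, n_iter=50):
    """Fit a 1-D elastic principal graph (a chain) to data X (N x d) with the
    classical splitting (expectation/maximisation-like) scheme.
    Illustrative sketch only; parameter names are hypothetical."""
    N, d = X.shape
    # initialise nodes along the first principal component
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    t = Xc @ Vt[0]
    grid = np.linspace(t.min(), t.max(), n_nodes)
    Y = X.mean(axis=0) + np.outer(grid, Vt[0])

    for _ in range(n_iter):
        # splitting ("expectation") step: assign each point to its nearest node
        dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)

        # "maximisation" step: the energy is quadratic in Y, so the new node
        # positions solve a linear system  A Y = B
        A = np.zeros((n_nodes, n_nodes))
        B = np.zeros((n_nodes, d))
        for j in range(n_nodes):
            idx = labels == j
            A[j, j] += idx.sum() / N          # data-approximation term
            B[j] += X[idx].sum(axis=0) / N
        # edge (stretching) terms: lam * ||y_i - y_{i+1}||^2
        for i in range(n_nodes - 1):
            A[i, i] += lam
            A[i + 1, i + 1] += lam
            A[i, i + 1] -= lam
            A[i + 1, i] -= lam
        # 2-star (bending) terms: mu * ||y_j - (y_{j-1} + y_{j+1}) / 2||^2
        for j in range(1, n_nodes - 1):
            s = np.array([-0.5, 1.0, -0.5])
            for a in range(3):
                for b in range(3):
                    A[j - 1 + a, j - 1 + b] += mu * s[a] * s[b]
        Y = np.linalg.solve(A + 1e-9 * np.eye(n_nodes), B)
    return Y

# usage: nodes = fit_elastic_chain(np.random.default_rng(0).normal(size=(500, 3)))
```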

For complexity control, several types of complexity are introduced: geometric complexity, structural complexity and construction complexity. The geometric complexity measures how far a principal object deviates from its ideal configuration. The structural complexity counts the structural elements of the graph; it may be measured, for example, by a non-decreasing function of the number of vertices, edges and k-stars of different orders. The construction complexity is defined with respect to a graph grammar as the number of applications of elementary transformations.

Construction of principal graphs with controlled complexity is based on the graph grammar approach and on the idea of pluriharmonic graphs as ideal approximate objects. We present several applications to microarray analysis and to the visualization of datasets from genomics, medical and social research, using GIS-inspired methods of dataset cartography. In particular, we demonstrate the estimation and visualization of uncertainty.

In a series of case studies we compare the performance of nonlinear dimensionality reduction with linear PCA. The nonlinear methods demonstrate better data approximation (as expected), better distance mapping (a higher correlation coefficient between the pairwise distances before and after projection onto the principal object), better preservation of point neighbourhoods, and better class compactness.
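
As an illustration of the distance-mapping criterion mentioned above, the following sketch computes the correlation between pairwise distances before and after projection; the helper name and the PCA projection used in the example are illustrative assumptions, not the speaker's code.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import pearsonr

def distance_mapping_quality(X, Z):
    """Correlation between pairwise distances in the original space (X) and in
    the projected space (Z); values near 1 mean distances are well preserved."""
    return pearsonr(pdist(X), pdist(Z))[0]

# example: compare a 2-D linear (PCA) projection with the original data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z_pca = Xc @ Vt[:2].T
print(distance_mapping_quality(X, Z_pca))
```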

Biosketch: Prof. Dr. Alexander Gorban obtained his PhD in differential equations and mathematical physics in 1980 and his Dr. Sc. in biophysics in 1990. He now holds the Chair in Applied Mathematics at the University of Leicester, UK, and is Chief Scientist at the Institute of Computational Modelling of the Russian Academy of Sciences (Krasnoyarsk, Russia). His scientific interests include interdisciplinary problems of model reduction, topological dynamics, physical and chemical kinetics, mathematical biology and data mining.

2. Bayesian machine learning for reading the brain

Speaker: Professor Tom Heskes (Radboud University Nijmegen, the Netherlands)

Abstract: Machine learning is about learning models from data. In so-called Bayesian machine learning we build probabilistic models and use probability calculus, in particular Bayes' rule, to infer the unknown model parameters given the observed data. In my presentation I will show where this leads by highlighting some of the applications we work on related to neuroimaging: brain-computer interfaces based on covert attention, and brain reading by decoding fMRI images.
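
To make the "infer the unknown model parameters given the observed data" step concrete, here is a minimal sketch of Bayes' rule for a toy binary-decoding problem, using a conjugate Beta prior on a Bernoulli rate; the scenario and numbers are purely illustrative and are not drawn from the speaker's work.

```python
import numpy as np
from scipy.stats import beta

# toy example: infer the probability p that a trial is decoded correctly.
# posterior ~ prior x likelihood; with a Beta(a, b) prior and Bernoulli
# observations, the posterior is again a Beta distribution (conjugacy)
a0, b0 = 1.0, 1.0                                    # uniform prior over p
outcomes = np.array([1, 1, 0, 1, 1, 0, 1, 1, 1, 0])  # hypothetical trial results
a_post = a0 + outcomes.sum()
b_post = b0 + (1 - outcomes).sum()
posterior = beta(a_post, b_post)
print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```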

Biosketch: Dr Tom Heskes is a Professor in Artificial Intelligence and leads the Machine Learning Group at the Institute for Computing and Information Sciences, Radboud University Nijmegen, the Netherlands. He is also an affiliated Principal Investigator at the Donders Centre for Neuroscience and director of the Institute for Computing and Information Sciences.

Prof Heskes' research is on artificial intelligence, in particular (Bayesian) machine learning. He works on Bayesian inference (approximate inference, hierarchical modeling, dynamic Bayesian networks, preference elicitation); machine learning (multi-task learning, bias-variance decompositions); and neural networks (on-line learning, self-organizing maps, time-series prediction) with applications to, among others, neuroscience and bioinformatics.  Prof Heskes is the Editor-in-Chief of Neurocomputing.

3. Lip Reading: Science Fact or Fiction

Speaker: Dr. Richard Harvey (University of East Anglia, UK)

Abstract: HAL 9000, the demented computer in 2001: A Space Odyssey, appears to learn to read lips with ease, and many humans claim to have mastered this skill. But how difficult is it really, and is it feasible to use machine learning methods to tackle this problem? This talk will describe the state of the art in computer lip-reading, will show how these methods benchmark against humans, and will describe the key problem, which is speaker variability. I will describe some palliatives for speaker variability, but this remains an open problem and will need more input from the machine learning community. If time allows I will show how our work can be extended to operate in multiple languages.

Biosketch: Richard Harvey is Dean of UEA London and Director of Admissions for the University. He currently teaches Computer Science, having started his career as an engineer. His early work was in non-linear signal processing, which led him into computer vision, where he refined and developed a set of scale-space analysis tools known as sieves. A subset of these systems is now known as Maximally Stable Extremal Regions (MSERs), which are among the best-performing key-point extraction methods. Sieves can also be applied to computer lip-reading, and his early work on lip-reading used these methods. More recently he has been leading a multidisciplinary team on lip-reading, in conjunction with the University of Surrey, with the aim of building a speaker-independent, multi-lingual system. The current lip-reading system is believed to be the best in the world; the research was reported in New Scientist in April 2009. He has also directed several of the University's spin-out companies and has recently set up a new university campus in London.

4. Model Selection - The step where performance gains are most easily obtained

Speaker: Dr. Gavin Cawley (University of East Anglia, UK)

Abstract: Over-fitting, and its avoidance, has been a major focus in the development of machine learning algorithms for decades. However, while over-fitting is a familiar problem in training statistical models, it is rather less well appreciated that over-fitting can also occur during model selection. As the majority of modern machine learning algorithms have a small number of hyper-parameters that need to be optimised (e.g. the regularisation and kernel parameters in the case of a support vector machine), steps to avoid over-fitting during model selection are also required. It is possible that research on training algorithms has reached the point of diminishing returns and that performance gains are now most easily obtained by developments in model selection. We present the results of a thorough empirical study demonstrating that over-fitting in model selection is a significant issue in the practical application of machine learning methods, and suggest some methods by which it may be addressed.
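
One standard safeguard against over-fitting in model selection (not necessarily among the specific remedies the speaker will present) is nested cross-validation, where hyper-parameter tuning happens inside each outer training fold. A minimal scikit-learn sketch, with an illustrative parameter grid:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# inner loop: tune the SVM regularisation (C) and kernel (gamma) parameters
param_grid = {"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1, 1]}
inner_cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=1)
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=inner_cv)

# outer loop: estimate the performance of the whole procedure, tuning included;
# quoting the inner-loop score instead would over-fit the model selection step
outer_cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=2)
scores = cross_val_score(search, X, y, cv=outer_cv)
print("nested CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```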

Biosketch: Dr Cawley obtained his PhD in electronic systems engineering from the University of Essex in 1996, and is currently a senior lecturer in the School of Computing Sciences at the University of East Anglia. His research interests lie in theoretical and algorithmic issues with a direct impact on the practical application of machine learning techniques, including topics such as feature selection, model selection, performance estimation, model comparison, covariate shift, dealing with imbalanced or "non-standard" data, and predictive uncertainty. His application areas focus on computational biology and the environmental sciences. He has won several machine learning and data mining challenges associated with the IEEE World Congress on Computational Intelligence and the International Joint Conference on Neural Networks.

5. Variational Inference - A tutorial Introduction and Applications

Speaker: Dr. Iead Rezek (Imperial College, UK)

Abstract: Variational methods have recently become popular in the context of inference problems. For problems requiring large-scale inference they often offer a practical alternative to Markov Chain Monte Carlo methods. In this tutorial I describe the Bayesian variational approximation to inference in probabilistic graphical models. I will start from first principles using only statistical terminology, as it is more accessible than the usual statistical-physics description. I will highlight connections to other algorithms, e.g. the well-known Expectation Maximisation and message-passing algorithms, and explain the concepts using numerous pattern recognition examples.
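
To illustrate the flavour of the variational approximation (and its kinship with Expectation Maximisation), here is a minimal coordinate-ascent sketch of the textbook mean-field treatment of a Gaussian with unknown mean and precision under a Normal-Gamma prior; the hyper-parameter names are illustrative and the example is not drawn from the speaker's tutorial.

```python
import numpy as np

def mean_field_gaussian(x, mu0=0.0, lam0=1.0, a0=1e-3, b0=1e-3, n_iter=50):
    """Mean-field VB for x ~ N(mu, 1/tau), with priors
    mu | tau ~ N(mu0, 1/(lam0*tau)) and tau ~ Gamma(a0, b0).
    Returns the parameters of q(mu) = N(mu_N, 1/lam_N) and
    q(tau) = Gamma(a_N, b_N).  Textbook sketch, not the speaker's code."""
    N, xbar = len(x), np.mean(x)
    mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)   # fixed by the model
    a_N = a0 + (N + 1) / 2.0                      # also fixed
    E_tau = a0 / b0                               # initial guess
    for _ in range(n_iter):
        # update q(mu) given the current expectation of tau
        lam_N = (lam0 + N) * E_tau
        # update q(tau) given the current q(mu)
        E_sq = (np.sum((x - mu_N) ** 2) + N / lam_N
                + lam0 * ((mu_N - mu0) ** 2 + 1.0 / lam_N))
        b_N = b0 + 0.5 * E_sq
        E_tau = a_N / b_N
    return mu_N, lam_N, a_N, b_N

# quick usage check on synthetic data
rng = np.random.default_rng(0)
print(mean_field_gaussian(rng.normal(loc=2.0, scale=0.5, size=500)))
```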

Biosketch: Iead Rezek is a lecturer at Imperial College, a post he has held since 2008. He received a Dipl.-Ing. from the Fachhochschule Ulm, Germany, and a BSc from the University of Plymouth, UK, in 1991, both in electrical engineering. He received his MSc in Physical Sciences and Engineering in Medicine in 1992 and his PhD in Machine Learning in 1997, both from Imperial College London, UK. He was a Research Fellow at Oxford University from 2000 to 2007. His main research interests are in data-driven and statistical methods for pattern recognition, predominantly with application to biomedical and biological problems. He is currently a member of the Royal Statistical Society.