A Modified Neutral Point Method for Kernel-Based Fusion of Pattern-Recognition Modalities with Incomplete Data Sets
This book constitutes the thoroughly refereed post-conference proceedings of the 8th International Conference on Learning and Intelligent Optimization, LION 8, which was held in Gainesville, FL, USA, in February 2014. The 33 contributions presented were carefully reviewed and selected for inclusion in this book. A large variety of topics is covered, such as algorithm configuration; multiobjective optimization; metaheuristics; graphs and networks; logistics and transportation; and biomedical applications.
The support vector machine (SVM) is one of the best-known classes of supervised learning algorithms. In recent years, SVMs have found wide application in many fields and have given rise to many algorithmic and modeling variations. Basic SVM models deal with the situation where the exact values of the data points are known. This paper presents a survey of SVMs in which the data points are uncertain. When a direct model cannot guarantee generally good performance over the uncertainty set, robust optimization is introduced to handle the worst-case scenario while still guaranteeing optimal performance. The data uncertainty may take the form of additive noise bounded in norm, for which efficient linear programming models are available under certain conditions; of intervals with known support and extreme values; or, more generally, of polyhedral uncertainty sets, for which formulations are presented. Another branch of uncertainty analysis is the chance-constrained SVM, which ensures a small probability of misclassification for the uncertain data. The multivariate Chebyshev inequality and Bernstein bounding schemes have been used to transform the chance constraints via robust optimization. The Chebyshev-based model employs moment information about the uncertain training points. The Bernstein bounds can be less conservative than the Chebyshev bounds, since they employ both support and moment information, but they also make the strong assumption that all elements of the data set are independent.
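The norm-bounded noise case mentioned above admits a standard worst-case reformulation; the following is an illustrative sketch (notation chosen here, not taken from the survey):

```latex
% Nominal soft-margin constraint for a training pair (x_i, y_i):
%   y_i (w^T x_i + b) >= 1 - xi_i
% With additive noise delta_i satisfying ||delta_i|| <= rho, the robust
% requirement  y_i (w^T (x_i + delta_i) + b) >= 1 - xi_i  for all
% admissible delta_i reduces to subtracting the worst-case margin loss:
\begin{align}
  y_i\,(w^\top x_i + b) \;-\; \rho\,\|w\|_{*} \;\ge\; 1 - \xi_i ,
\end{align}
% where ||.||_* denotes the dual norm. For the l_infty / l_1 norm pair
% the penalty rho*||w||_1 is polyhedral, which is what makes the linear
% programming models referred to above possible.
```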
In this paper, we use robust optimization models to formulate support vector machines (SVMs) with polyhedral uncertainties in the input data points. The formulations in our models are nonlinear; we use Lagrange multipliers to derive the first-order optimality conditions and reformulation methods to solve these problems. In addition, we propose models for transductive SVMs with input uncertainties.
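A sketch of how a polyhedral uncertainty set leads to a finite reformulation via linear programming duality (the symbols A_i, c_i, lambda_i are illustrative, not taken from the paper):

```latex
% Suppose each uncertain point ranges over a polyhedron
%   x_i \in P_i = \{ x : A_i x \le c_i \}.
% The robust constraint requires the margin to hold at the worst point:
%   \min_{x \in P_i} \; y_i (w^\top x + b) \;\ge\; 1 - \xi_i .
% By LP duality, the inner minimization over x equals
%   \max \{\, -c_i^\top \lambda \;:\; A_i^\top \lambda = -y_i w,\;
%             \lambda \ge 0 \,\},
% so the semi-infinite constraint is satisfied iff there exists a
% multiplier vector \lambda_i \ge 0 with
\begin{align}
  A_i^\top \lambda_i &= -\,y_i\, w, \\
  -\,c_i^\top \lambda_i + y_i\, b &\ge 1 - \xi_i .
\end{align}
% The resulting problem is finite-dimensional, and first-order
% (KKT) conditions can be written via Lagrange multipliers.
```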
This book constitutes the refereed proceedings of the 10th International Conference on Machine Learning and Data Mining in Pattern Recognition, MLDM 2014, held in St. Petersburg, Russia, in July 2014. The 40 full papers presented were carefully reviewed and selected from 128 submissions. The topics range from theoretical topics for classification, clustering, association rule and pattern mining to specific data mining methods for different multimedia data types such as image mining, text mining, video mining and Web mining.
Summarizes the latest applications of robust optimization in data mining.
An essential accompaniment for theoreticians and data miners.
Data uncertainty is a concept closely associated with most real-life applications that involve data collection and interpretation. Examples can be found in data acquired with biomedical instruments or other experimental techniques. Integrating robust optimization into existing data mining techniques aims to create new algorithms resilient to error and noise.
The support vector machine (SVM) is one of the best-known classes of supervised learning algorithms. Basic SVM models deal with the situation where the exact values of the data points are known. This paper studies SVMs in which the data points are uncertain. When some properties of the distributions are known, chance-constrained SVM is used to ensure a small probability of misclassification for the uncertain data. Since infinitely many distributions may share the known properties, the robust chance-constrained SVM requires efficient transformations of the chance constraints to make the problem solvable. In this paper, robust chance-constrained SVM with second-order moment information is studied, and we obtain equivalent semidefinite programming and second-order cone programming reformulations. The geometric interpretation is presented and numerical experiments are conducted. Three types of estimation errors for the mean and covariance information are studied, and the corresponding formulations and techniques for handling each type of error are presented.
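The standard second-order-moment transformation of a chance constraint (via the multivariate Chebyshev inequality) turns it into a second-order cone constraint. The following is a minimal sketch of checking that constraint for one uncertain point; the function name and inputs are illustrative, not from the paper:

```python
import math

def chebyshev_soc_constraint(w, b, mu, sigma, y, xi, eps):
    """Check the Chebyshev-based second-order cone reformulation of the
    chance constraint  Pr[ y * (w . x + b) >= 1 - xi ] >= 1 - eps,
    where the uncertain point x has mean `mu` and covariance `sigma`:

        y * (w . mu + b) >= 1 - xi + kappa * sqrt(w' Sigma w),
        kappa = sqrt((1 - eps) / eps).

    Illustrative helper; all names are assumptions for this sketch."""
    kappa = math.sqrt((1.0 - eps) / eps)
    n = len(w)
    mean_margin = sum(w[i] * mu[i] for i in range(n))      # w . mu
    quad = sum(w[i] * sigma[i][j] * w[j]                   # w' Sigma w
               for i in range(n) for j in range(n))
    return y * (mean_margin + b) >= 1.0 - xi + kappa * math.sqrt(quad)

# A point well inside the positive side with small covariance satisfies
# the constraint at eps = 0.5 (kappa = 1) but fails for a very strict
# eps = 0.001 (kappa ~ 31.6), showing how tighter probability levels
# shrink the feasible region.
w, b = [1.0, 0.0], 0.0
mu, sigma = [3.0, 0.0], [[0.01, 0.0], [0.0, 0.01]]
print(chebyshev_soc_constraint(w, b, mu, sigma, 1, 0.0, 0.5))    # True
print(chebyshev_soc_constraint(w, b, mu, sigma, 1, 0.0, 0.001))  # False
```

The conic term kappa * sqrt(w' Sigma w) is exactly what makes the reformulated problem a second-order cone program, as the abstract states.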