The support vector machine (SVM) is one of the best-known classes of supervised learning algorithms. In recent years SVMs have found wide application in many fields and have spawned many algorithmic and modeling variations. Basic SVM models deal with the situation in which the exact values of the data points are known. This paper presents a survey of SVM models in which the data points are uncertain. When a direct model cannot guarantee good performance over the entire uncertainty set, robust optimization is introduced to hedge against the worst-case scenario while still guaranteeing optimal performance under it. The data uncertainty may take the form of additive noise bounded in norm, for which efficient linear programming models are available under certain conditions; of intervals with known support and extreme values; or, more generally, of polyhedral uncertainty sets, for which formulations are presented. Another branch of uncertainty analysis is the chance-constrained SVM, which ensures that the probability of misclassifying uncertain data remains small. The multivariate Chebyshev inequality and Bernstein bounding schemes have been used to transform the chance constraints into deterministic constraints via robust optimization. The Chebyshev-based model employs moment information about the uncertain training points. The Bernstein bounds can be less conservative than the Chebyshev bounds because they employ both support and moment information, but they also rest on the strong assumption that all elements of the data set are independent.
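To make the norm-bounded robust formulation concrete, the sketch below trains a linear SVM whose hinge loss is replaced by its worst case under additive noise with Euclidean norm at most rho: since the worst perturbation satisfies max over ||delta|| <= rho of -y w.delta = rho ||w||, the robust hinge loss is max(0, 1 - y(w.x + b) + rho ||w||). This is a minimal illustration, not a formulation from the surveyed papers; the function name robust_svm_fit, the toy data, and the parameter values rho, lam, lr are assumptions chosen for demonstration.

```python
import numpy as np

def robust_svm_fit(X, y, rho=0.1, lam=0.01, lr=0.05, epochs=200):
    """Subgradient descent on the worst-case (robust) hinge loss.

    Illustrative sketch: assumes additive noise bounded as ||delta||_2 <= rho,
    so the robust hinge loss is max(0, 1 - y*(w.x + b) + rho*||w||_2).
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        wnorm = np.linalg.norm(w) + 1e-12
        # Worst-case margins under the norm-bounded perturbation
        margins = 1.0 - y * (X @ w + b) + rho * wnorm
        active = margins > 0  # points whose robust hinge loss is positive
        # Subgradient of (average robust hinge loss + L2 penalty)
        gw = lam * w + (-(y[active, None] * X[active]).sum(axis=0)
                        + active.sum() * rho * w / wnorm) / n
        gb = -y[active].sum() / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# Two well-separated Gaussian clusters as toy data (an assumption).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2, 0.5, (20, 2)), rng.normal(-2, 0.5, (20, 2))])
y = np.array([1.0] * 20 + [-1.0] * 20)
w, b = robust_svm_fit(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
```

The chance-constrained variant surveyed above admits a similar deterministic reformulation: under the multivariate Chebyshev inequality, a constraint requiring misclassification probability at most epsilon for a point with mean mu_i and covariance Sigma_i becomes y_i(w.mu_i + b) >= 1 + kappa * sqrt(w' Sigma_i w) with kappa = sqrt((1 - epsilon) / epsilon), which is a second-order cone constraint.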