Chapter 8: Support Vector Machines

Assoc. Prof. Dr. Duong Tuan Anh
Faculty of Computer Science and Engineering, HCMC Univ. of Technology, 3/2015

Outline
1. Introduction
2. Support vector machine for binary classification
3. Multi-class classification
4. Learning with soft margin
5. SVM Tools
6. Applications of SVMs

1. Introduction
SVM is a method for the classification of both linear and nonlinear data. The algorithm works as follows: it uses a nonlinear mapping to transform the original training data into a higher dimension. Within this new dimension, it searches for the linear optimal separating hyperplane. With an appropriate nonlinear mapping to a sufficiently high dimension, data from two classes can be separated by a hyperplane.
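As a minimal sketch of this idea (the feature map φ(x) = (x, x²) and the toy data below are illustrative choices, not taken from the slides), a nonlinear mapping can turn 1-D data that no single threshold separates into 2-D data split cleanly by a line:

```python
# One-dimensional data that no single threshold can separate:
# class +1 sits at the extremes, class -1 in the middle.
data = [(-2.0, +1), (-1.0, -1), (1.0, -1), (2.0, +1)]

def phi(x):
    """Illustrative nonlinear mapping into 2-D: x -> (x, x^2)."""
    return (x, x * x)

# In the mapped space the line x2 = 2.5, i.e. w = (0, 1), b = -2.5,
# separates the two classes perfectly.
w, b = (0.0, 1.0), -2.5

def predict(x):
    z = phi(x)
    return +1 if w[0] * z[0] + w[1] * z[1] + b > 0 else -1

print(all(predict(x) == y for x, y in data))  # True: separable after mapping
```

The mapped points (-2, 4), (-1, 1), (1, 1), (2, 4) lie on two horizontal levels, so any line between x² = 1 and x² = 4 separates them, which is exactly the effect the slides describe.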
SVM finds this hyperplane using support vectors ("essential" training examples) and margins (defined by the support vectors).

Introduction (cont.)
The first paper on SVM was published in 1992 by V. Vapnik et al. Although the training time of SVMs can be very slow, they are highly accurate, and they are much less prone to overfitting than other methods. SVMs can be used for both prediction and classification, and have been applied to handwritten digit recognition, object recognition, speaker identification, and time series prediction.

2. Support vector machine for binary classification
The case when the data are linearly separable.
Example: a two-class problem, linearly separable. Let D = {(X1, y1), (X2, y2), …, (Xm, ym)}, where Xi is a training tuple and yi is its associated class label, yi ∈ {+1, −1}. Assume each tuple has two input attributes, A1 and A2. From the graph, the 2-D data are linearly separable: there are an infinite number of separating lines that could be drawn to separate all of the tuples of class +1 from all of the tuples of class −1.

Linearly separable data
There are an infinite number of separating hyperplanes or "decision boundaries". Which one is best?
Figure: The 2-D data are linearly separable.

SVM - Maximum marginal hyperplane
Generalizing to n dimensions, …
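To make the notions of margin and support vector concrete, here is a pure-Python sketch (the helper `geometric_margin` and the toy 2-D data are assumptions for illustration, not the slides' notation): for a candidate hyperplane w·x + b = 0 it computes the distance to the closest training tuple; the tuples achieving that distance are the support vectors, and the maximum marginal hyperplane is the candidate with the largest such distance.

```python
import math

def geometric_margin(w, b, data):
    """Distance from the hyperplane w.x + b = 0 to its closest correctly
    classified training point; points achieving it are the support vectors."""
    norm = math.hypot(w[0], w[1])
    dists = [(y * (w[0] * x[0] + w[1] * x[1] + b) / norm, x) for x, y in data]
    margin = min(d for d, _ in dists)
    support = [x for d, x in dists if abs(d - margin) < 1e-9]
    return margin, support

# Toy linearly separable 2-D data: class +1 above, class -1 below.
data = [((2, 3), +1), ((3, 4), +1), ((1, 0), -1), ((0, -1), -1)]

# Two candidate separating lines; the one with the larger margin is preferred.
for w, b in [((1.0, 1.0), -3.0), ((0.0, 1.0), -1.5)]:
    m, sv = geometric_margin(w, b, data)
    print(w, b, round(m, 3), sv)
```

Both candidates separate the data, but the second has the larger margin (1.5 versus about 1.414), so among these two it is the better decision boundary; a full SVM solver searches over all (w, b) for the maximum.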
