Support Vector Machine (SVM) Math: A Conceptual Look at Support Vector Machines
Math | Beginner
- 8 videos | 59m 21s
- Includes Assessment
- Earns a Badge
Simple to use yet efficient and reliable, support vector machines (SVMs) are supervised learning methods popularly used for classification tasks. This course uncovers the math behind SVMs, focusing on how an optimum SVM hyperplane for classification is computed. Explore how data is represented in a feature space and how a hyperplane can be found that separates the data linearly. Then, learn how to handle data that is not linearly separable. Investigate the optimization problem for SVM classifiers, looking at how the model's weights are adjusted during training to find the hyperplane that best separates the data points. Finally, apply gradient descent to solve the optimization problem for SVMs. When you're done, you'll have the foundational knowledge you need to start building and applying SVMs for machine learning.
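For reference, the optimization problem the course works toward can be written down compactly. The following is a standard soft-margin formulation in conventional notation (the symbols w, b, the slack variables ξ_i, and the regularization constant C are common textbook notation, not identifiers taken from the course videos): the separating hyperplane is the set of points x with w · x + b = 0, and training chooses w and b by solving

```latex
% Standard soft-margin SVM objective: maximize the margin 2/||w||
% (i.e., minimize ||w||^2) while penalizing margin violations \xi_i,
% with C controlling the trade-off between margin width and violations.
\min_{w,\,b,\,\xi}\;\; \tfrac{1}{2}\lVert w \rVert^{2} + C \sum_{i=1}^{n} \xi_i
\qquad \text{subject to} \qquad
y_i\,(w \cdot x_i + b) \,\ge\, 1 - \xi_i, \quad \xi_i \ge 0, \quad i = 1, \dots, n
```

Setting every ξ_i to zero recovers the hard-margin case; allowing ξ_i > 0 gives the soft margin used when the classes overlap.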
WHAT YOU WILL LEARN
- Discover the key concepts covered in this course
- Recognize the place of support vector machines (SVMs) in the machine learning landscape
- Outline how SVMs can be used to classify data, how hyperplanes are defined, and the qualities of an optimum hyperplane
- Recall the qualities of an optimum hyperplane, outline how scaling works with SVMs, distinguish soft and hard margins, and recognize when and how to use either margin
- Recall the techniques that can be applied to classify data that are not linearly separable
- Formulate the optimization problem for support vector machines
- Apply the gradient descent algorithm to solve for the optimum hyperplane (see the sketch after this list)
- Summarize the key concepts covered in this course
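As a companion to the last two objectives, here is a minimal sketch of solving the soft-margin objective above with subgradient descent on the hinge loss. It is an illustration under assumptions, not course code: the function name train_linear_svm, the synthetic two-blob dataset, and the hyperparameters (C, learning rate, epoch count) are all made up for the example.

```python
import numpy as np

def train_linear_svm(X, y, C=1.0, lr=0.01, epochs=200):
    """Subgradient descent on the soft-margin SVM objective:
    0.5*||w||^2 + C * sum_i max(0, 1 - y_i*(w.x_i + b)).
    X: (n, d) features; y: (n,) labels in {-1, +1}."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        violated = margins < 1          # points inside the margin or misclassified
        # Subgradient of the objective with respect to w and b
        grad_w = w - C * (y[violated] @ X[violated])
        grad_b = -C * np.sum(y[violated])
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two Gaussian blobs, labeled -1 and +1
    X_neg = rng.normal(loc=[-2.0, -2.0], scale=1.0, size=(50, 2))
    X_pos = rng.normal(loc=[2.0, 2.0], scale=1.0, size=(50, 2))
    X = np.vstack([X_neg, X_pos])
    y = np.array([-1] * 50 + [1] * 50)
    w, b = train_linear_svm(X, y)
    preds = np.sign(X @ w + b)
    print("training accuracy:", np.mean(preds == y))
```

Full-batch subgradient steps keep the sketch short; in practice, stochastic per-sample updates are a common alternative for the same objective.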
IN THIS COURSE
- 2m 47s
- 6m 8s
- 11m 27s
- 6m 40s
- 4m 58s
- 12m 32s
- 12m 40s
- 2m 9s
EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE
Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of some of our courses; these badges can be shared on any social network or business platform.
Digital badges are yours to keep, forever.