Exploring the Math in Support Vector Machines

#Machine Learning #SVM #Kernel Machines

This talk examines some of the mathematical intuition behind SVMs. The speaker, Dr. Vishnu S. Pendyala, presents ideas from the method of Lagrange Multipliers, Convex Hulls, Orthogonal Functions, Hilbert Spaces, Duality, Cover’s Theorem, and Mercer’s Theorem in an approachable, easy-to-understand way.

Support Vector Machines (SVMs) are used for supervised machine learning and have been successful in many applications, including ones such as image classification where deep learning is typically favored. SVM owes its power to the intriguing mathematics behind its construction. This talk introduces SVM and covers some of that math.

Topics covered also include constrained and unconstrained optimization, convexity, the general notion of a function space, min-max equilibrium, duality, Cover’s theorem, kernels, and Mercer’s theorem.
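The kernel idea mentioned above can be illustrated in a few lines of code. The sketch below (not taken from the talk; a minimal example using scikit-learn) shows Cover's theorem in action: data that is not linearly separable in the input space becomes separable once an RBF kernel implicitly maps it into a higher-dimensional Hilbert space.

```python
# Illustrative sketch: RBF kernel vs. linear SVM on linearly
# inseparable data (concentric circles).
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two concentric circles: no straight line separates them in 2-D.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

# A linear SVM struggles, while the RBF kernel implicitly maps the
# points into a feature space where a separating hyperplane exists.
linear_clf = SVC(kernel="linear").fit(X, y)
rbf_clf = SVC(kernel="rbf").fit(X, y)

print("linear accuracy:", linear_clf.score(X, y))
print("rbf accuracy:", rbf_clf.score(X, y))
```

Thanks to Mercer's theorem, the kernel function computes inner products in that feature space without ever constructing it explicitly.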

IEEE Day event organized by the Silicon Valley Chapter of the IEEE Computer Society, Oct 4, 2022. For access to past video webinars or to join our Dlist to hear about future programs, please visit https://r6.ieee.org/scv-cs

