Support Vector Machines (SVMs) are supervised learning models for classification and regression problems. They can solve both linear and nonlinear problems and use the concept of the margin to separate classes. SVMs often achieve better accuracy than k-nearest neighbors, decision trees, and naive Bayes classifiers, and have been known to outperform neural networks in a few instances. SVMs are popular because they are easy to implement, produce good accuracy with relatively little computation, and are a must-have in every machine learning expert's arsenal.
How do SVMs work?
The support vector machine algorithm's objective is to find a hyperplane in an N-dimensional space that distinctly classifies the data points. In SVM, we plot each data point in the dataset as a point in N-dimensional space. Many possible hyperplanes could be chosen to separate the two classes of data points. Our objective is to find the optimal separating hyperplane, also called the maximum-margin hyperplane: the one that separates the two classes while maximizing its distance to the nearest data points of each class. Maximizing this margin means future data points can be classified with higher confidence.
What is a hyperplane?
Hyperplanes are decision boundaries that aid in classifying the data points. In an N-dimensional space, a hyperplane is a flat affine subspace of dimension N-1. Visually, in a 2D space, a hyperplane will be a line, and in 3D space, it will be a flat plane. In simple terms, a hyperplane is a decision boundary, and data points falling on either side of the hyperplane can be associated with different classes.
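To make the hyperplane concrete, here is a minimal sketch (not part of the original walkthrough) that fits a linear SVM on a toy 2D dataset and reads off the learned hyperplane w1*x1 + w2*x2 + b = 0. The dataset and variable names are illustrative; coef_ and intercept_ are real scikit-learn attributes of a linear-kernel SVC:
# a minimal sketch: fit a linear SVM and inspect the hyperplane it learns
from sklearn import svm
from sklearn.datasets import make_blobs
# two well-separated clusters (illustrative toy data)
X_toy, Y_toy = make_blobs(n_samples=100, centers=2, random_state=6)
clf = svm.SVC(kernel="linear")
clf.fit(X_toy, Y_toy)
# in 2D the hyperplane is the line w1*x1 + w2*x2 + b = 0
w = clf.coef_[0]       # normal vector of the hyperplane
b = clf.intercept_[0]  # offset
print("hyperplane: %.3f*x1 + %.3f*x2 + %.3f = 0" % (w[0], w[1], b))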
What are Support Vectors?
Support Vectors are the data points that lie on or closest to the hyperplane and determine its position and orientation. Using these support vectors, we maximize the margin of the classifier. Deleting a support vector changes the position of the hyperplane, whereas any other point can be removed without affecting it: the hyperplane depends only on the support vectors. In the linearly separable case, the support vectors of both classes are equidistant from the hyperplane.
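Continuing the toy sketch above, scikit-learn exposes the support vectors of a trained classifier directly; support_vectors_ and n_support_ are real SVC attributes, while the data is again illustrative:
# a short sketch: inspect the support vectors of a trained SVC
from sklearn import svm
from sklearn.datasets import make_blobs
X_toy, Y_toy = make_blobs(n_samples=100, centers=2, random_state=6)
clf = svm.SVC(kernel="linear").fit(X_toy, Y_toy)
print(clf.support_vectors_)  # coordinates of the support vectors
print(clf.n_support_)        # number of support vectors per class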
Classification of higher-dimensional data using the kernel trick
SVM always uses a kernel function, whether the data is linear or nonlinear, but its true strength shows when the data is not separable in its current form. For nonlinear data, SVM uses the kernel trick: map the nonlinearly separable data from a lower dimension into a higher-dimensional space where a separating hyperplane exists. For example, a mapping function can transform a 2D nonlinear input space into a 3D space where the classes become separable, as the code below demonstrates with a concentric-circles dataset. Kernel functions let SVM compute inner products in that higher-dimensional space without ever constructing the mapping explicitly, which reduces the computational cost significantly.
Implementing SVM using Python
Write the following code to implement SVMs using the scikit-learn library:
# importing scikit-learn and other important libraries
from sklearn.datasets import make_circles
from sklearn import svm
from matplotlib import pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
import numpy as np
# generate two concentric circles: a dataset that is not linearly separable in 2D
X, Y = make_circles(n_samples=500, noise=0.02)
plt.scatter(X[:, 0], X[:, 1], c=Y)
plt.show()
def phi(X):
    """Non-linear transformation: lift 2D points into 3D."""
    X1 = X[:, 0]
    X2 = X[:, 1]
    X3 = X1**2 + X2**2  # squared distance from the origin separates the circles
    X_ = np.zeros((X.shape[0], 3))
    X_[:, :-1] = X  # keep the original two coordinates
    X_[:, -1] = X3  # add the new third coordinate
    return X_
def plot3d(X, show=True):
    fig = plt.figure(figsize=(10, 10))
    ax = fig.add_subplot(111, projection='3d')
    X1 = X[:, 0]
    X2 = X[:, 1]
    X3 = X[:, 2]
    # colors come from the global labels Y
    ax.scatter(X1, X2, X3, zdir='z', s=20, c=Y, depthshade=True)
    if show:
        plt.show()
    return ax
X_ = phi(X)      # apply the non-linear transformation
ax = plot3d(X_)  # the two circles are now separable by a flat plane
# using the RBF kernel so that SVC applies the kernel trick internally
svc = svm.SVC(kernel="rbf")
svc.fit(X, Y)
print(svc.score(X, Y))  # training accuracy
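As a quick sanity check (a sketch reusing X, Y, and X_ = phi(X) from the code above), a plain linear SVM fitted on the explicitly lifted 3D points separates the circles just as the RBF kernel does implicitly:
# a linear SVM on the explicitly transformed 3D data (reuses X_, Y from above)
svc_linear = svm.SVC(kernel="linear")
svc_linear.fit(X_, Y)
print(svc_linear.score(X_, Y))  # should be close to 1.0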


Visit the scikit-learn documentation on Support Vector Machines to learn more.
Happy learning!!
Nitish is a computer science undergraduate with a keen interest in the field of deep learning. He has done various projects related to deep learning and closely follows new advancements taking place in the field.