In these scenarios, there is no linear hyperplane that separates the two classes, so how does SVM classify them? SVM can solve this problem easily! It does so by introducing an additional feature. Here, we will add a new feature z.
For this case, let us define the new feature z as |x|. If you now plot the data points on the x and z axes, you will be able to draw a linear hyperplane between the two classes.
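A minimal sketch of this idea, using hypothetical 1-D data where one class sits near the origin and the other lies farther out (the data values and the threshold 1.75 are illustrative assumptions, not from the original article):

```python
import numpy as np

# Hypothetical 1-D data: class 0 sits near the origin, class 1 lies
# farther out on both sides, so no single threshold on x separates them.
x = np.array([-4.0, -3.0, -0.5, 0.0, 0.5, 3.0, 4.0])
y = np.array([1, 1, 0, 0, 0, 1, 1])

# Add the new feature z = |x|. In the (x, z) plane the classes
# become separable by the horizontal line z = 1.75.
z = np.abs(x)
predictions = (z > 1.75).astype(int)
print((predictions == y).all())  # True: a linear boundary on z separates the classes
```

The boundary is a simple horizontal line in (x, z) space, even though no such line exists in the original x space.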
But one question arises: do we need to add this feature manually to obtain a hyperplane? No, SVM has a technique called the kernel trick. Kernels are functions that take a low-dimensional input space and transform it into a higher-dimensional space, i.e., they convert a non-separable problem into a separable one. This is mostly useful in non-linear separation problems. Simply put, the kernel performs some extremely complex data transformations, then finds out the process to separate the data based on the labels or outputs you have defined.
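The key point of the kernel trick is that a kernel computes the dot product in the higher-dimensional space without ever constructing that space explicitly. A small sketch with a degree-2 polynomial kernel (the points a and b are illustrative; the mapping phi is the standard explicit feature map for this kernel):

```python
import numpy as np

# Two hypothetical 2-D points.
a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0])

# Explicit mapping to a higher-dimensional space for the degree-2
# polynomial kernel: phi(u) = (u1^2, u2^2, sqrt(2)*u1*u2).
def phi(u):
    return np.array([u[0] ** 2, u[1] ** 2, np.sqrt(2) * u[0] * u[1]])

explicit = phi(a) @ phi(b)   # dot product after transforming both points
kernel = (a @ b) ** 2        # polynomial kernel K(a, b) = (a . b)^2
print(np.isclose(explicit, kernel))  # True: same value, no explicit transform needed
```

This is why SVM can work in very high-dimensional (even infinite-dimensional) spaces cheaply: it only ever needs kernel values between pairs of points, never the transformed features themselves.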
For more detail, you can refer to the article http://www.analyticsvidhya.com/blog/2015/10/understaing-support-vector-machine-example-code/.
Hope this helps!