Most SVM software packages (including svmTrain.m) automatically add the extra feature x0 = 1 for you and automatically take care of learning the intercept term θ0. So when passing your training data to the SVM software, there is no need to add this extra feature x0 = 1 yourself. In particular, in Octave/MATLAB your code should be working with training examples x ∈ R^n (rather than x ∈ R^(n+1)). If you are training an SVM on a real problem, especially if you need to scale to a larger dataset, we strongly recommend instead using a highly optimized SVM toolbox such as LIBSVM.

function sim = gaussianKernel(x1, x2, sigma)
% Ensure that x1 and x2 are column vectors
x1 = x1(:); x2 = x2(:);
% Gaussian kernel: similarity decays exponentially with the
% squared Euclidean distance between x1 and x2
sim = exp(-sum((x1 - x2).^2) / (2 * sigma^2));
end
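For readers working outside Octave/MATLAB, the same kernel computation can be sketched in Python with NumPy (the function name gaussian_kernel is my own; np.ravel plays the role of the x(:) column-vector reshaping):

```python
import numpy as np

def gaussian_kernel(x1, x2, sigma):
    """Gaussian (RBF) kernel similarity between two vectors."""
    x1 = np.ravel(x1)  # flatten to 1-D, mirroring x1 = x1(:)
    x2 = np.ravel(x2)
    # exp(-||x1 - x2||^2 / (2 * sigma^2))
    return np.exp(-np.sum((x1 - x2) ** 2) / (2 * sigma ** 2))

# Example: squared distance between [1,2,1] and [0,4,-1] is 9,
# so with sigma = 2 the similarity is exp(-9/8)
print(gaussian_kernel([1, 2, 1], [0, 4, -1], 2))
```

Note that identical vectors give a similarity of exactly 1, and the similarity falls toward 0 as the vectors move apart; sigma controls how quickly it falls.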
When implementing cross validation to select the best C and σ parameters to use, you need to evaluate the error on the cross validation set. Recall that for classification, the error is defined as the fraction of the cross validation examples that were classified incorrectly. In Octave/MATLAB, you can compute this error using mean(double(predictions ~= yval)), where predictions is a vector containing all the predictions from the SVM, and yval are the true labels from the cross validation set.
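The same misclassification-fraction computation can be sketched in Python with NumPy (cv_error is a hypothetical helper name; np.mean over a boolean array is the analogue of mean(double(predictions ~= yval))):

```python
import numpy as np

def cv_error(predictions, yval):
    """Fraction of cross-validation examples classified incorrectly."""
    predictions = np.asarray(predictions)
    yval = np.asarray(yval)
    # (predictions != yval) is a boolean array; its mean is the error rate
    return np.mean(predictions != yval)

# Example: 2 of 4 predictions disagree with the true labels
print(cv_error([1, 0, 1, 1], [1, 1, 1, 0]))  # 0.5
```

A loop over candidate (C, σ) pairs would train an SVM for each pair, compute this error on the validation set, and keep the pair with the lowest error.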