I have tried logistic regression with polynomial features, and fortunately it works fine for me; I am also able to plot the decision curve. I used the map_feature function for the polynomial features. (I referred to Prof. Andrew Ng's notes on logistic regression with regularization): http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=MachineLearning&doc=exercises/ex5/ex5.html
Now I am trying to achieve the same using a Gaussian kernel instead of polynomial features. Fortunately my cost function (j_theta) works fine and decreases after every iteration, and I get my final theta value. The problem I face now is: HOW DO I PLOT THE DECISION BOUNDARY here?
I am using Octave to develop the algorithms and plot the graphs.
Below are the details of my data set sizes.
Original Data set:
Data set (x): [20*3], where the first column is the intercept (bias) column
1.00 2.0000 1.0000
1.00 3.0000 1.0000
1.00 4.0000 1.0000
1.00 5.0000 2.0000
1.00 5.0000 3.0000
.
.
.
Data set with new features after applying the Gaussian kernel
Data set (f): [20*21], where the first column is the intercept column with all values equal to 1
1.0000e+000 1.0000e+000 6.0653e-001 1.3534e-001 6.7379e-003 . . . . . . . .
1.0000e+000 6.0653e-001 1.0000e+000 6.0653e-001 8.2085e-002 . . . . . . . .
1.0000e+000 1.3534e-001 6.0653e-001 1.0000e+000 3.6788e-001
1.0000e+000 6.7379e-003 8.2085e-002 3.6788e-001 1.0000e+000
. .
. .
. .
. .
. .
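For reference, the kernel feature matrix (f) above is consistent with a Gaussian kernel of bandwidth sigma = 1 evaluated between every pair of training points (sigma = 1 is an inference from the printed values, e.g. exp(-0.5) ≈ 6.0653e-01 for two points at distance 1, not something the question states). A minimal sketch of that mapping, written in Python/NumPy for illustration since the question's own Octave code is not shown:

```python
import numpy as np

def gaussian_kernel_features(X, landmarks, sigma=1.0):
    """Map each row of X to Gaussian-kernel similarities with every landmark,
    and prepend an intercept column of ones."""
    # Squared Euclidean distance between every (row, landmark) pair
    sq_dists = ((X[:, None, :] - landmarks[None, :, :]) ** 2).sum(axis=2)
    f = np.exp(-sq_dists / (2.0 * sigma ** 2))
    return np.hstack([np.ones((X.shape[0], 1)), f])

# First five training points from the question (intercept column dropped)
X = np.array([[2.0, 1.0],
              [3.0, 1.0],
              [4.0, 1.0],
              [5.0, 2.0],
              [5.0, 3.0]])

# Using the training points themselves as landmarks gives an m x (m+1) matrix
F = gaussian_kernel_features(X, X, sigma=1.0)
# First row: [1, 1, exp(-0.5), exp(-2), exp(-5), ...],
# matching 1.0000, 6.0653e-001, 1.3534e-001, 6.7379e-003 in the question
print(F[0])
```

With all 20 training points as landmarks this produces exactly the [20*21] shape described above.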
The cost function graph that I get after applying gradient descent on my new feature data set (f) decreases as expected (plot not reproduced here).
Hence I get my new theta value:
theta: [21*1]
3.8874e+000
1.1747e-001
3.5931e-002
-8.5937e-005
-1.2666e-001
-1.0584e-001
.
.
.
The problem I face now is how to construct my decision curve on the original data set, given the new feature data set and the theta value. I have no clue how to proceed.
I would be glad to get some clues, tutorials, or links that could help me solve my problem.
I appreciate your help. Thanks.
The referenced Andrew Ng note actually contains a very good example of how to draw the decision boundary. Also see this stackoverflow post. The basic steps to follow are as below:
1. Sample a uniform grid of points covering your feature space X.
2. For each grid point, compute the score Z (the height on the contour plot) from the model predictor -- here map_feature(u, v) * theta.
3. Plot the contour curve of Z at level zero.
In the sample code below, we assume a 2-D feature space in which each dimension ranges from -1 to 1.5, sampled at 200 points. For each point in the grid we call the model predictor to get the point's score, and the boundary is finally drawn by calling the contour function in Matlab.
Plotting the decision boundary here will be trickier than plotting the best-fit curve in linear regression. You will need to plot the $\theta^T x = 0$ line implicitly, by plotting a contour. This can be done by evaluating $\theta^T x$ over a grid of points representing the original $u$ and $v$ inputs, and then plotting the line where $\theta^T x$ evaluates to zero. The plot implementation for Matlab/Octave is given below.
% Define the ranges of the grid
u = linspace(-1, 1.5, 200);
v = linspace(-1, 1.5, 200);
% Initialize space for the values to be plotted
z = zeros(length(u), length(v));
% Evaluate z = theta' * x over the grid
for i = 1:length(u)
    for j = 1:length(v)
        z(i,j) = map_feature(u(i), v(j)) * theta;
    end
end
% Because of the way contour plotting works in Matlab,
% we need to transpose z so that its rows correspond to v
% and its columns to u, or else the axis orientation will
% be flipped! (The semicolon also suppresses printing the
% 200x200 matrix.)
z = z';
% Plot z = 0 by specifying the single contour level [0, 0]
contour(u, v, z, [0, 0], 'LineWidth', 2)
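For the Gaussian-kernel case asked about here, the only change to the recipe above is the feature mapping inside the double loop: instead of map_feature(u, v), each grid point has to be mapped to kernel similarities against the same landmarks (training points) and the same sigma that were used to build f, and then scored against theta. Below is a sketch of the whole procedure written in Python/NumPy/Matplotlib for illustration; the landmarks, theta values, grid ranges, and sigma = 1 are placeholder assumptions, so substitute your actual training matrix and fitted theta:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; drop this line for on-screen plots
import matplotlib.pyplot as plt

def kernel_score(point, landmarks, theta, sigma=1.0):
    """Score theta' * [1; f] for one 2-D point, where f holds Gaussian
    similarities between the point and every landmark (training point)."""
    sq_dists = ((landmarks - point) ** 2).sum(axis=1)
    f = np.concatenate(([1.0], np.exp(-sq_dists / (2.0 * sigma ** 2))))
    return f @ theta

# Hypothetical stand-ins: use your real training points and fitted theta
landmarks = np.array([[2.0, 1.0], [3.0, 1.0], [4.0, 1.0], [5.0, 2.0], [5.0, 3.0]])
theta = np.array([-0.5, 0.3, 0.3, 0.3, 0.3, 0.3])  # length = 1 + n_landmarks

# Grid over the region of interest (ranges chosen to cover the sample data)
u = np.linspace(0.0, 6.0, 100)
v = np.linspace(0.0, 4.0, 100)
z = np.zeros((len(v), len(u)))  # rows index v, columns index u
for i, ui in enumerate(u):
    for j, vj in enumerate(v):
        z[j, i] = kernel_score(np.array([ui, vj]), landmarks, theta)

# The decision boundary is the z = 0 level curve
plt.contour(u, v, z, levels=[0.0], linewidths=2)
plt.savefig("decision_boundary.png")
```

Note that z is built directly with rows indexed by v here, so no transpose is needed before calling contour; points near the landmarks score positive with this toy theta, while far-away points score negative.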