Andrew Ng Machine Learning, Week 2 (Linear Regression): Programming Exercises
1. Warm-up Exercise
function A = warmUpExercise()
  A = [];
  % Return the 5x5 identity matrix
  A = eye(5);
end
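For comparison, the same 5x5 identity matrix can be built in plain Python (a sketch using a list comprehension; the `eye` name here just mirrors Octave's builtin):

```python
# Build an n-by-n identity matrix, mirroring Octave's eye(n).
def eye(n):
    return [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

A = eye(5)
for row in A:
    print(row)
```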
2. Computing Cost (for One Variable)
function J = computeCost(X, y, theta)
  m = length(y);  % number of training examples
  J = 0;
  % Vectorized cost: J(theta) = 1/(2m) * sum((X*theta - y).^2)
  J = sum((X * theta - y) .^ 2) / (2 * m);
end
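The same cost can be checked in plain Python (a sketch with a hypothetical toy dataset, no NumPy; each row of X carries a leading 1 for the intercept term):

```python
# Squared-error cost: J(theta) = 1/(2m) * sum_i (theta . x_i - y_i)^2
def compute_cost(X, y, theta):
    m = len(y)
    total = 0.0
    for xi, yi in zip(X, y):
        pred = sum(a * b for a, b in zip(xi, theta))  # hypothesis h(x) = theta' * x
        total += (pred - yi) ** 2
    return total / (2 * m)

# Toy data where y = x exactly, so theta = [0, 1] gives zero cost.
X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
y = [1.0, 2.0, 3.0]
print(compute_cost(X, y, [0.0, 1.0]))  # -> 0.0
print(compute_cost(X, y, [0.0, 0.0]))  # -> 14/6, i.e. (1 + 4 + 9) / (2 * 3)
```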
3. Gradient Descent (for One Variable)
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
  m = length(y);                    % number of training examples
  J_history = zeros(num_iters, 1);  % cost recorded after each iteration

  for iter = 1:num_iters
    % Vectorized batch update: theta := theta - (alpha/m) * X' * (X*theta - y)
    theta = theta - alpha * (X' * (X * theta - y)) / m;
    J_history(iter) = computeCost(X, y, theta);
  end
end
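The same batch update can be sketched in plain Python on the toy data above (learning rate and iteration count are illustrative choices, not values from the course):

```python
def gradient_descent(X, y, theta, alpha, num_iters):
    m = len(y)
    theta = list(theta)
    for _ in range(num_iters):
        # errors_i = h(x_i) - y_i for every training example
        errors = [sum(a * b for a, b in zip(xi, theta)) - yi
                  for xi, yi in zip(X, y)]
        # Simultaneous update: theta_j := theta_j - (alpha/m) * sum_i errors_i * x_ij
        theta = [tj - alpha / m * sum(e * xi[j] for e, xi in zip(errors, X))
                 for j, tj in enumerate(theta)]
    return theta

# Toy data with y = x; descent should drive theta toward [0, 1].
X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
y = [1.0, 2.0, 3.0]
theta = gradient_descent(X, y, [0.0, 0.0], 0.1, 1000)
print(theta)  # close to [0.0, 1.0]
```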
4. Feature Normalization
function [X_norm, mu, sigma] = featureNormalize(X)
  mu = mean(X);       % per-feature mean
  sigma = std(X);     % per-feature sample standard deviation
  % Note: automatic broadcasting needs Octave >= 3.6 / MATLAB >= R2016b;
  % on older versions use bsxfun instead.
  X_norm = (X - mu) ./ sigma;
end
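The same z-score normalization (subtract each feature's mean, divide by its standard deviation) in plain Python, as a sketch with hypothetical house-size/bedroom data; `sigma` uses the sample standard deviation (n-1 denominator), matching Octave's default `std`:

```python
import math

def feature_normalize(X):
    m, n = len(X), len(X[0])
    mu = [sum(row[j] for row in X) / m for j in range(n)]
    # Sample standard deviation (n-1 denominator), as Octave's std computes.
    sigma = [math.sqrt(sum((row[j] - mu[j]) ** 2 for row in X) / (m - 1))
             for j in range(n)]
    X_norm = [[(row[j] - mu[j]) / sigma[j] for j in range(n)] for row in X]
    return X_norm, mu, sigma

# Hypothetical data: [house size in sq ft, number of bedrooms].
X = [[2104.0, 3.0], [1600.0, 3.0], [2400.0, 4.0]]
X_norm, mu, sigma = feature_normalize(X)
print(mu)      # per-feature means
print(X_norm)  # each column now has mean 0
```

After normalization every column has zero mean and unit sample variance, which keeps gradient descent from being dominated by the large-scale feature.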