UFLDL Exercise: Softmax Regression

    This is the UFLDL exercise on softmax regression.

    The exercise mainly involves implementing two files: softmaxCost.m and softmaxPredict.m.
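
    For reference, the cost function with weight decay and its gradient, as given on the UFLDL wiki [1], are:

    J(\theta) = -\frac{1}{m}\left[ \sum_{i=1}^{m}\sum_{j=1}^{k} 1\{y^{(i)}=j\} \log \frac{e^{\theta_j^\top x^{(i)}}}{\sum_{l=1}^{k} e^{\theta_l^\top x^{(i)}}} \right] + \frac{\lambda}{2}\sum_{i=1}^{k}\sum_{j=1}^{n}\theta_{ij}^2

    \nabla_{\theta_j} J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[ x^{(i)} \left( 1\{y^{(i)}=j\} - p(y^{(i)}=j \mid x^{(i)}; \theta) \right) \right] + \lambda \theta_j

    In the code below, subtracting the column-wise maximum before exponentiating leaves these probabilities unchanged but prevents overflow in exp.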

    softmaxCost.m

    function [cost, grad] = softmaxCost(theta, numClasses, inputSize, lambda, data, labels)
    % numClasses - the number of classes
    % inputSize  - the size N of the input vector
    % lambda     - weight decay parameter
    % data       - the N x M input matrix, where each column data(:, i)
    %              corresponds to a single test set
    % labels     - an M x 1 matrix containing the labels corresponding
    %              for the input data

    % Unroll the parameters from theta
    theta = reshape(theta, numClasses, inputSize);
    numCases = size(data, 2);
    groundTruth = full(sparse(labels, 1:numCases, 1));
    %cost = 0;
    %thetagrad = zeros(numClasses, inputSize);

    %% ---------- YOUR CODE HERE --------------------------------------
    %  Instructions: Compute the cost and gradient for softmax regression.
    %                You need to compute thetagrad and cost.
    %                The groundTruth matrix might come in handy.

    M = theta * data;
    M = bsxfun(@minus, M, max(M, [], 1));   % subtract column max for numerical stability
    M = exp(M);
    M = bsxfun(@rdivide, M, sum(M));        % column-wise softmax probabilities
    thetagrad = (groundTruth - M) * data' / (-numCases) + lambda * theta;
    cost = groundTruth(:)' * log(M(:)) / (-numCases) + sum(theta(:).^2) * lambda / 2;

    % ------------------------------------------------------------------
    % Unroll the gradient matrices into a vector for minFunc
    grad = [thetagrad(:)];

    end
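
    Before training, it is worth verifying the analytic gradient numerically (the starter code provides computeNumericalGradient.m for this). Below is a minimal self-contained sketch of a central-difference check; the problem sizes and lambda are arbitrary illustrative choices, not values from the exercise:

    % Minimal gradient-check sketch (sizes and lambda are arbitrary):
    inputSize  = 8;                          % hypothetical small input dimension
    numClasses = 4;                          % hypothetical number of classes
    lambda     = 1e-4;                       % weight decay parameter
    data   = randn(inputSize, 100);          % 100 random input columns
    labels = randi(numClasses, 100, 1);      % random labels in 1..numClasses
    theta  = 0.005 * randn(numClasses * inputSize, 1);

    [cost, grad] = softmaxCost(theta, numClasses, inputSize, lambda, data, labels);

    % Central-difference approximation of each gradient component.
    epsilon = 1e-4;
    numGrad = zeros(size(theta));
    for i = 1:numel(theta)
        e = zeros(size(theta));
        e(i) = epsilon;
        numGrad(i) = (softmaxCost(theta + e, numClasses, inputSize, lambda, data, labels) ...
                    - softmaxCost(theta - e, numClasses, inputSize, lambda, data, labels)) ...
                    / (2 * epsilon);
    end
    disp(norm(numGrad - grad) / norm(numGrad + grad));  % should be very small (~1e-9)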

    softmaxPredict.m

    function [pred] = softmaxPredict(softmaxModel, data)
    % softmaxModel - model trained using softmaxTrain
    % data         - the N x M input matrix, where each column data(:, i)
    %                corresponds to a single test set
    %
    % Your code should produce the prediction matrix
    % pred, where pred(i) is argmax_c P(y(c) | x(i)).

    % Unroll the parameters from theta
    theta = softmaxModel.optTheta;  % this provides a numClasses x inputSize matrix
    pred = zeros(1, size(data, 2));

    %% ---------- YOUR CODE HERE --------------------------------------
    %  Instructions: Compute pred using theta assuming that the labels start
    %                from 1.

    %numClasses = softmaxModel.numClasses;
    %inputSize = softmaxModel.inputSize;
    %theta = reshape(theta, numClasses, inputSize);
    M = theta * data;
    M = bsxfun(@minus, M, max(M, [], 1));
    M = exp(M);
    M = bsxfun(@rdivide, M, sum(M));   % softmax normalization (not strictly needed for the argmax)
    [maxv, pred] = max(M);             % predicted class = row index of the column maximum

    % ---------------------------------------------------------------------
    end

    With 100 training iterations, the accuracy is 92.640%.
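
    For context, this is roughly how the exercise driver ties the two functions together. softmaxTrain comes from the UFLDL starter code, and the MNIST variable names (trainImages, trainLabels, testImages, testLabels) are placeholders assumed here, not taken from this post:

    % Sketch of the surrounding driver code, assuming the starter code's
    % softmaxTrain and pre-loaded MNIST matrices:
    inputSize  = 28 * 28;   % MNIST images are 28x28
    numClasses = 10;        % digits 0-9, stored as labels 1-10
    lambda     = 1e-4;      % weight decay parameter

    options.maxIter = 100;  % the 100 iterations mentioned above
    softmaxModel = softmaxTrain(inputSize, numClasses, lambda, ...
                                trainImages, trainLabels, options);

    pred = softmaxPredict(softmaxModel, testImages);
    acc  = mean(testLabels(:) == pred(:));
    fprintf('Accuracy: %0.3f%%\n', acc * 100);   % the exercise reports 92.640% here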

    References:

    [1] http://deeplearning.stanford.edu/wiki/index.php/Softmax回归

    [2] http://deeplearning.stanford.edu/wiki/index.php/Exercise:Softmax_Regression

    [3] http://www.cnblogs.com/tornadomeet/archive/2013/03/23/2977621.html (parts of the implementation were revised with reference to this article, which sped up training)
