Exercise:PCA in 2D

Link to the exercise: Exercise:PCA in 2D

pca_2d.m

close all

%%================================================================
%% Step 0: Load data
%  We have provided the code to load data from pcaData.txt into x.
%  x is a 2 * 45 matrix, where the kth column x(:,k) corresponds to
%  the kth data point. Here we provide the code to load the data into x.
%  You do not need to change the code below.

x = load('pcaData.txt','-ascii');
figure(1);
scatter(x(1, :), x(2, :));
title('Raw data');

%%================================================================
%% Step 1a: Implement PCA to obtain U
%  Implement PCA to obtain the rotation matrix U, which is the eigenbasis
%  of sigma.

% -------------------- YOUR CODE HERE --------------------
%u = zeros(size(x, 1)); % You need to compute this
sigma = (x * x') ./ size(x, 2); % covariance matrix (the exercise data is treated as zero-mean)
[u, s, v] = svd(sigma);         % columns of u are the eigenvectors of sigma
% --------------------------------------------------------

hold on
plot([0 u(1,1)], [0 u(2,1)]); % first principal direction
plot([0 u(1,2)], [0 u(2,2)]); % second principal direction
scatter(x(1, :), x(2, :));
hold off
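
% Optional sanity check (not part of the original exercise): u returned by
% svd of the symmetric covariance matrix should be orthonormal (u'*u is the
% identity up to round-off), and u*s*u' should reconstruct sigma.
fprintf('||u''*u - I||      = %g\n', norm(u' * u - eye(size(u, 1))));
fprintf('||u*s*u'' - sigma|| = %g\n', norm(u * s * u' - sigma));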

%%================================================================
%% Step 1b: Compute xRot, the projection on to the eigenbasis
%  Now, compute xRot by projecting the data on to the basis defined
%  by U. Visualize the points by performing a scatter plot.

% -------------------- YOUR CODE HERE --------------------
%xRot = zeros(size(x)); % You need to compute this
xRot = u' * x; % rotate the data into the eigenbasis
% --------------------------------------------------------

% Visualise the covariance matrix. You should see a line across the
% diagonal against a blue background.
figure(2);
scatter(xRot(1, :), xRot(2, :));
title('xRot');
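
% Optional sanity check (not part of the original exercise): in the rotated
% coordinates the empirical covariance should be diagonal and equal to s up
% to round-off; covRot is an illustrative helper variable.
covRot = (xRot * xRot') ./ size(xRot, 2);
fprintf('||cov(xRot) - s|| = %g\n', norm(covRot - s));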

%%================================================================
%% Step 2: Reduce the number of dimensions from 2 to 1.
%  Compute xRot again (this time projecting to 1 dimension).
%  Then, compute xHat by projecting the xRot back onto the original axes
%  to see the effect of dimension reduction.

% -------------------- YOUR CODE HERE --------------------
k = 1; % Use k = 1 and project the data onto the first eigenbasis
%xHat = zeros(size(x)); % You need to compute this
% Recovering an approximation of the data:
xRot(k+1:size(x,1), :) = 0; % zero out all but the first k components
xHat = u * xRot;            % rotate back onto the original axes
% --------------------------------------------------------
figure(3);
scatter(xHat(1, :), xHat(2, :));
title('xHat');
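
% Optional check (not part of the original exercise): zeroing rows of xRot and
% rotating back is equivalent to projecting onto the first k eigenvectors and
% rotating back; xHatAlt is an illustrative helper variable, and the
% difference should be zero up to round-off.
xHatAlt = u(:, 1:k) * (u(:, 1:k)' * x);
fprintf('||xHat - xHatAlt|| = %g\n', norm(xHat - xHatAlt));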

%%================================================================
%% Step 3: PCA Whitening
%  Compute xPCAWhite and plot the results.

epsilon = 1e-5; % regularisation constant
% -------------------- YOUR CODE HERE --------------------
%xPCAWhite = zeros(size(x)); % You need to compute this
xPCAWhite = diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x; % rescale each component by 1/sqrt(eigenvalue)
% --------------------------------------------------------
figure(4);
scatter(xPCAWhite(1, :), xPCAWhite(2, :));
title('xPCAWhite');
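
% Optional sanity check (not part of the original exercise): after PCA
% whitening the empirical covariance should be close to the identity (only
% approximately, because of the epsilon regularisation); covWhite is an
% illustrative helper variable.
covWhite = (xPCAWhite * xPCAWhite') ./ size(xPCAWhite, 2);
fprintf('||cov(xPCAWhite) - I|| = %g\n', norm(covWhite - eye(size(covWhite, 1))));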

%%================================================================
%% Step 4: ZCA Whitening
%  Compute xZCAWhite and plot the results.

% -------------------- YOUR CODE HERE --------------------
%xZCAWhite = zeros(size(x)); % You need to compute this
xZCAWhite = u * xPCAWhite; % rotate the PCA-whitened data back to the original axes
% --------------------------------------------------------
figure(5);
scatter(xZCAWhite(1, :), xZCAWhite(2, :));
title('xZCAWhite');
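
% Optional sanity check (not part of the original exercise): ZCA whitening can
% equivalently be written directly in terms of u, s and x; xZCAWhiteAlt is an
% illustrative helper and should match xZCAWhite up to round-off.
xZCAWhiteAlt = u * diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x;
fprintf('||xZCAWhite - xZCAWhiteAlt|| = %g\n', norm(xZCAWhite - xZCAWhiteAlt));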

%% Congratulations! When you have reached this point, you are done!
%  You can now move onto the next PCA exercise. :)