
Gaussian Process Regression

I am writing a Gaussian process regression algorithm. Here is the code:

% Data generating function

fh = @(x)(2*cos(2*pi*x/10).*x);

% range

x = -5:0.01:5;
N = length(x);

% Sampled data points from the generating function

M = 50;
selection = boolean(zeros(N,1));
j = randsample(N, M);

% mark them

selection(j) = 1;
Xa = x(j);

% compute the function and extract mean

f = fh(Xa) - mean(fh(Xa));
sigma2 = 1;

% computing the interpolation using all x's
% It is expected that for points used to build the GP cov. matrix, the
% uncertainty is reduced...

K = squareform(pdist(x'));
K = exp(-(0.5*K.^2)/sigma2);

% upper left corner of K

Kaa = K(selection,selection);

% lower right corner of K

Kbb = K(~selection,~selection);

% upper right corner of K

Kab = K(selection,~selection);

% mean of posterior

m = Kab'*inv(Kaa+0.001*eye(M))*f';

% cov. matrix of posterior

D = Kbb - Kab'*inv(Kaa + 0.001*eye(M))*Kab;

% sampling M functions from the GP

[A,B,C] = svd(Kaa);
F0 = A*sqrt(B)*randn(M,M);
% mean from GP using sampled points

F0m = mean(F0,2);
F0d = std(F0,0,2);

%%
% put together data and estimation

F = zeros(N,1);
S = zeros(N,1);
F(selection) = f' + F0m;
S(selection) = F0d;

% sampling M function from posterior

[A,B,C] = svd(D);
a = A*sqrt(B)*randn(N-M,M);
% mean from posterior GPs

Fm = m + mean(a,2);
Fmd = std(a,0,2);
F(~selection) = Fm;
S(~selection) = Fmd;

%%

figure;
% show what we got...

plot(x, F, ':r', x, F-2*S, ':b', x, F+2*S, ':b'), grid on;
hold on;
% show points we got

plot(Xa, f, 'Ok');
% show the whole curve

plot(x, fh(x)-mean(fh(x)), 'k');
grid on;

I was hoping to get a nice figure where the uncertainty at unknown data points is large and the uncertainty around the sampled data points is small. Instead I get a strange figure, and, even stranger, the uncertainty around the sampled data points is larger than for the rest of the data. Can someone explain what I am doing wrong? Thanks!!

There are a few things wrong with your code. Here are the most important points:

  • The main error that throws everything off is the indexing of f. You are defining Xa = x(j), but you should actually use Xa = x(selection), so that the indexing is consistent with the indexing you use on the kernel matrix K.

  • Subtracting the sample mean, f = fh(Xa) - mean(fh(Xa)), accomplishes nothing and makes the circles in your plot miss the actual function. (If you choose to subtract something, it should be a fixed number or a fixed function, not something that depends on the randomly sampled observations.)

  • You should compute the posterior mean and variance directly from m and D; there is no need to sample from the posterior and then obtain estimates from those samples.
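For the last point, the posterior quantities follow from the standard GP conditioning formulas. A minimal NumPy sketch (hypothetical helper names, same hyperparameters as the fixed MATLAB script) of computing them directly from the kernel blocks:

```python
import numpy as np

def sq_exp_kernel(xa, xb, var_kernel=10.0, sigma2=2.0):
    """Squared-exponential kernel: var_kernel * exp(-0.5 * d^2 / sigma2)."""
    d2 = (np.asarray(xa)[:, None] - np.asarray(xb)[None, :]) ** 2
    return var_kernel * np.exp(-0.5 * d2 / sigma2)

def gp_posterior(x_obs, f_obs, x_new, sigma_noise=0.01):
    """Posterior mean and covariance of the GP at x_new, given noisy observations."""
    Kaa = sq_exp_kernel(x_obs, x_obs) + sigma_noise * np.eye(len(x_obs))
    Kab = sq_exp_kernel(x_obs, x_new)
    Kbb = sq_exp_kernel(x_new, x_new)
    # Solve linear systems instead of forming an explicit inverse
    # (the same idea as MATLAB's "/" operator in the fixed script).
    m = Kab.T @ np.linalg.solve(Kaa, f_obs)
    D = Kbb - Kab.T @ np.linalg.solve(Kaa, Kab)
    return m, D
```

Plotting m against x_new together with m ± 2*sqrt(diag(D)) gives the confidence bands directly, without ever sampling from the posterior.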

Here is a modified version of the script with the above points fixed.

%% Init
% Data generating function
fh = @(x)(2*cos(2*pi*x/10).*x);
% range
x = -5:0.01:5;
N = length(x);
% Sampled data points from the generating function
M = 5;
selection = false(N,1);
j = randsample(N, M);
% mark them
selection(j) = 1;
Xa = x(selection);

%% GP computations
% compute the function and extract mean
f = fh(Xa);
sigma2 = 2;
sigma_noise = 0.01;
var_kernel = 10;
% computing the interpolation using all x's
% It is expected that for points used to build the GP cov. matrix, the
% uncertainty is reduced...
K = squareform(pdist(x'));
K = var_kernel*exp(-(0.5*K.^2)/sigma2);
% upper left corner of K
Kaa = K(selection,selection);
% lower right corner of K
Kbb = K(~selection,~selection);
% upper right corner of K
Kab = K(selection,~selection);
% mean of posterior
m = Kab'/(Kaa + sigma_noise*eye(M))*f';
% cov. matrix of posterior
D = Kbb - Kab'/(Kaa + sigma_noise*eye(M))*Kab;

%% Plot
figure;
grid on;
hold on;
% GP estimates
plot(x(~selection), m);
plot(x(~selection), m + 2*sqrt(diag(D)), 'g-');
plot(x(~selection), m - 2*sqrt(diag(D)), 'g-');
% Observations
plot(Xa, f, 'Ok');
% True function
plot(x, fh(x), 'k');

The resulting plot with 5 randomly chosen observations, where the true function is shown in black, the posterior mean in blue, and the confidence interval in green:

[Figure: GP estimate]
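The first bullet above is worth seeing in isolation: logical indexing returns elements in ascending position order, while indexing with the raw output of randsample returns them in draw order, so f only lines up with the rows of Kaa when the two orderings happen to coincide. A small NumPy analogue of the two indexing styles (hypothetical names mirroring the script):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(-5.0, 5.0, 0.01)            # same grid as the script
N = len(x)

j = rng.choice(N, size=5, replace=False)  # like randsample(N, 5): draw order
selection = np.zeros(N, dtype=bool)
selection[j] = True                       # mark the same points

Xa_logical = x[selection]  # ascending index order, matches K(selection, selection)
Xa_integer = x[j]          # random draw order, generally a permutation of the above

# Same set of points, but generally in different orders.
assert np.array_equal(np.sort(Xa_integer), Xa_logical)
```

In the MATLAB script this is exactly the difference between Xa = x(selection) and Xa = x(j).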
