
Gaussian Process Regression


I am writing a Gaussian process regression algorithm. Here is the code:

% Data generating function

fh = @(x)(2*cos(2*pi*x/10).*x);

% range

x = -5:0.01:5;
N = length(x);

% Sampled data points from the generating function

M = 50;
selection = boolean(zeros(N,1));
j = randsample(N, M);

% mark them

selection(j) = 1;
Xa = x(j);

% compute the function and extract mean

f = fh(Xa) - mean(fh(Xa));
sigma2 = 1;

% computing the interpolation using all x's
% It is expected that for points used to build the GP cov. matrix, the
% uncertainty is reduced...

K = squareform(pdist(x'));
K = exp(-(0.5*K.^2)/sigma2);

% upper left corner of K

Kaa = K(selection,selection);

% lower right corner of K

Kbb = K(~selection,~selection);

% upper right corner of K

Kab = K(selection,~selection);

% mean of posterior

m = Kab'*inv(Kaa+0.001*eye(M))*f';

% cov. matrix of posterior

D = Kbb - Kab'*inv(Kaa + 0.001*eye(M))*Kab;

% sampling M functions from GP

[A,B,C] = svd(Kaa);
F0 = A*sqrt(B)*randn(M,M);
% mean from GP using sampled points

F0m = mean(F0,2);
F0d = std(F0,0,2);

%%
% put together data and estimation

F = zeros(N,1);
S = zeros(N,1);
F(selection) = f' + F0m;
S(selection) = F0d;

% sampling M functions from posterior

[A,B,C] = svd(D);
a = A*sqrt(B)*randn(N-M,M);
% mean from posterior GPs

Fm = m + mean(a,2);
Fmd = std(a,0,2);
F(~selection) = Fm;
S(~selection) = Fmd;

%%

figure;
% show what we got...

plot(x, F, ':r', x, F-2*S, ':b', x, F+2*S, ':b'), grid on;
hold on;
% show points we got

plot(Xa, f, 'Ok');
% show the whole curve

plot(x, fh(x)-mean(fh(x)), 'k');
grid on;

I expected to get a nice figure where the uncertainty at unknown data points is large and the uncertainty around the sampled data points is small. Instead I get a strange figure and, even stranger, the uncertainty around the sampled data points is larger than for the rest. Can somebody explain to me what I am doing wrong? Thanks!!

There are a few things wrong with your code. Here are the most important points:

  • The major mistake that makes everything go wrong is the indexing of f. You are defining Xa = x(j), but you should actually do Xa = x(selection), so that the indexing is consistent with the indexing you use on the kernel matrix K (see the short sketch after this list).

  • Subtracting the sample mean, f = fh(Xa) - mean(fh(Xa)), does nothing useful and makes the circles in your plot sit off the actual function. (If you choose to subtract something, it should be a fixed number or function, not something that depends on the randomly sampled observations.)

  • You should compute the posterior mean and variance directly from m and D; there is no need to sample from the posterior and then estimate them from those samples.
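
To make the indexing point concrete, here is a minimal sketch (not part of the original question or answer): the indices returned by randsample come in random order, while logical indexing always returns elements in ascending index order.

N = 10;
j = randsample(N, 3)                % chosen indices in random order, e.g. [7; 2; 9]
selection = false(N,1);
selection(j) = true;
find(selection)                     % the same indices, but sorted, e.g. [2; 7; 9]
% Kaa = K(selection,selection) and Kab = K(selection,~selection) follow the
% sorted ordering, so f must be built from x(selection), not from x(j), to stay
% aligned with the rows of Kaa and Kab.

(The example indices are made up; randsample will return different ones on each run.)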

Here is a modified version of the script with the above points fixed.

%% Init
% Data generating function
fh = @(x)(2*cos(2*pi*x/10).*x);
% range
x = -5:0.01:5;
N = length(x);
% Sampled data points from the generating function
M = 5;
selection = boolean(zeros(N,1));
j = randsample(N, M);
% mark them
selection(j) = 1;
Xa = x(selection);

%% GP computations
% compute the function and extract mean
f = fh(Xa);
sigma2 = 2;
sigma_noise = 0.01;
var_kernel = 10;
% computing the interpolation using all x's
% It is expected that for points used to build the GP cov. matrix, the
% uncertainty is reduced...
K = squareform(pdist(x'));
K = var_kernel*exp(-(0.5*K.^2)/sigma2);
% upper left corner of K
Kaa = K(selection,selection);
% lower right corner of K
Kbb = K(~selection,~selection);
% upper right corner of K
Kab = K(selection,~selection);
% mean of posterior
m = Kab'/(Kaa + sigma_noise*eye(M))*f';
% cov. matrix of posterior
D = Kbb - Kab'/(Kaa + sigma_noise*eye(M))*Kab;
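% Note (added comment, not part of the original answer): Kab'/(Kaa + sigma_noise*eye(M))
% uses mrdivide to solve the linear system directly instead of forming inv(), which is
% numerically more stable; sigma_noise plays the role of the observation-noise variance
% (the 0.001 jitter in the original code).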

%% Plot
figure;
grid on;
hold on;
% GP estimates
plot(x(~selection), m);
plot(x(~selection), m + 2*sqrt(diag(D)), 'g-');
plot(x(~selection), m - 2*sqrt(diag(D)), 'g-');
% Observations
plot(Xa, f, 'Ok');
% True function
plot(x, fh(x), 'k');
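
For reference (this is the standard GP regression result, not spelled out in the original answer), the quantities m and D computed in the script are the posterior mean and covariance of the function values at the unobserved inputs:

$$
m = K_{ab}^{\top} \, (K_{aa} + \sigma_n^2 I)^{-1} f, \qquad
D = K_{bb} - K_{ab}^{\top} \, (K_{aa} + \sigma_n^2 I)^{-1} K_{ab},
$$

where K_aa is the kernel matrix of the observed inputs, K_bb that of the unobserved inputs, K_ab the cross-covariance between them, and σ_n² corresponds to sigma_noise in the code.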

The resulting plot with 5 randomly selected observations, where the true function is shown in black, the posterior mean in blue, and the confidence intervals in green.

(figure: GP estimate)
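
As a possible extension (a sketch, not part of the original answer; it reuses x, K, selection, Kaa, Xa, f, fh, sigma_noise and M from the modified script above), the posterior can be evaluated on the full grid instead of only at the unobserved points. This makes it easy to check the behaviour the question was after: the confidence band should collapse to roughly the noise level at the observed inputs and widen away from them.

% Posterior over the full grid (sketch; reuses variables from the script above)
Kxa = K(:,selection);                         % N x M cross-covariance to the observations
L = chol(Kaa + sigma_noise*eye(M), 'lower');  % Cholesky factor of the noisy Gram matrix
alpha = L'\(L\f');                            % (Kaa + sigma_noise*I)^{-1} * f
m_all = Kxa*alpha;                            % posterior mean on the full grid
v_all = diag(K) - sum((L\Kxa').^2, 1)';       % posterior variance on the full grid
figure; hold on; grid on;
plot(x, m_all, 'b');                          % posterior mean
plot(x, m_all + 2*sqrt(v_all), 'g-');         % upper confidence bound
plot(x, m_all - 2*sqrt(v_all), 'g-');         % lower confidence bound
plot(Xa, f, 'Ok');                            % observations
plot(x, fh(x), 'k');                          % true function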
