
MATLAB: 3D reconstruction using the eight-point algorithm

I am trying to achieve 3D reconstruction from 2 images. The steps I followed are:

1. Found corresponding points between the 2 images using SURF.
2. Implemented the eight-point algorithm to find the fundamental matrix.
3. Then, implemented triangulation.

I have the fundamental matrix and the results of triangulation so far. How do I proceed further to get the 3D reconstruction? I'm confused by all the material available on the internet.

Also, this is my code. Let me know if it is correct or not.

Ia=imread('1.jpg');
Ib=imread('2.jpg');
Ia=rgb2gray(Ia);
Ib=rgb2gray(Ib);
% My surf addition
% collect Interest Points from Each Image
blobs1 = detectSURFFeatures(Ia);
blobs2 = detectSURFFeatures(Ib);
figure;
imshow(Ia);
hold on;
plot(selectStrongest(blobs1, 36));
figure;
imshow(Ib);
hold on;
plot(selectStrongest(blobs2, 36));
title('Thirty-six strongest SURF features in I2');
[features1, validBlobs1] = extractFeatures(Ia, blobs1);
[features2, validBlobs2] = extractFeatures(Ib, blobs2);
indexPairs = matchFeatures(features1, features2);
matchedPoints1 = validBlobs1(indexPairs(:,1),:);
matchedPoints2 = validBlobs2(indexPairs(:,2),:);
figure;
showMatchedFeatures(Ia, Ib, matchedPoints1, matchedPoints2);
legend('Putatively matched points in I1', 'Putatively matched points in I2');

% Extract the x/y coordinates of the matches (explicit column indexing;
% the original loop relied on accidental linear indexing of Location)
xa = matchedPoints1.Location(:,1);
ya = matchedPoints1.Location(:,2);
xb = matchedPoints2.Location(:,1);
yb = matchedPoints2.Location(:,2);

matchedPoints1.Count
figure(1) ; clf ;
imshow(cat(2, Ia, Ib)) ;
axis image off ;
hold on ;
xbb=xb+size(Ia,2);
idx = 1:matchedPoints1.Count;   % avoid shadowing the built-in set()
h = line([xa(idx)'; xbb(idx)'], [ya(idx)'; yb(idx)']);

pts1=[xa,ya];
pts2=[xb,yb];
pts11 = [pts1, ones(size(pts1,1),1)]';   % 3xN homogeneous coordinates
pts22 = [pts2, ones(size(pts2,1),1)]';

width = size(Ia,2);
height = size(Ia,1);   % assume both images share the same size
F=eightpoint(pts1,pts2,width,height);

[P1new,P2new]=compute2Pmatrix(F);
XP = triangulate(pts11, pts22,P2new);

eightpoint()

function [ F ] = eightpoint( pts1, pts2,width,height)

X = 1:width;
Y = 1:height;
[X, Y] = meshgrid(X, Y);
x0 = [mean(X(:)); mean(Y(:))];
X = X - x0(1);
Y = Y - x0(2);
denom = sqrt(mean(mean(X.^2+Y.^2)));
N = size(pts1, 1);

% Normalization transform (built from the image dimensions, not the points)
T = sqrt(2)/denom*[1 0 -x0(1); 0 1 -x0(2); 0 0 denom/sqrt(2)];
norm_x = T*[pts1(:,1)'; pts1(:,2)'; ones(1, N)];
norm_x_ = T*[pts2(:,1)';pts2(:,2)'; ones(1, N)];
x1 = norm_x(1, :)';
y1= norm_x(2, :)';
x2 = norm_x_(1, :)';
y2 = norm_x_(2, :)';

A = [x1.*x2, y1.*x2, x2, ...
       x1.*y2, y1.*y2, y2, ...
       x1,       y1,     ones(N,1)];

% compute the SVD
[~, ~, V] = svd(A);
F = reshape(V(:,9), 3, 3)';
[FU, FS, FV] = svd(F);
FS(3,3) = 0; % enforce the rank-2 constraint
F = FU*FS*FV';

% rescale fundamental matrix
F = T' * F * T;

end

triangulate()

function [ XP ] = triangulate( pts1,pts2,P2 )
% pts1, pts2: 3xN homogeneous image points; P2: second 3x4 camera
% matrix (the first camera is assumed to be [I | 0])
n=size(pts1,2);
X=zeros(4,n);
for i=1:n
    A=[-1,0,pts1(1,i),0;
        0,-1,pts1(2,i),0;
        pts2(1,i)*P2(3,:)-P2(1,:);
        pts2(2,i)*P2(3,:)-P2(2,:)];
  [~,~,va] = svd(A);
  X(:,i) = va(:,4);
end
XP = bsxfun(@rdivide, X, X(4,:));   % homogeneous division

end

function [ P1,P2 ] = compute2Pmatrix( F )

P1=[1,0,0,0;0,1,0,0;0,0,1,0];
[~, ~, V] = svd(F');
ep = V(:,3)/V(3,3);
P2 = [skew(ep)*F,ep];
end

function [ S ] = skew( v )
% 3x3 cross-product (skew-symmetric) matrix of a 3-vector,
% defined here so the code runs without an external skew() helper
S = [0, -v(3), v(2); v(3), 0, -v(1); -v(2), v(1), 0];
end

From a quick look, it looks correct. Some notes are as follows:

Your normalization code in eightpoint() is not ideal.

It is best done on the points themselves. Each set of points should get its own scaling matrix. That is:

[pts1_n, T1] = normalize_pts(pts1);
[pts2_n, T2] = normalize_pts(pts2);

% ... code
% solution
F = T2' * F * T1;
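
A minimal sketch of such a function (the name normalize_pts is assumed here; this is the standard Hartley isotropic normalization):

function [ pts_n, T ] = normalize_pts( pts )
% Translate the centroid of the Nx2 points to the origin and scale
% them so the mean distance from the origin is sqrt(2).
c = mean(pts, 1);
d = mean(sqrt(sum(bsxfun(@minus, pts, c).^2, 2)));
s = sqrt(2) / d;
T = [s, 0, -s*c(1); 0, s, -s*c(2); 0, 0, 1];
h = T * [pts'; ones(1, size(pts, 1))];
pts_n = h(1:2, :)';
end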

As a side note (for efficiency), you should do

[~,~,V] = svd(A, 0);

You also want to enforce the constraint that the fundamental matrix has rank-2. After you compute F, you can do:

[U, D, V] = svd(F);
F = U * diag([D(1,1), D(2,2), 0]) * V';

In any case, normalization is not the only key to making the algorithm work. You'll want to wrap the estimation of the fundamental matrix in a robust estimation scheme like RANSAC.

Estimation problems like this are very sensitive to non-Gaussian noise and outliers. If you have even a small number of wrong correspondences, or points with high error, the algorithm will break. A sketch of such a wrapper follows.
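
A minimal RANSAC sketch around your eight-point solver, with pts1/pts2 (Nx2) and pts11/pts22 (3xN homogeneous) as in your code; the trial count and inlier threshold are guesses you will need to tune:

N = size(pts1, 1);
bestInliers = false(N, 1);
for iter = 1:2000
    sample = randperm(N, 8);              % minimal sample for the 8-point algorithm
    Fs = eightpoint(pts1(sample,:), pts2(sample,:), width, height);
    l2 = Fs  * pts11;                     % epipolar lines in image 2
    l1 = Fs' * pts22;                     % epipolar lines in image 1
    num = sum(pts22 .* l2, 1).^2;         % (x2' * F * x1)^2 per correspondence
    den = l2(1,:).^2 + l2(2,:).^2 + l1(1,:).^2 + l1(2,:).^2;
    inliers = (num ./ den < 1)';          % Sampson distance below ~1 px^2
    if nnz(inliers) > nnz(bestInliers)
        bestInliers = inliers;
    end
end
F = eightpoint(pts1(bestInliers,:), pts2(bestInliers,:), width, height);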

Finally, in triangulate() you want to make sure that the points are not at infinity before the homogeneous division.
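
For example, a minimal guard before the division (the tolerance is a guess):

w = X(4, :);
finite = abs(w) > 1e-12 * max(abs(X(:)));          % drop points at/near infinity
XP = bsxfun(@rdivide, X(:, finite), w(finite));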

I'd recommend testing the code with synthetic data: generate your own camera matrices and correspondences, and feed them to the estimation routine with varying levels of noise. With zero noise, you should get an exact solution up to floating-point accuracy. As you increase the noise, the estimation error increases.
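
A minimal sketch of such a test, assuming your eightpoint() routine above (its width/height arguments only drive the normalization grid, so any value on the order of the coordinate range will do):

rng(0);                                   % reproducible
M = 50;
Xw = [2*randn(2, M); 10 + rand(1, M); ones(1, M)];  % points in front of both cameras
P1 = [eye(3), zeros(3, 1)];
a = 0.1;                                  % small rotation about the Y axis
P2 = [[cos(a) 0 sin(a); 0 1 0; -sin(a) 0 cos(a)], [-1; 0; 0]];
x1h = P1 * Xw;  x1 = bsxfun(@rdivide, x1h(1:2,:), x1h(3,:));
x2h = P2 * Xw;  x2 = bsxfun(@rdivide, x2h(1:2,:), x2h(3,:));
sigma = 0;                                % increase to study noise sensitivity
F = eightpoint((x1 + sigma*randn(2, M))', (x2 + sigma*randn(2, M))', 2, 2);
% With sigma = 0 the epipolar residuals should vanish to machine precision:
max(abs(sum([x2; ones(1, M)] .* (F * [x1; ones(1, M)]), 1)))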

In its current form, running this on real data will probably not do well unless you 'robustify' the algorithm with RANSAC, or some other robust estimator.

Good luck.

Which version of MATLAB do you have?

There is a function called estimateFundamentalMatrix in the Computer Vision System Toolbox, which will give you the fundamental matrix. It may give you better results than your code, because it uses RANSAC under the hood, which makes it robust to spurious matches. There is also a triangulate function, as of version R2014b.
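
If you have that toolbox, a minimal sketch of this route (following the documented interface) would be:

[F, inlierIdx] = estimateFundamentalMatrix(matchedPoints1, matchedPoints2, ...
    'Method', 'RANSAC', 'NumTrials', 2000, 'DistanceThreshold', 1e-2);
inlierPoints1 = matchedPoints1(inlierIdx, :);
inlierPoints2 = matchedPoints2(inlierIdx, :);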

What you are getting is a sparse 3D reconstruction. You can plot the resulting 3D points, and you can map the color of the corresponding pixel to each one. However, for what you want, you would have to fit a surface or a triangular mesh to the points. Unfortunately, I can't help you there.
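
For the plotting part, a minimal sketch that colors each triangulated point by the pixel it came from (XP and pts1 as in your code):

Irgb = imread('1.jpg');
n = size(pts1, 1);
colors = zeros(n, 3);
for i = 1:n
    p = round(pts1(i, :));                % (x, y) pixel location in image 1
    colors(i, :) = double(squeeze(Irgb(p(2), p(1), :)))' / 255;
end
scatter3(XP(1,:), XP(2,:), XP(3,:), 12, colors, 'filled');
axis equal; xlabel('X'); ylabel('Y'); zlabel('Z');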

If what you're asking is how to proceed from the fundamental matrix plus corresponding points to a dense model, then you still have a lot of work ahead of you.

Relative camera pose (R, t) can be calculated from a fundamental matrix, assuming you know the internal camera parameters (and the reconstruction is up to scale). To get a full dense model there are a few ways to go. You can try using an existing library (PMVS, for example). I'd also look into OpenMVG, but I'm not sure about its MATLAB interface.
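
For the pose part, a minimal sketch of the standard essential-matrix decomposition, assuming known intrinsic matrices K1 and K2 (these names are assumptions):

E = K2' * F * K1;                         % essential matrix from F
[U, ~, V] = svd(E);
W = [0 -1 0; 1 0 0; 0 0 1];
Ra = U * W  * V';  if det(Ra) < 0, Ra = -Ra; end
Rb = U * W' * V';  if det(Rb) < 0, Rb = -Rb; end
t  = U(:, 3);                             % translation, up to scale and sign
% This yields four candidate (R, t) combinations; keep the one that
% reconstructs points in front of both cameras (positive-depth test).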

Another way to go: compute a dense optical flow (many implementations are available for MATLAB). Look for an epipolar optical flow, which takes a fundamental matrix and restricts the solution to lie on the epipolar lines. Then you can triangulate every pixel to get a depth map.
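
A rough sketch of that last step, assuming dense flow fields u and v (names assumed) and the triangulate()/P2new from your code:

[cols, rows] = meshgrid(1:width, 1:height);
p1 = [cols(:)'; rows(:)'; ones(1, numel(cols))];
p2 = [cols(:)' + u(:)'; rows(:)' + v(:)'; ones(1, numel(cols))];
Xd = triangulate(p1, p2, P2new);
depthMap = reshape(Xd(3, :), height, width);   % Z in the first camera's frame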

Finally, you will have to play with format conversions to get from a depth map to VRML (you can look at MeshLab).

Sorry my answer isn't more MATLAB-oriented.
