
How can I superimpose two images and get the SSIM (structural similarity index) value for them?

I have a clean image and a noisy image. I built a denoiser and applied it to the noisy image; the result is my final output. To measure how close this output is to the clean image I need to compare them using PSNR and SSIM, but because the two images are not in the same position I cannot compare them directly.

At the moment I get an SSIM of 0.5, which is very low, because the two images are not properly aligned. If the images were registered properly, I would expect the SSIM to be around 0.80 or higher, but I have not been able to accomplish this.

How can I align these two images to obtain a good SSIM value?

I have two coin images for comparison: the first is the clean image, the second is the improved (denoised) version of the noisy image.

Clean Img:

[clean coin image]

Noisy Img:

[noisy coin image]

Because the coins sit at different positions in the two images, ssim(img1,img2) gives a misleading result. I tried cropping, but that did not work. Here is what I have tried so far:

Attempt 1:

function [valPSNR,valSSIM,badpict] = getSSIM(clean_img,img2)
% pad the reference image since the object is so close to the edges
refpict = padarray(mat2gray(clean_img),[20 20],'replicate','both');
% crop the test image down to extract the object alone
badpict = imcrop(mat2gray(img2),[2.5 61.5 357 363]);
% maximize normalized cross-correlation to find the offset
szb = size(badpict);
c = normxcorr2(badpict,refpict);
[idxy, idxx] = find(c == max(c(:)), 1);
% the correlation peak marks the bottom-right corner of the matched region
osy = idxy - szb(1);
osx = idxx - szb(2);
% crop the reference picture to the matched region
refpict = refpict(osy+1:idxy, osx+1:idxx);
%imshow(imfuse(badpict,refpict,'checkerboard'));
%imagesc(badpict);
valSSIM = ssim(badpict,refpict);
valPSNR = getPSNR(badpict,refpict);
img2 = badpict;
clean_img = refpict;
figure; imshowpair(clean_img,img2);
figure; montage({mat2gray(clean_img),mat2gray(img2)}, 'Size', [1 2], 'BackgroundColor', 'w', 'BorderSize', [2 2]);
end

Attempt 2:

function [valPSNR,valSSIM,badpict] = getSSIM2(clean_img,img2)
% binarize both images to segment the coin in each
bw1 = im2bw(mat2gray(clean_img));
bw2 = imclose(im2bw(mat2gray(img2),0.3),strel('disk',9));
bw2 = bwareafilt(bw2,1);   % keep only the largest blob
% crop each image to its coin's bounding box and make them the same size
[r,c] = find(bw1);
clean_img = clean_img(min(r):max(r),min(c):max(c));
[r,c] = find(bw2);
img2 = img2(min(r):max(r),min(c):max(c));
img2 = imresize(img2, size(clean_img),'bilinear');
valPSNR = getPSNR(mat2gray(clean_img),mat2gray(img2));
valSSIM = ssim(mat2gray(clean_img),mat2gray(img2));
badpict = img2;
figure; imshowpair(clean_img,img2);
figure; montage({mat2gray(clean_img),mat2gray(img2)}, 'Size', [1 2], 'BackgroundColor', 'w', 'BorderSize', [2 2]);
end

As others have pointed out, the resampling required by registration will introduce some non-zero error. Still, here is some sample code that walks you through the registration step, which is the crux of your question.

% SSIM isn't defined on RGB images, convert to grayscale.
ref = rgb2gray(imread('https://i.stack.imgur.com/tPKEJ.png'));
X = rgb2gray(imread('https://i.stack.imgur.com/KmU4y.png'));

% The input images have bright borders at the edges that create
% artifacts in resampling; it is best to just crop those (or maybe there
% are acquisitions that don't have these borders?).
X = X(3:end-2,3:end-2);
ref = ref(4:end-3,4:end-3);

figure
montage({X,ref});

tform = imregcorr(X,ref,"translation");

Xreg = imwarp(X,tform,OutputView=imref2d(size(ref)),SmoothEdges=true);

figure
imshowpair(Xreg,ref)

ssim(Xreg,ref)
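
If you also want PSNR on the registered pair, here is a minimal sketch (assuming the Image Processing Toolbox psnr function; you could equally substitute your own getPSNR helper):

% PSNR/SSIM on the registered pair; psnr() ships with the Image
% Processing Toolbox (or plug in your own getPSNR helper instead)
valSSIM = ssim(Xreg,ref);
valPSNR = psnr(Xreg,ref);
fprintf('SSIM = %.3f, PSNR = %.2f dB\n', valSSIM, valPSNR);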

Maybe you can refer to my GitHub.

I implemented a template-matching algorithm with OpenCV. You can use NCC-based pattern matching to find targets and then get a score (similarity).

You can then use this score to decide if it is clean.

Also, porting the C++ code may be an issue for you, but you just need to find the corresponding functions in MATLAB (see the sketch after the snippet below).

Here are the results (the red blocks are areas whose similarity to the golden sample exceeds the 0.85 threshold): [result image]

The whole function is too long to be posted here. Part of the function:

// For each candidate angle: rotate the source (top pyramid layer),
// run NCC template matching, and record the best locations and scores.
for (int i = 0; i < iSize; i++)
{
    Mat matRotatedSrc, matR = getRotationMatrix2D (ptCenter, vecAngles[i], 1);
    Mat matResult;
    Point ptMaxLoc;
    double dValue, dMaxVal;
    double dRotate = clock ();
    // size of the canvas that fully contains the source rotated by this angle
    Size sizeBest = GetBestRotationSize (vecMatSrcPyr[iTopLayer].size (), pTemplData->vecPyramid[iTopLayer].size (), vecAngles[i]);
    // shift the rotation so the rotated image stays centered in that canvas
    float fTranslationX = (sizeBest.width - 1) / 2.0f - ptCenter.x;
    float fTranslationY = (sizeBest.height - 1) / 2.0f - ptCenter.y;
    matR.at<double> (0, 2) += fTranslationX;
    matR.at<double> (1, 2) += fTranslationY;
    warpAffine (vecMatSrcPyr[iTopLayer], matRotatedSrc, matR, sizeBest);

    // NCC match of the template pyramid layer against the rotated source
    MatchTemplate (matRotatedSrc, pTemplData, matResult, iTopLayer);

    minMaxLoc (matResult, 0, &dMaxVal, 0, &ptMaxLoc);

    // store the best match, mapped back into un-translated coordinates
    vecMatchParameter[i * (m_iMaxPos + MATCH_CANDIDATE_NUM)] = s_MatchParameter (Point2f (ptMaxLoc.x - fTranslationX, ptMaxLoc.y - fTranslationY), dMaxVal, vecAngles[i]);

    // also keep the next-best, non-overlapping candidate locations
    for (int j = 0; j < m_iMaxPos + MATCH_CANDIDATE_NUM - 1; j++)
    {
        ptMaxLoc = GetNextMaxLoc (matResult, ptMaxLoc, -1, pTemplData->vecPyramid[iTopLayer].cols, pTemplData->vecPyramid[iTopLayer].rows, dValue, m_dMaxOverlap);
        vecMatchParameter[i * (m_iMaxPos + MATCH_CANDIDATE_NUM) + j + 1] = s_MatchParameter (Point2f (ptMaxLoc.x - fTranslationX, ptMaxLoc.y - fTranslationY), dValue, vecAngles[i]);
    }
}
// discard candidates whose score falls below the (layer-adjusted) threshold
FilterWithScore (&vecMatchParameter, m_dScore-0.05*iTopLayer);
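
If you would rather reproduce the core idea in MATLAB than port the C++ line by line, here is a minimal sketch of NCC matching over a few candidate rotations, with imrotate standing in for warpAffine, normxcorr2 for the NCC template match, and max for minMaxLoc. The variables tmpl and scene, the angle list, and the 0.85 threshold are illustrative assumptions, not values taken from the library:

% Sketch only: NCC-based matching over candidate rotations in MATLAB.
% tmpl  - grayscale template (golden sample); scene - grayscale search image.
angles = -10:2:10;                         % candidate rotations (assumption)
bestScore = -Inf; bestAngle = 0;
for a = angles
    tmplRot = imrotate(tmpl, a, 'bilinear', 'crop');   % rotate the template
    c = normxcorr2(tmplRot, scene);                    % NCC score map
    [score, idx] = max(c(:));
    if score > bestScore
        bestScore = score;
        bestAngle = a;
        [peakY, peakX] = ind2sub(size(c), idx);        % bottom-right of match
    end
end
isMatch = bestScore >= 0.85;               % same 0.85 threshold as above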
