We are using ITK's registration algorithm, but we only want the affine transformation matrix rather than applying the registration directly. In a previous question we already resolved a misunderstanding regarding the image/transform orientation: How to get transformation affine from ITK registration?
We have now run into a sample where the current solution does not work properly. The rotation is good, but the result is slightly translated. The image output of ITK is perfect, so we know the registration itself worked. That is why we reduce the problem description below to the affine calculation with the specific matrices.
From the ITK registration we get/read the following parameters:
parameter_map = result_transform_parameters.GetParameterMap(0)
rot00, rot01, rot02, rot10, rot11, rot12, rot20, rot21, rot22 = parameter_map['TransformParameters'][:9]
A = np.array([
[rot00, rot01, rot02, 0],
[rot10, rot11, rot12, 0],
[rot20, rot21, rot22, 0],
[ 0, 0, 0, 1],
], dtype=float) # yapf: disable
tx, ty, tz = parameter_map['TransformParameters'][9:]
t = np.array([
[1, 0, 0, tx],
[0, 1, 0, ty],
[0, 0, 1, tz],
[0, 0, 0, 1],
], dtype=float) # yapf: disable
# In world coordinates
cx, cy, cz = parameter_map['CenterOfRotationPoint']
c = np.array([
[1, 0, 0, cx],
[0, 1, 0, cy],
[0, 0, 1, cz],
[0, 0, 0, 1],
], dtype=float) # yapf: disable
ox, oy, oz = parameter_map['Origin']
o = np.array([
[1, 0, 0, ox],
[0, 1, 0, oy],
[0, 0, 1, oz],
[0, 0, 0, 1],
], dtype=float) # yapf: disable
moving_ras = moving_image.affine
Where A is the direction/rotation matrix, t the translation matrix, c the center of rotation (CoR), and moving_ras the affine of the moving image in RAS orientation.
The translation and direction matrices can be combined into one transform matrix:
transform = t @ A
We are not sure how to factor in the CenterOfRotationPoint. Based on this, this, and that Stack Exchange question, I thought one might need to do it like this:
transform = c @ transform @ np.linalg.inv(c)
Finally, we need to add the orientation flip between RAS and LPS:
registration = FLIPXY_44 @ transform @ FLIPXY_44
But this does not result in the correct transformation affine.
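For reference, conjugating by diag(-1, -1, 1, 1) only changes the signs of the matrix entries that mix the flipped (x, y) axes with the unflipped z axis, plus the x/y translation components. A quick numeric check on an arbitrary made-up matrix:

```python
import numpy as np

FLIPXY_44 = np.diag([-1, -1, 1, 1])

# An arbitrary affine matrix in homogeneous form (values are made up).
M = np.array([
    [0.9, 0.1, 0.2, 5.0],
    [0.3, 0.8, 0.4, 6.0],
    [0.5, 0.6, 0.7, 7.0],
    [0.0, 0.0, 0.0, 1.0],
])

flipped = FLIPXY_44 @ M @ FLIPXY_44

# Entry (i, j) picks up a sign s_i * s_j with s = (-1, -1, 1, 1):
# the upper-left 2x2 block and entry (2, 2) are unchanged, while
# the entries mixing z with x/y flip sign.
signs = np.outer([-1, -1, 1, 1], [-1, -1, 1, 1])
assert np.allclose(flipped, signs * M)

# In the translation column, tx and ty are negated and tz is kept.
assert np.allclose(flipped[:3, 3], [-5.0, -6.0, 7.0])
```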
From the ITK docs and a GitHub issue we got this formula for applying the above parameters to points:
T(x) = A ( x - c ) + (t + c)
While we cannot use that directly, since we do not want to transform the image itself but only want to calculate the correct affine transformation matrix, one can see that the formula is quite similar to what we are already doing, as explained above.
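As a sanity check on that formula: building T(x) = A(x - c) + (t + c) directly as a single 4x4 matrix gives the same result as the conjugation c @ t @ A @ inv(c). A small sketch with made-up numbers standing in for the real elastix parameters:

```python
import numpy as np

# Hypothetical values in place of the actual elastix parameters.
A3 = np.array([[0.98, -0.05,  0.01],
               [0.04,  0.97, -0.20],
               [0.02,  0.21,  0.96]])
t3 = np.array([1.1, 11.8, 41.5])
c3 = np.array([-0.1, -24.6, 0.1])

# T(x) = A (x - c) + (t + c) as one homogeneous matrix:
# the linear part stays A, the translation becomes t + c - A @ c.
M = np.eye(4)
M[:3, :3] = A3
M[:3, 3] = t3 + c3 - A3 @ c3

def trans(v):
    """Pure translation as a homogeneous 4x4 matrix."""
    T = np.eye(4)
    T[:3, 3] = v
    return T

# The same matrix via the conjugation form c @ t @ A @ inv(c).
A4 = np.eye(4)
A4[:3, :3] = A3
M2 = trans(c3) @ trans(t3) @ A4 @ trans(-c3)
assert np.allclose(M, M2)

# Both apply the point formula exactly.
x = np.array([10.0, -3.0, 7.0])
assert np.allclose((M @ np.append(x, 1))[:3], A3 @ (x - c3) + t3 + c3)
```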
We are again at a dead end with our knowledge.
Things we noticed that might cause issues here:
EDIT: I noticed that my current minimal code example is not quite comprehensive, therefore here is an update. The included affine matrices are taken from the ITK coregistration; the ITK code is omitted for brevity.
Here is the new test data (you can view these images via MRIcoGL):
Here is a minimal code example:
from pathlib import Path
import nibabel
import numpy as np
from monai.transforms.spatial.array import Affine
from monai.utils.enums import GridSampleMode, GridSamplePadMode
from nibabel import Nifti1Image
np.set_printoptions(suppress=True) # type: ignore
folder = Path('.')
FLIPXY_44 = np.diag([-1, -1, 1, 1])
# rot00, rot01, rot02, rot10, rot11, rot12, rot20, rot21, rot22 = parameter_map['TransformParameters'][:9]
A = np.array([[ 1.02380734, -0.05137566, -0.00766465, 0. ],
[ 0.01916231, 0.93276486, -0.23453097, 0. ],
[ 0.01808809, 0.2667324 , 0.94271694, 0. ],
[ 0. , 0. , 0. , 1. ]]) # yapf: disable
# tx, ty, tz = parameter_map['TransformParameters'][9:]
t = np.array([[ 1. , 0. , 0. , 1.12915465 ],
[ 0. , 1. , 0. , 11.76880151 ],
[ 0. , 0. , 1. , 41.54685788 ],
[ 0. , 0. , 0. , 1. ]]) # yapf: disable
# cx, cy, cz = parameter_map['CenterOfRotationPoint']
c = np.array([[ 1. , 0. , 0. , -0.1015625 ],
[ 0. , 1. , 0. , -24.5521698 ],
[ 0. , 0. , 1. , 0.1015625 ],
[ 0. , 0. , 0. , 1. ]]) # yapf: disable
# Moving image affine
x = np.array([[ 2. , 0. , 0. , -125.75732422],
[ 0. , 2. , 0. , -125.23828888],
[ 0. , 0. , 2. , -99.86506653],
[ 0. , 0. , 0. , 1. ]]) # yapf: disable
o = np.array([
[1., 0., 0., 126.8984375],
[0., 1., 0., 102.4478302],
[0., 0., 1., -126.8984375],
[0., 0., 0., 1.],
])
moving_ras = x
# Combine the direction and translation
transform = t @ A
# Factor in the center of rotation
# transform = c @ transform @ np.linalg.inv(c)
# Switch from LPS to RAS orientation
registration = FLIPXY_44 @ transform @ FLIPXY_44
y = np.array([[ 2. , 0. , 0. , -126.8984375 ],
[ 0. , 2. , 0. , -102.4478302 ],
[ 0. , 0. , 2. , -126.8984375 ],
[ 0. , 0. , 0. , 1. ]]) # yapf: disable
fixed_image_affine = y
moving_image_ni: Nifti1Image = nibabel.load(folder / 'real_moving.nii.gz')
moving_image_np: np.ndarray = moving_image_ni.get_fdata() # type: ignore
affine_transform = Affine(affine=registration,
image_only=True,
mode=GridSampleMode.NEAREST,
padding_mode=GridSamplePadMode.BORDER)
reg_monai = np.squeeze(affine_transform(moving_image_np[np.newaxis, ...]))
out = Nifti1Image(reg_monai, fixed_image_affine)
nibabel.save(out, folder / 'reg_monai.nii.gz')
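One detail that may be relevant here: MONAI's Affine resamples in the array's index (voxel) space, while the elastix parameters are expressed in physical (world) coordinates. Below is a sketch of the usual change of basis between the two spaces, assuming the transform maps fixed-space points to moving-space points; the function name is made up, and whether a given resampler wants this matrix or its inverse is a convention that would need checking, so this is an assumption, not a verified fix:

```python
import numpy as np

def world_to_voxel(world_matrix, moving_affine, fixed_affine):
    """Re-express a world-coordinate transform in voxel coordinates.

    A voxel index of the output maps through fixed_affine into world
    space, through the world-space registration matrix, and back through
    the inverse of the moving image's affine to a voxel index of the
    input. (Hypothetical helper; the direction convention is assumed.)
    """
    return np.linalg.inv(moving_affine) @ world_matrix @ fixed_affine

# Trivial check: with identical affines and an identity world transform,
# the voxel-space transform is also the identity.
aff = np.diag([2.0, 2.0, 2.0, 1.0])
assert np.allclose(world_to_voxel(np.eye(4), aff, aff), np.eye(4))
```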
When you execute this code, the resulting reg_monai.nii.gz should match real_fixed.nii.gz (in position and outline, not in the actual content).
Currently the result looks like this (viewed via MRIcoGL):
But the result should look like this (this is the direct ITK registration output, where the hardcoded affine matrices come from, which should prove that the registration worked and that the parameters are generally good):
For the sake of completeness, here also the code to perform the ITK registration and to get the above affine matrices:
from pathlib import Path
import itk
import numpy as np
np.set_printoptions(suppress=True) # type: ignore
folder = Path('.')
moving_image = itk.imread(str(folder / 'real_moving.nii.gz'), itk.F)
fixed_image = itk.imread(str(folder / 'real_fixed.nii.gz'), itk.F)
# Import Default Parameter Map
parameter_object = itk.ParameterObject.New()
affine_parameter_map = parameter_object.GetDefaultParameterMap('affine', 4)
affine_parameter_map['FinalBSplineInterpolationOrder'] = ['1']
affine_parameter_map['MaximumNumberOfIterations'] = ['512']
parameter_object.AddParameterMap(affine_parameter_map)
# Call registration function
result_image, result_transform_parameters = itk.elastix_registration_method( # type: ignore
fixed_image, moving_image, parameter_object=parameter_object)
itk.imwrite(result_image, str(folder / 'real_reg_itk.nii.gz'), compression=True)
parameter_map = result_transform_parameters.GetParameterMap(0)
rot00, rot01, rot02, rot10, rot11, rot12, rot20, rot21, rot22 = parameter_map['TransformParameters'][:9]
A = np.array([
[rot00, rot01, rot02, 0],
[rot10, rot11, rot12, 0],
[rot20, rot21, rot22, 0],
[ 0, 0, 0, 1],
], dtype=float) # yapf: disable
tx, ty, tz = parameter_map['TransformParameters'][9:]
t = np.array([
[1, 0, 0, tx],
[0, 1, 0, ty],
[0, 0, 1, tz],
[0, 0, 0, 1],
], dtype=float) # yapf: disable
# In world coordinates
cx, cy, cz = parameter_map['CenterOfRotationPoint']
c = np.array([
[1, 0, 0, cx],
[0, 1, 0, cy],
[0, 0, 1, cz],
[0, 0, 0, 1],
], dtype=float) # yapf: disable
ox, oy, oz = parameter_map['Origin']
o = np.array([
[1, 0, 0, ox],
[0, 1, 0, oy],
[0, 0, 1, oz],
[0, 0, 0, 1],
], dtype=float) # yapf: disable
Package versions:
itk-elastix==0.12.0
monai==0.8.0
nibabel==3.1.1
numpy==1.19.2
I guess this is not the solution, but this simple code/transformation seems to leave the image pointing in the same direction and almost aligned, which makes me question whether it is really LPS to RAS, because this looks like a completely different axis transformation:
transform_matrix= np.array([
[0, 0, 1, 0],
[0, 1, 0, 0],
[1, 0, 0, 0],
[0, 0, 0, 1]], dtype=float)
to_transform_ni: Nifti1Image = nibabel.load('file/real_moving.nii.gz')
to_transform: np.ndarray = to_transform_ni.get_fdata()
affine_transform = Affine(affine=transform_matrix, image_only=True,
mode=GridSampleMode.NEAREST, padding_mode=GridSamplePadMode.BORDER)
transformed_img = np.squeeze(affine_transform(to_transform[np.newaxis, ...]))
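A two-line check makes the distinction explicit: the matrix above is a pure axis permutation (it swaps axes 0 and 2), while an LPS-to-RAS conversion keeps the axes in place and negates the first two components:

```python
import numpy as np

# The matrix from the snippet above: a pure permutation of axes 0 and 2.
perm = np.array([[0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [1, 0, 0, 0],
                 [0, 0, 0, 1]], dtype=float)

# An LPS<->RAS conversion, by contrast, is a sign flip on x and y.
flip = np.diag([-1.0, -1.0, 1.0, 1.0])

v = np.array([1.0, 2.0, 3.0, 1.0])
assert np.allclose(perm @ v, [3.0, 2.0, 1.0, 1.0])   # axes swapped
assert np.allclose(flip @ v, [-1.0, -2.0, 3.0, 1.0])  # signs flipped
```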
On the other hand, I was not able to find the proper order of the parameters in the parameter_map in the documentation. Are you sure that A and t are multiplied and not summed (with a zero diagonal on t)? Maybe you can point me to the documentation where that is written.
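For what it's worth, with matrices shaped like the question's A (no translation) and t (identity linear part), multiplying and "summing" coincide, which a small check with hypothetical values makes explicit:

```python
import numpy as np

# Hypothetical 4x4s in the same shape as the question's A and t.
A = np.eye(4)
A[:3, :3] = np.array([[0.9, 0.1, 0.0],
                      [0.0, 0.8, 0.2],
                      [0.1, 0.0, 1.1]])
t = np.eye(4)
t[:3, 3] = [1.0, 2.0, 3.0]

combined = t @ A

# Because A carries no translation and t carries no linear part, the
# product simply writes t's translation into A's last column ...
expected = A.copy()
expected[:3, 3] = t[:3, 3]
assert np.allclose(combined, expected)

# ... which equals the "sum" A + t with the shared identity removed.
assert np.allclose(A + t - np.eye(4), combined)
```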
On the center of rotation I found this, which as far as I understand means:
transform = c @ transform @ c_minus
where c is again a pure translation matrix. Whether this should be applied before or after t I cannot say; neither option worked for me, as I could not even reproduce your images with this data set.
I found some useful information with Jupyter examples in the itk-elastix documentation here.
This is the result of the first piece of code, but the images do not seem to be the same as yours.
Below are some pictures of how the data appears on my machine, with the input, the transformed image, and the reference image at the end.
I know this is not a final solution, but I hope it is still useful.
What I see is that the image registration process is not actually working.
import itk
import matplotlib.pyplot as plt
import numpy as np
from monai.transforms.spatial.array import Affine
from monai.utils.enums import GridSampleMode, GridSamplePadMode

def registration_test(moving_image, fixed_image, niter=512):
    # Import Default Parameter Map
    parameter_object = itk.ParameterObject.New()
    affine_parameter_map = parameter_object.GetDefaultParameterMap('affine', 4)
    affine_parameter_map['FinalBSplineInterpolationOrder'] = ['1']
    affine_parameter_map['MaximumNumberOfIterations'] = [str(niter)]
    parameter_object.AddParameterMap(affine_parameter_map)
    # Call registration function
    result_image, result_transform_parameters = itk.elastix_registration_method(  # type: ignore
        fixed_image, moving_image, parameter_object=parameter_object)
    transform_parameters = result_transform_parameters.GetParameter(0, 'TransformParameters')
    transform_origin = result_transform_parameters.GetParameter(0, 'CenterOfRotationPoint')
    r = np.asarray(transform_parameters).reshape(4, 3)
    c = np.asarray(transform_origin, dtype=float)
    A = np.eye(4)
    A[:3, 3] = r[3]
    A[:3, :3] = r[:3].T
    print(A, c)
    C = np.eye(4)
    C[:3, 3] = c
    C_inv = np.eye(4)
    C_inv[:3, 3] = -c
    affine_transform = Affine(affine=C @ A @ C_inv,
                              image_only=True,
                              mode=GridSampleMode.NEAREST,
                              padding_mode=GridSamplePadMode.BORDER)
    moving_image_np = np.asarray(moving_image)
    reg_monai = affine_transform(moving_image_np[..., np.newaxis])
    obtained = reg_monai[..., 0]
    print(obtained.shape)
    plt.figure(figsize=(9, 9))
    plt.subplot(331)
    plt.imshow(fixed_image[64, :, :], origin='lower'); plt.xticks([]); plt.yticks([])
    plt.ylabel('fixed_image'); plt.title('plane 0')
    plt.subplot(334)
    plt.imshow(obtained[64, :, :], origin='lower'); plt.xticks([]); plt.yticks([])
    plt.ylabel('result')
    plt.subplot(337)
    plt.imshow(moving_image[64, :, :], origin='lower'); plt.xticks([]); plt.yticks([])
    plt.ylabel('moving_image')
    plt.subplot(332)
    plt.imshow(fixed_image[:, 64, :], origin='lower'); plt.xticks([]); plt.yticks([])
    plt.title('plane 1')
    plt.subplot(335)
    plt.imshow(obtained[:, 64, :], origin='lower'); plt.xticks([]); plt.yticks([])
    plt.subplot(338)
    plt.imshow(moving_image[:, 64, :], origin='lower'); plt.xticks([]); plt.yticks([])
    plt.subplot(333)
    plt.title('plane 2')
    plt.imshow(fixed_image[:, :, 64], origin='lower'); plt.xticks([]); plt.yticks([])
    plt.subplot(336)
    plt.imshow(obtained[:, :, 64], origin='lower'); plt.xticks([]); plt.yticks([])
    plt.subplot(339)
    plt.imshow(moving_image[:, :, 64], origin='lower'); plt.xticks([]); plt.yticks([])
Then, with the pair you sent, which are very close to each other, if I run 1000 iterations, here is what I get:
%%time
registration_test(moving_image, fixed_image, 1000)
[[ 1.02525991 0.01894165 0.02496272 1.02504064]
[-0.05196394 0.93458484 0.26571434 11.92591955]
[-0.00407657 -0.23543312 0.94091849 41.62065545]
[ 0. 0. 0. 1. ]] [ -0.1015625 -24.5521698 0.1015625]
(128, 128, 128)
CPU times: user 15.9 s, sys: 654 ms, total: 16.6 s
Wall time: 10.9 s
Using this function to rotate around one axis:
def imrot(im, angle, axis=1):
    x, y = [i for i in range(3) if i != axis]
    A = np.eye(4)
    A[x, x] = np.cos(angle)
    A[x, y] = np.sin(angle)
    A[y, x] = -np.sin(angle)
    A[y, y] = np.cos(angle)
    f = Affine(affine=A,
               image_only=True,
               mode=GridSampleMode.NEAREST,
               padding_mode=GridSamplePadMode.BORDER)
    return itk.image_from_array(f(np.asarray(im)[np.newaxis])[0])
I see that over 10 iterations the moving_image is not significantly modified:
%%time
registration_test(moving_image, imrot(fixed_image, 0.5), 10)
[[ 0.9773166 -0.05882861 -0.09435328 -8.29016604]
[ 0.01960457 1.01097845 -0.06601224 -4.62307826]
[ 0.09305988 0.07375327 1.06381763 0.74783361]
[ 0. 0. 0. 1. ]] [63.5 63.5 63.5]
(128, 128, 128)
CPU times: user 3.57 s, sys: 148 ms, total: 3.71 s
Wall time: 2.24 s
But if I increase the number of iterations to 100, instead of approximating the fixed image as I would expect, it seems to get lost:
[[ 1.12631932 -0.33513615 -0.70472146 -31.57349579]
[ -0.07239085 1.08080123 -0.42268541 -28.72943354]
[ -0.24096706 -0.08024728 0.80870164 -5.86050765]
[ 0. 0. 0. 1. ]] [63.5 63.5 63.5]
After 1000 iterations
[[ 1.28931626 -0.36533121 -0.52561289 -37.00919916]
[ 0.02204954 1.23661994 -0.29418401 -34.36979156]
[ -0.32713001 -0.13135651 0.96500969 2.75931824]
[ 0. 0. 0. 1. ]] [63.5 63.5 63.5]
After 10000 iterations
[[ 1.46265277 0.02692694 0.14337441 -61.37788428]
[ -0.15334478 1.37362513 0.16242297 -52.59833838]
[ -0.53333714 -0.51411401 0.80381994 -4.97349468]
[ 0. 0. 0. 1. ]] [63.5 63.5 63.5]