
Numpy appending arrays

I am trying to grow an array/matrix with each iteration of a for loop. Here is my code:

import numpy as np


sino = []
for n in range(0, 4):
    fileName = 'D:\zDeflectionBP\data\headData_zdef\COSWeighted_trunk_' + str(n) + '.bin'
    f = open(fileName, "rb")
    data = np.fromfile(f, np.float32)
    sino = np.append(sino, data)
    f.close()

fileName = 'D:\zDeflectionBP\data\headData_zdef\Head_FFS_COSWeighted.bin'
f = open(fileName, "wb")
f.write(bytes(sino))
f.close()

On each iteration, one file's data is loaded and appended to sino; there are four files in total.

However, in the end, I found that the size (in number of bytes) of sino is twice what it should be.

For example: each data file is 3 MB, so with four files the size of sino should be 3 MB × 4 = 12 MB. But I found that sino is 24 MB.

What is happening here? I'd like sino to be only 12 MB, containing only the data from the four files. How should I do that? Thanks.

Your sino isn't initially a NumPy array but a Python list.

The first np.append converts it to a float64 array, NumPy's default floating-point type, and after that it stays float64: every element takes 8 bytes instead of 4, so the result is twice as large as you expected.

Every time you append, the float32 data is converted to float64, since that's the dtype of the target array.
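This promotion is easy to see in isolation. The snippet below (a minimal sketch with a made-up 3-element array standing in for the file data) shows that appending float32 data to a plain Python list yields a float64 result, while appending to a float32 array does not:

```python
import numpy as np

data = np.zeros(3, dtype=np.float32)

# Appending to a plain Python list: the empty list becomes a float64
# array, so the concatenated result is promoted to float64 as well,
# and each element takes 8 bytes instead of 4.
promoted = np.append([], data)
print(promoted.dtype)   # float64
print(promoted.nbytes)  # 24 (3 elements x 8 bytes)

# Starting from an empty float32 array keeps the 32-bit dtype.
kept = np.append(np.array([], dtype=np.float32), data)
print(kept.dtype)       # float32
print(kept.nbytes)      # 12 (3 elements x 4 bytes)
```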

Make sino an np.float32 array right from the start to solve the problem.
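A sketch of the corrected loop, written with temporary files and small made-up arrays so it runs anywhere (your real code would keep the original D:\ paths and file names):

```python
import os
import tempfile

import numpy as np

# Set up four small stand-in input files (6 float32 values each).
workdir = tempfile.mkdtemp()
for n in range(4):
    np.arange(6, dtype=np.float32).tofile(
        os.path.join(workdir, 'COSWeighted_trunk_%d.bin' % n))

# Key fix: start with an empty float32 array, not a Python list,
# so np.append never promotes the result to float64.
sino = np.array([], dtype=np.float32)
for n in range(4):
    fileName = os.path.join(workdir, 'COSWeighted_trunk_%d.bin' % n)
    with open(fileName, 'rb') as f:  # "with" also closes every file
        data = np.fromfile(f, np.float32)
    sino = np.append(sino, data)

outName = os.path.join(workdir, 'Head_FFS_COSWeighted.bin')
with open(outName, 'wb') as f:
    f.write(sino.tobytes())          # raw float32 bytes

print(sino.dtype)                    # float32
print(os.path.getsize(outName))      # 4 files x 6 floats x 4 bytes = 96
```

Note that np.append copies the whole array on every call; collecting the per-file arrays in a list and calling np.concatenate once at the end would avoid the repeated copies, and also sidesteps the dtype issue entirely.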
