I am developing a convolutional neural network (CNN) model to predict whether a patient is in category 1, 2, 3, or 4. I use Keras on top of TensorFlow.
I have data for 64 breast cancer patients, classified into four categories (1 = no disease, 2 = …, 3 = …, 4 = progressive disease). For each patient, I have 3 sets of MRI scans taken on different dates, and inside each MRI folder there are 7 to 8 subfolders containing MRI images in different planes (coronal plane, sagittal plane, etc.).
I learned how to build a basic “Cat-Dog-CNN-Classifier”; that was easy because I put all the cat and dog images into a single folder to train the network. But how do I tackle my breast cancer patient data, which has multiple folders and subfolders?
Please suggest.
You need to build your dataset by navigating through the folders. This is easy to do in Python.
So, suppose you have the following folder tree:
root/
\__ Patient_1/
    \__ MRI_1/
        \__ folder_1/
        \__ folder_2/
        \__ folder_3/
        \__ .../
        \__ folder_8/
    \__ MRI_2/
    \__ .../
\__ Patient_2/
...
First, you need to get your current working directory (cwd):
import os
cwd = os.getcwd()
Then you can start building your dataset:
dataset = []
for p in range(1, 65):          # 64 patients
    for mri in range(1, 4):     # 3 MRI scans per patient
        mri_folder = cwd + '/Patient_{}/MRI_{}/'.format(p, mri)
        # Since we do not know how many folders there will be,
        # we need to list all the folders in the current MRI folder.
        folder_names = os.listdir(mri_folder)
        for f in folder_names:
            # list all images in the current folder
            fnames = os.listdir('{}{}/'.format(mri_folder, f))
            for fname in fnames:
                # Now you can open your image and append it to dataset
                pass
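Putting the loop above together, here is a minimal, self-contained sketch that collects every image path along with its patient and scan indices. The function name `build_dataset` and its parameters are illustrative, and the actual image-loading call is left out because it depends on your file format (e.g. `PIL.Image.open` for PNG/JPEG exports, or a DICOM reader for raw scans):

```python
import os

def build_dataset(root, n_patients=64, n_mri=3):
    """Collect (patient, mri, image_path) tuples by walking the
    Patient_*/MRI_*/<plane folder>/ tree described above."""
    dataset = []
    for p in range(1, n_patients + 1):
        for mri in range(1, n_mri + 1):
            mri_folder = os.path.join(root,
                                      'Patient_{}'.format(p),
                                      'MRI_{}'.format(mri))
            if not os.path.isdir(mri_folder):
                continue  # skip patients with missing scans
            # We do not know how many plane folders there are,
            # so list whatever is inside the MRI folder.
            for f in sorted(os.listdir(mri_folder)):
                plane_folder = os.path.join(mri_folder, f)
                if not os.path.isdir(plane_folder):
                    continue
                for fname in sorted(os.listdir(plane_folder)):
                    dataset.append((p, mri,
                                    os.path.join(plane_folder, fname)))
    return dataset
```

You would then load each path with your image library of choice and attach the patient's category (1–4) as the label before feeding the arrays to Keras.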
Hope it helps!
Use this loop:
for root, _, files in os.walk("data_folder/"):
    for name in files:
        file_path = os.path.join(root, name)
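As a self-contained sketch of that approach (the helper name `collect_files` is mine, not from the answer), `os.walk` recurses through every subdirectory for you, so you do not need to know the folder layout in advance:

```python
import os

def collect_files(top):
    """Recursively gather the full path of every file below `top`.
    os.walk yields (dirpath, dirnames, filenames) for each directory."""
    paths = []
    for dirpath, _, filenames in os.walk(top):
        for name in filenames:
            paths.append(os.path.join(dirpath, name))
    return sorted(paths)
```

Calling `collect_files("data_folder/")` would return every image path under the patient tree in one flat, sorted list.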
Use os.walk to recursively visit all the files in the subdirectories and add them to the dataset.