Hello everybody,
For my Deep Learning exam, I have to develop a project that generates 3D images in .nii format using the DCGAN algorithm, trained on real MRI scans of the brains of patients with Alzheimer's disease.
I have a serious problem. I need to load three different datasets that weigh 3 GB, 5 GB, and 8 GB respectively. Unfortunately, I am forced to use an old PC with only 4 GB of RAM and 2 GB of swap memory, so it is impossible for me to load all the files at once using this simple code:
Loading files:

import os
import nibabel as nib

# Load every MRI volume into memory at once (this is what exhausts RAM)
data_path = './data/ADNI_test/'
filenames = os.listdir(data_path)

train_data = []
for filename in filenames:
    mri_file = os.path.join(data_path, filename)
    train_data.append(nib.load(mri_file).get_fdata())
Does anyone know an alternative solution? Is there a way to load the files a little at a time without overloading memory? In the DCGAN algorithm the batch size is set to 64 files, but I will certainly have to decrease it to 30.
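I was thinking of something like a generator that loads only one batch of files at a time instead of the whole dataset. This is just a rough sketch of the idea (train_on is a placeholder for the actual training step, and I'm assuming all volumes share the same shape):

import os
import numpy as np
import nibabel as nib

data_path = './data/ADNI_test/'

def batch_generator(filenames, batch_size=30):
    # Yield one batch of MRI volumes at a time, loading only
    # batch_size files into memory per iteration
    for start in range(0, len(filenames), batch_size):
        batch_files = filenames[start:start + batch_size]
        batch = [nib.load(os.path.join(data_path, f)).get_fdata()
                 for f in batch_files]
        yield np.stack(batch)  # assumes all volumes have the same shape

# for batch in batch_generator(sorted(os.listdir(data_path)), batch_size=30):
#     train_on(batch)  # placeholder for the DCGAN training step

Would something like this work, or is there a better approach?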
Thank you!