PyTorch tensors using all RAM
I have a list of tensors which is too heavy for my RAM. I would like to save them in the filesystem and load them when needed:
torch.save(single_tensor, 'tensor_<idx>.pt')
If I want to use batches while training, is there an automatic way to load the tensors when needed? I was thinking about using TensorDataset and DataLoader, but since the tensors are now in the filesystem rather than in a list, how should I build them?
First, save the tensors one by one to files with torch.save():
torch.save(tensor, 'path/to/file.pt')
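For instance, a minimal sketch of that saving loop, assuming the tensors are still in an in-memory list (the name tensors and the directory embeds are hypothetical), writing each one to an indexed file matching the naming scheme used by the Dataset below:

import os
import torch

# Hypothetical list of tensors that no longer fits comfortably in RAM
tensors = [torch.randn(128) for _ in range(10)]

os.makedirs('embeds', exist_ok=True)
for i, t in enumerate(tensors):
    # One file per tensor, indexed by position: embeds/0.pt, embeds/1.pt, ...
    torch.save(t, os.path.join('embeds', str(i) + '.pt'))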
Then this Dataset class allows loading the tensors only when they are really needed:
import os
import torch

class EmbedDataset(torch.utils.data.Dataset):
    def __init__(self, first_embed_path, second_embed_path, labels):
        self.first_embed_path = first_embed_path
        self.second_embed_path = second_embed_path
        self.labels = labels

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, i):
        label = self.labels[i]
        # Load the two tensors for sample i from disk only when requested
        embed = torch.load(os.path.join(self.first_embed_path, str(i) + '.pt'))
        pos = torch.load(os.path.join(self.second_embed_path, str(i) + '.pt'))
        # Concatenate the embedding and the positional tensor into one input
        tensor = torch.cat((embed, pos))
        return tensor, label
Here the tensors are named with numbers, e.g. 1.pt or 1816.pt.
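Such a dataset can then be handed directly to a DataLoader, which handles batching and only calls __getitem__ (and therefore only loads from disk) for the samples of the current batch. A minimal usage sketch, where the directory names 'embeds' and 'positions' and the labels list are hypothetical:

from torch.utils.data import DataLoader

# Hypothetical paths and labels; replace with your own
labels = [0, 1, 0, 1]
dataset = EmbedDataset('embeds', 'positions', labels)
loader = DataLoader(dataset, batch_size=2, shuffle=True)

for batch_tensors, batch_labels in loader:
    pass  # training step: only this batch's tensors are in RAM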