
How to easily convert a PyTorch dataloader to tf.Dataset?

How can we convert a PyTorch DataLoader to a tf.data.Dataset?

I found this snippet:

def convert_pytorch_dataloader_to_tf_dataset(dataloader, batch_size, shuffle=True):
    dataset = tf.data.Dataset.from_generator(
        lambda: dataloader,
        output_types=(tf.float32, tf.float32),
        output_shapes=(tf.TensorShape([256, 512]), tf.TensorShape([2,]))
    )
    if shuffle:
        dataset = dataset.shuffle(buffer_size=len(dataloader.dataset))
    dataset = dataset.batch(batch_size)
    return dataset

But it doesn't work at all.

Is there a built-in option to export DataLoaders to tf.data.Datasets easily? I have a very complex dataloader, so a simple solution would help keep things bug-free :)
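For context, the general pattern I am aiming for looks something like the sketch below (not verified on my real dataloader): it infers the element spec from one sample batch instead of hard-coding the shapes. The dataloader_to_tf_dataset helper and the toy DataLoader at the bottom are just placeholders.

import numpy as np
import tensorflow as tf
import torch
from torch.utils.data import DataLoader, TensorDataset

def dataloader_to_tf_dataset(dataloader):
    # Peek at one batch to build TensorSpecs with a dynamic batch dimension.
    features, labels = next(iter(dataloader))
    spec = (
        tf.TensorSpec(shape=(None,) + tuple(features.shape[1:]), dtype=tf.float32),
        tf.TensorSpec(shape=(None,) + tuple(labels.shape[1:]), dtype=tf.float32),
    )

    def gen():
        # Iterate the PyTorch loader and hand numpy arrays to TensorFlow.
        for x, y in dataloader:
            yield x.numpy().astype(np.float32), y.numpy().astype(np.float32)

    return tf.data.Dataset.from_generator(gen, output_signature=spec)

# Toy loader: 1024 samples in batches of 256, with 512 features and 2 targets.
loader = DataLoader(TensorDataset(torch.randn(1024, 512), torch.randn(1024, 2)),
                    batch_size=256)
ds = dataloader_to_tf_dataset(loader)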

If your data is in HDF5 (h5py) format, you can use the class below. name_x is the name of the features dataset in your HDF5 file and name_y is the name of the labels dataset. This approach is memory efficient because batches are read from disk one at a time.

import h5py
import numpy as np
import tensorflow as tf


class Generator(object):

    def __init__(self, open_directory, batch_size, name_x, name_y):

        self.open_directory = open_directory

        data_f = h5py.File(open_directory, "r")

        # HDF5 datasets are read lazily, so the whole file never has to fit in memory.
        self.x = data_f[name_x]
        self.y = data_f[name_y]

        # Output shapes with a dynamic batch dimension (the last batch may be smaller).
        if len(self.x.shape) == 4:
            self.shape_x = (None, self.x.shape[1], self.x.shape[2], self.x.shape[3])
        elif len(self.x.shape) == 3:
            self.shape_x = (None, self.x.shape[1], self.x.shape[2])

        if len(self.y.shape) == 4:
            self.shape_y = (None, self.y.shape[1], self.y.shape[2], self.y.shape[3])
        elif len(self.y.shape) == 3:
            self.shape_y = (None, self.y.shape[1], self.y.shape[2])

        self.num_samples = self.x.shape[0]
        self.batch_size = batch_size
        # Ceiling division: one extra, smaller batch if num_samples is not a multiple of batch_size.
        self.epoch_size = self.num_samples // self.batch_size + 1 * (self.num_samples % self.batch_size != 0)

        # Shuffled index order; the pointer walks through it batch by batch.
        self.pointer = 0
        self.sample_nums = np.arange(0, self.num_samples)
        np.random.shuffle(self.sample_nums)

    def data_generator(self):

        for batch_num in range(self.epoch_size):

            x = []
            y = []

            for elem_num in range(self.batch_size):

                sample_num = self.sample_nums[self.pointer]

                x += [self.x[sample_num]]
                y += [self.y[sample_num]]

                self.pointer += 1

                # End of epoch: reset the pointer, reshuffle, and emit the (possibly smaller) last batch.
                if self.pointer == self.num_samples:
                    self.pointer = 0
                    np.random.shuffle(self.sample_nums)
                    break

            x = np.array(x, dtype=np.float32)
            y = np.array(y, dtype=np.float32)

            yield x, y

    def get_dataset(self):
        dataset = tf.data.Dataset.from_generator(self.data_generator,
                                                 output_types=(tf.float32,
                                                               tf.float32),
                                                 output_shapes=(tf.TensorShape(self.shape_x),
                                                                tf.TensorShape(self.shape_y)))
        # Prefetch one batch so the next batch is read from disk while the current one is consumed.
        dataset = dataset.prefetch(1)

        return dataset
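For example, assuming an HDF5 file named data.h5 with datasets called "x" and "y" (placeholder names, adjust them to your file), you can sanity-check the pipeline and then pass it to Keras:

gen = Generator("data.h5", batch_size=32, name_x="x", name_y="y")
dataset = gen.get_dataset()

# Pull one batch and inspect the shapes.
for x_batch, y_batch in dataset.take(1):
    print(x_batch.shape, y_batch.shape)

# For training, something like the following should work, since repeat()
# restarts the generator after each epoch:
# model.fit(dataset.repeat(), steps_per_epoch=gen.epoch_size, epochs=10)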
