
What is the best way to train an LSTM network on batches of sequences with different lengths?

So I have a sequence-to-sequence problem where the input is many multivariate sequences of different lengths and the output is a sequence of binary vectors with the same length as its input counterpart. I grouped sequences of the same length together in separate folders and called the fit function like this:

import os
import numpy as np

for e in range(epochs):
    print('Epoch', e+1)
    for i in range(3,19):
        train_x_batch,train_y_batch,batch_size= get_data(i)
        history = model.fit(train_x_batch, train_y_batch,
                    batch_size=batch_size,
                    validation_split=0.15,
                    callbacks=[tensorboard_cb])

def get_data(i):
    train_x = np.load(os.path.join(cwd, "lab_values","batches",f"f_{i}","train_x.npy"), allow_pickle=True)
    train_y = np.load(os.path.join(cwd, "lab_values","batches",f"f_{i}","train_y.npy"), allow_pickle=True)
    print(f"batch no {i} Train X size= ", train_x.shape)
    print(f"batch no {i} Train Y size= ", train_y.shape)
    batch_size = train_x.shape[0]
    return train_x, train_y, batch_size

So the question is: is there a better way of doing this? I heard that I can use a generator for this, but unfortunately I could not implement one.

You are trying to train on the entire data (npy file) at once instead of training the model in batches.

We can write a generator and train the model in batches.

We extract batches of data from an existing NumPy file using the code

train_x = np.load(os.path.join(cwd, "lab_values","batches",f"f_{i}","train_x.npy"), mmap_mode='r', allow_pickle=True)

and

x_batch = train_x[start:end].copy()
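
For illustration, here is a minimal, self-contained sketch of that memmap slice-and-copy pattern; the file name demo.npy and the array shape are made up for this example and are not part of the original answer:

import numpy as np

# Create a small .npy file to stand in for one of the length-bucketed files
np.save("demo.npy", np.random.rand(1000, 20, 5))   # (samples, timesteps, features)

# Memory-map the file: the data stays on disk until it is accessed
train_x = np.load("demo.npy", mmap_mode='r')

# Slicing the memmap returns a view; .copy() materialises only this batch in RAM
start, end = 0, 32
x_batch = train_x[start:end].copy()
print(x_batch.shape)   # (32, 20, 5)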

The complete code for the generator and for training is shown below:

import os
import numpy as np

def get_data(i):
    # Memory-map the arrays so that only the sliced batch is read into RAM
    train_x = np.load(os.path.join(cwd, "lab_values", "batches", f"f_{i}", "train_x.npy"),
                      mmap_mode='r', allow_pickle=True)
    train_y = np.load(os.path.join(cwd, "lab_values", "batches", f"f_{i}", "train_y.npy"),
                      mmap_mode='r', allow_pickle=True)
    print(f"batch no {i} Train X size= ", train_x.shape)
    print(f"batch no {i} Train Y size= ", train_y.shape)
    number_of_rows = train_x.shape[0]
    batch_size = 32
    while True:  # yield batches indefinitely so steps_per_epoch controls the epoch length
        start = np.random.choice(number_of_rows - batch_size)
        end = start + batch_size
        x_batch = train_x[start:end].copy()
        y_batch = train_y[start:end].copy()
        yield x_batch, y_batch

for e in range(epochs):
    print('Epoch', e + 1)
    for i in range(3, 19):
        # validation_split is not supported when fitting on a generator;
        # pass a separate validation_data generator if validation is needed
        history = model.fit(get_data(i),
                            steps_per_epoch=500,
                            epochs=1,
                            callbacks=[tensorboard_cb])
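
Not part of the original answer, but as a rough alternative sketch: a tf.keras.utils.Sequence can wrap the same per-length folders and let a single model.fit call handle the epoch loop. The class name BucketedSequence and its parameters are illustrative, and this assumes a TensorFlow 2.x Keras setup with the same f_{i} directory layout as in the question:

import os
import numpy as np
import tensorflow as tf

class BucketedSequence(tf.keras.utils.Sequence):
    """Serves one length bucket (folder f_i) per step, so every batch
    contains sequences of a single length. Illustrative sketch only."""

    def __init__(self, root, bucket_ids):
        super().__init__()
        self.root = root                        # e.g. the "lab_values/batches" directory
        self.bucket_ids = list(bucket_ids)

    def __len__(self):
        return len(self.bucket_ids)             # one training step per bucket

    def __getitem__(self, idx):
        i = self.bucket_ids[idx]
        x = np.load(os.path.join(self.root, f"f_{i}", "train_x.npy"), allow_pickle=True)
        y = np.load(os.path.join(self.root, f"f_{i}", "train_y.npy"), allow_pickle=True)
        return x, y

    def on_epoch_end(self):
        np.random.shuffle(self.bucket_ids)      # visit the buckets in a new order each epoch

# Usage (assuming `model`, `cwd` and `tensorboard_cb` exist as in the question):
# seq = BucketedSequence(os.path.join(cwd, "lab_values", "batches"), range(3, 19))
# model.fit(seq, epochs=20, callbacks=[tensorboard_cb])

Because each __getitem__ returns a whole bucket, every batch has a uniform sequence length, which is exactly what the per-folder grouping in the question already provides.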

For more information, please refer to this SO question and this SO question as well.
