I'm trying to implement a simple word-level sequence-to-sequence model with Keras in Colab. I'm using the Keras Attention layer. Here is the definitio ...
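A minimal sketch of how the built-in tf.keras.layers.Attention layer is typically wired into a word-level seq2seq model; all layer sizes and variable names here are illustrative, not taken from the question:

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab_size, embed_dim, units = 10000, 128, 256

# Encoder: keep the full sequence of hidden states for attention.
enc_inputs = layers.Input(shape=(None,))
enc_emb = layers.Embedding(vocab_size, embed_dim)(enc_inputs)
enc_outputs, enc_h, enc_c = layers.LSTM(
    units, return_sequences=True, return_state=True)(enc_emb)

# Decoder, initialized from the encoder's final state.
dec_inputs = layers.Input(shape=(None,))
dec_emb = layers.Embedding(vocab_size, embed_dim)(dec_inputs)
dec_outputs = layers.LSTM(units, return_sequences=True)(
    dec_emb, initial_state=[enc_h, enc_c])

# Dot-product attention: query = decoder states, value = encoder states.
context = layers.Attention()([dec_outputs, enc_outputs])
concat = layers.Concatenate()([dec_outputs, context])
logits = layers.Dense(vocab_size)(concat)

model = tf.keras.Model([enc_inputs, dec_inputs], logits)
```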
Is there a way to avoid tfp.distributions.Categorical.log_prob raising an error if the input is a label out of range? I am passing a batch of samples ...
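One common workaround is to mask the invalid labels before calling log_prob rather than letting the distribution validate them. A minimal sketch with assumed shapes and a -inf sentinel for the out-of-range entries:

```python
import tensorflow as tf
import tensorflow_probability as tfp

logits = tf.random.normal([4, 10])    # batch of 4 samples, 10 classes
labels = tf.constant([3, 12, 7, -1])  # 12 and -1 are out of range

dist = tfp.distributions.Categorical(logits=logits)

# Clamp invalid labels to a legal index, then overwrite their log-probs.
valid = (labels >= 0) & (labels < logits.shape[-1])
safe_labels = tf.where(valid, labels, tf.zeros_like(labels))
log_probs = dist.log_prob(safe_labels)
log_probs = tf.where(valid, log_probs, tf.fill(tf.shape(log_probs), float('-inf')))
```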
I want to build an LSTM model for the FashionMNIST dataset in PyTorch. I will later need to extend this to a different dataset that contains videos ...
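A minimal sketch of the usual recipe: treat each 28x28 FashionMNIST image as a sequence of 28 rows with 28 features per step (class name and sizes are illustrative):

```python
import torch
import torch.nn as nn

class RowLSTM(nn.Module):
    def __init__(self, input_size=28, hidden_size=128, num_classes=10):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):           # x: (batch, 1, 28, 28)
        x = x.squeeze(1)            # -> (batch, 28, 28): 28 time steps
        out, _ = self.lstm(x)
        return self.fc(out[:, -1])  # classify from the last step

model = RowLSTM()
logits = model(torch.randn(8, 1, 28, 28))  # -> (8, 10)
```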
I'm trying to go seq2seq with a Transformer model. My input and output are the same shape (torch.Size([499, 128]) where 499 is the sequence length and ...
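A minimal sketch of nn.Transformer with same-shape source and target tensors, assuming the reported torch.Size([499, 128]) is (seq_len, d_model) and adding an illustrative batch dimension in the default (seq, batch, d_model) layout:

```python
import torch
import torch.nn as nn

d_model, seq_len, batch = 128, 499, 4
model = nn.Transformer(d_model=d_model, nhead=8)

src = torch.randn(seq_len, batch, d_model)
tgt = torch.randn(seq_len, batch, d_model)

# Causal mask so each target position only attends to earlier positions.
tgt_mask = model.generate_square_subsequent_mask(seq_len)
out = model(src, tgt, tgt_mask=tgt_mask)  # -> (seq_len, batch, d_model)
```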
I have an encoder decoder network mimicking the one produced in this tutorial: https://towardsdatascience.com/how-to-implement-seq2seq-lstm-model-in-k ...
I have a seq2seq model trained on some Cleverbot data: justphrases_X is a list of sentences and justphrases_Y is a list of responses to those sen ...
I am trying to understand how to implement a seq-to-seq model with attention from this website. My question: does nn.Embedding just return some IDs ...
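A minimal sketch showing what nn.Embedding actually does: it is a trainable lookup table that maps integer token IDs to dense vectors, not the other way around (sizes are illustrative):

```python
import torch
import torch.nn as nn

emb = nn.Embedding(num_embeddings=10, embedding_dim=4)  # 10 tokens, 4-dim vectors
ids = torch.tensor([[1, 5, 3]])                         # a batch of token IDs
vectors = emb(ids)                                      # float tensor, learned by backprop
print(vectors.shape)  # torch.Size([1, 3, 4])
```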
I'm trying to predict a variable-length input/output many-to-many sequence using Keras; the dataframe below is a representation of the data. 5 colu ...
I am debugging a sequence-to-sequence model and purposely tried to perfectly overfit a small dataset of ~200 samples (sentence pairs of length between ...
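The overfit-a-tiny-dataset check itself is framework-agnostic; a minimal sketch with illustrative model, shapes, and hyperparameters: train on a fixed batch of ~200 samples and confirm the loss approaches zero. If it does not, the model or loss wiring is usually at fault rather than the data:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Embedding(100, 32), nn.Flatten(), nn.Linear(32 * 10, 100))
x = torch.randint(0, 100, (200, 10))  # fixed tiny dataset
y = torch.randint(0, 100, (200,))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(500):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
print(loss.item())  # should approach 0 if the pipeline is sound
```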
I am trying to develop a chatbot with an attention mechanism, but it gives errors like this. My input shape of x_train is (None, 27) and output shape is ...
I've tried to build a sequence-to-sequence model to predict a sensor signal over time based on its first few inputs (see figure below). The model wor ...
I've got a big problem. For my bachelor thesis I have to build a machine translation model with BERT, but I am not getting anywhere right now. Do you know ...
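One common starting point for MT with BERT is a BERT-to-BERT encoder-decoder. A minimal sketch assuming the HuggingFace transformers library and an illustrative German-to-English pair; the checkpoint choice is an assumption, not from the question:

```python
from transformers import BertTokenizer, EncoderDecoderModel

tok = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-multilingual-cased", "bert-base-multilingual-cased")

# The decoder needs to know how to start and pad generated sequences.
model.config.decoder_start_token_id = tok.cls_token_id
model.config.pad_token_id = tok.pad_token_id

batch = tok(["Das ist ein Test."], return_tensors="pt")
labels = tok(["This is a test."], return_tensors="pt").input_ids
loss = model(input_ids=batch.input_ids,
             attention_mask=batch.attention_mask,
             labels=labels).loss  # fine-tune by minimizing this
```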
This is the code for a seq2seq model with embedding that I wrote. We are using inference mode for predictions, particularly the encoder and deco ...
I have been debugging this issue for a while now. I have developed an LSTM encoder-decoder model which I plan to deploy in C++. Having saved the mod ...
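For the C++ deployment path, the usual route is exporting to the SavedModel format, which the TensorFlow C++ API (tensorflow::LoadSavedModel) can load. A minimal sketch with an illustrative model and export path:

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.LSTM(16, input_shape=(10, 8)),  # 10 time steps, 8 features
    layers.Dense(4),
])
tf.saved_model.save(model, "export/lstm_model")  # load this directory from C++
```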
I have a couple of questions: in a seq-to-seq model with varying input lengths, if you don't use the attention mask, the RNN may end up computing the ...
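A minimal sketch (pad ID 0 assumed) of building the padding mask in question, which keeps attention from attending to padded positions, e.g. as the key_padding_mask argument of nn.MultiheadAttention:

```python
import torch

PAD = 0
batch = torch.tensor([[5, 7, 2, 0, 0],
                      [3, 9, 4, 8, 1]])
key_padding_mask = batch.eq(PAD)  # True where padded -> ignored by attention
print(key_padding_mask)
# tensor([[False, False, False,  True,  True],
#         [False, False, False, False, False]])
```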
I have a list of sentences. I want to add padding to them, but when I use Keras pad_sequences like this: the result is: Why is it not working pro ...
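The usual pitfall here is that pad_sequences expects lists of integer token IDs, not raw strings, and pads at the front by default. A minimal sketch with illustrative sentences:

```python
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.preprocessing.text import Tokenizer

sentences = ["the cat sat", "the cat sat on the mat"]
tok = Tokenizer()
tok.fit_on_texts(sentences)
seqs = tok.texts_to_sequences(sentences)      # lists of ints, not strings
padded = pad_sequences(seqs, padding='post')  # zero-pad at the end
print(padded)
```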
I have an encoder-decoder model whose structure is the same as the one at machinelearningmastery.com with num_encoder_tokens = 1949, num_decoder_toke ...
I am trying to build a sequence-to-sequence model in Keras using an LSTM and a dense neural network. The encoder encodes the input, the encoded state and t ...
I'm working on a sequence forecasting problem and I don't have much experience in this area, so some of the below questions might be naive. FYI: I've ...
I am currently trying to include an embedding layer in my sequence-to-sequence autoencoder, built with the Keras functional API. The model code looks ...
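A minimal sketch (all sizes illustrative) of wiring an Embedding layer into a functional-API sequence autoencoder, with a sparse loss so the targets can stay as integer IDs:

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab, dim, latent, steps = 5000, 64, 128, 20

inp = layers.Input(shape=(steps,))
x = layers.Embedding(vocab, dim, mask_zero=True)(inp)
encoded = layers.LSTM(latent)(x)                # bottleneck vector
x = layers.RepeatVector(steps)(encoded)         # feed it to every decoder step
x = layers.LSTM(latent, return_sequences=True)(x)
out = layers.TimeDistributed(layers.Dense(vocab, activation='softmax'))(x)

autoencoder = tf.keras.Model(inp, out)
autoencoder.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
```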