from tensorflow import keras
from tensorflow.keras import Sequential

model = Sequential()
# Frozen pretrained embeddings
model.add(keras.layers.Embedding(vocab_size, output_dim=100, input_length=input_len,
                                 weights=[embedding_matrix], trainable=False))
model.add(keras.layers.Bidirectional(keras.layers.LSTM(512, return_sequences=True,
                                                       recurrent_dropout=0.2, dropout=0.2)))
model.add(keras.layers.Bidirectional(keras.layers.LSTM(512, return_sequences=True,
                                                       recurrent_dropout=0.2, dropout=0.2)))
model.add(keras.layers.Dense(128, activation="relu"))
model.add(keras.layers.TimeDistributed(keras.layers.Dense(vocab_size_label, activation="softmax")))
model.compile(optimizer=optim, loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
I have built a Bi-LSTM model for NER tagging and now I want to add a CRF layer to it. I am confused about how to insert the CRF layer using TensorFlow:
tfa.text.crf_log_likelihood(
inputs,
tag_indices,
sequence_lengths,
transition_params=None
)
I found this in tfa.text and have two questions about this function: 1. How do I pass these arguments? 2. Do I have to use its output (the negative of the log-likelihood) as the loss when compiling the model? Can someone please help me with this?
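To make the arguments concrete, here is a hedged, NumPy-only sketch of the quantity that crf_log_likelihood computes for a single sequence: inputs are the per-timestep unary scores (logits) of shape (seq_len, num_tags), tag_indices are the gold tag ids, and transition_params is the (num_tags, num_tags) tag-transition score matrix. The function name and the toy shapes below are illustrative, not part of tfa's API:

```python
import numpy as np
from itertools import product

def crf_log_likelihood_reference(unary, tags, transition):
    """Brute-force CRF log-likelihood for one sequence.

    unary: (T, K) per-step tag scores, tags: (T,) gold tag ids,
    transition: (K, K) transition scores. Exponential in T, so
    only usable for tiny examples.
    """
    T, K = unary.shape

    def seq_score(seq):
        s = sum(unary[t, seq[t]] for t in range(T))
        s += sum(transition[seq[t - 1], seq[t]] for t in range(1, T))
        return s

    gold = seq_score(tags)
    # Log partition function: logsumexp over all K**T tag sequences.
    all_scores = [seq_score(seq) for seq in product(range(K), repeat=T)]
    m = max(all_scores)
    log_z = m + np.log(sum(np.exp(s - m) for s in all_scores))
    return gold - log_z

rng = np.random.default_rng(0)
unary = rng.normal(size=(4, 3))        # 4 timesteps, 3 tags
transition = rng.normal(size=(3, 3))
tags = np.array([0, 2, 1, 1])

ll = crf_log_likelihood_reference(unary, tags, transition)
loss = -ll  # the quantity you would minimize during training
```

In tfa itself, crf_log_likelihood returns a pair (log_likelihood, transition_params), and the training loss is the negative mean log-likelihood over the batch. Because a Sequential model compiled with a string loss cannot express this (the loss depends on trainable transition_params and on sequence_lengths), it has to be wired in via a custom loss or a custom model class.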
I am looking for this solution too, and I guess you should create a custom class to wrap the tfa.text.crf_log_likelihood method and then integrate it into your Keras model.
Maybe something like https://github.com/tensorflow/addons/issues/723#issuecomment-559636561
Or, in a more PyTorch-like style, like this: https://github.com/saiwaiyanyu/bi-lstm-crf-ner-tf2.0/blob/master/model.py
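One more piece either wrapper needs: at inference time a CRF is decoded with the Viterbi algorithm rather than a per-step argmax (in tfa this is tfa.text.crf_decode). As a hedged, NumPy-only sketch of what that decoding computes (the function name below is illustrative, not tfa's API):

```python
import numpy as np

def viterbi_decode(unary, transition):
    """Highest-scoring tag sequence under CRF scores.

    unary: (T, K) per-step tag scores, transition: (K, K)
    transition scores. Returns (best_path, best_score).
    """
    T, K = unary.shape
    score = unary[0].copy()                # best score ending in each tag
    backptr = np.zeros((T, K), dtype=int)  # best previous tag at each step
    for t in range(1, T):
        # cand[i, j]: score of ending at tag j via previous tag i
        cand = score[:, None] + transition + unary[t][None, :]
        backptr[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    # Backtrack from the best final tag.
    best = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1], float(score.max())

rng = np.random.default_rng(1)
unary = rng.normal(size=(5, 3))        # 5 timesteps, 3 tags
transition = rng.normal(size=(3, 3))
path, best_score = viterbi_decode(unary, transition)
```

The dynamic program runs in O(T * K^2), versus O(K^T) for enumerating all tag sequences, which is why decoding is done this way rather than by brute force.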