
How to fix “ValueError: not enough values to unpack (expected 2, got 1)”

I am trying to do sentiment analysis on a German tweet data set with the bert-base-german-cased model, which I load via the transformers library from Hugging Face.

To be able to calculate the predicted probabilities I want to apply a softmax, and this is where the issue begins:

F.softmax(model(input_ids, attention_mask), dim=1)

I got the error:

ValueError: not enough values to unpack (expected 2, got 1)

Does anyone know which values are expected here?

Everything works when I run it with:

self.bert = BertModel.from_pretrained(PRE_TRAINED_MODEL_NAME)

but I get the error when I switch to:

self.bert = AutoModelWithLMHead.from_pretrained("bert-base-german-cased")

As you can probably see, I am a noob, so please give simple and detailed explanations (understandable for a fish :D).


'input_ids' and 'attention_mask' are output values of the tokenization process.
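
For reference, a minimal sketch of how a Hugging Face tokenizer produces both values (the example sentence and the padding/length settings are illustrative assumptions, not taken from the original code):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-german-cased")
encoding = tokenizer(
    "Das ist ein Beispiel-Tweet.",  # illustrative sentence
    padding="max_length",
    max_length=32,
    truncation=True,
    return_tensors="pt",  # return PyTorch tensors
)
input_ids = encoding["input_ids"]            # token ids, shape (1, 32)
attention_mask = encoding["attention_mask"]  # 1s for real tokens, 0s for padding, shape (1, 32)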

It's a late answer, but it may help.

I had the same error. My problem was that 'input_ids' and 'attention_mask' have to be 2-D tensors, but I had them as 1-D tensors. So do

input_ids = input_ids.unsqueeze(0)
attention_mask = attention_mask.unsqueeze(0)

in your case.
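
For example, with dummy 1-D tensors (the token ids below are just placeholders), unsqueeze(0) adds the missing batch dimension:

import torch

input_ids = torch.tensor([3, 1135, 21, 4])    # shape: (4,)  -- a single sequence, 1-D
attention_mask = torch.ones_like(input_ids)   # shape: (4,)

input_ids = input_ids.unsqueeze(0)            # shape: (1, 4) -- a batch of one sequence
attention_mask = attention_mask.unsqueeze(0)  # shape: (1, 4)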

BertModel expects a batch of training instances (e.g. input_ids like [[...], [...]]). Hence, there should be no problem if you first batch your dataset (with something like DataLoader) and iterate over it.

It seems like you are passing a single training instance (e.g. input_ids like [...]) for now. A minimal sketch of that batching is shown below.
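
Here the tensors are random stand-ins for the tokenized tweets, just to show the shapes:

import torch
from torch.utils.data import TensorDataset, DataLoader

# Random stand-ins for the tokenized data set, shape (num_examples, seq_len).
all_input_ids = torch.randint(0, 30000, (100, 32))
all_attention_masks = torch.ones_like(all_input_ids)

dataset = TensorDataset(all_input_ids, all_attention_masks)
loader = DataLoader(dataset, batch_size=16)

for input_ids, attention_mask in loader:
    # Each batch is 2-D, shape (batch_size, seq_len), which is what BertModel expects.
    print(input_ids.shape, attention_mask.shape)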
