
TypeError: unhashable type: 'list' lemmatizer

There is an issue with the word_patterns list

This is the full error message

Traceback (most recent call last):
  File "C:/Users/narek/PycharmProjects/pythonProject/AI Bot/training.py", line 48, in <module>
    word_patterns = [lemmatizer.lemmatize(word) for word in word_patterns]
  File "C:/Users/narek/PycharmProjects/pythonProject/AI Bot/training.py", line 48, in <listcomp>
    word_patterns = [lemmatizer.lemmatize(word) for word in word_patterns]
  File "C:\Users\narek\AppData\Local\Programs\Python\Python38\lib\site-packages\nltk\stem\wordnet.py", line 38, in lemmatize
    lemmas = wordnet._morphy(word, pos)
  File "C:\Users\narek\AppData\Local\Programs\Python\Python38\lib\site-packages\nltk\corpus\reader\wordnet.py", line 1897, in _morphy
    if form in exceptions:
TypeError: unhashable type: 'list'
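The last two frames show that lemmatize() passes its argument to wordnet._morphy(), which uses it as a dictionary key (if form in exceptions), so the argument must be a hashable string. Passing a list instead of a string reproduces the same error, as in this minimal sketch (it assumes nltk.download('wordnet') has already been run, as in the code below):

from nltk.stem import WordNetLemmatizer

lemmatizer = WordNetLemmatizer()

lemmatizer.lemmatize("dogs")     # works: returns 'dog'
lemmatizer.lemmatize(["dogs"])   # raises TypeError: unhashable type: 'list'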

import json
import pickle
import random
import numpy as np

import nltk
from tensorflow.python.keras.models import Sequential
from tensorflow.python.layers.core import Dense, Dropout
from tensorflow.keras.optimizers import SGD

nltk.download('punkt')
nltk.download('wordnet')
from nltk.stem import WordNetLemmatizer

lemmatizer = WordNetLemmatizer()

intents = json.loads(open('intense.json').read())

words = []
classes = []
documents = []
ignore_letters = ['?', '!', '.', ',']

for intent in intents['intents']:
    for pattern in intent['patterns']:
        word_list = nltk.word_tokenize(pattern)
        words.extend(word_list)
        documents.append((word_list, intent['tag']))
        if intent['tag'] not in classes:
            classes.append(intent['tag'])

words = [lemmatizer.lemmatize(word) for word in words if word not in ignore_letters]
words = sorted(set(words))

classes = sorted(set(classes))

pickle.dump(words, open('words.pkl', 'wb'))
pickle.dump(classes, open('classes.pkl', 'wb'))

training = []
output_empty = [0] * len(classes)

for document in documents:
    bag = []
    word_patterns = documents[0]
    # Error Happens Here 
    word_patterns = [lemmatizer.lemmatize(word) for word in word_patterns]
    for word in words:
        bag.append(1) if word in word_patterns else bag.append(0)

    output_row = list(output_empty)
    output_row[classes.index(document[1])] = 1
    training.append([bag, output_row])
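For reference, after the first loop each element of documents is a (word_list, tag) tuple. With hypothetical patterns it might look like:

documents = [(['Hi', 'there'], 'greeting'), (['Good', 'bye'], 'goodbye')]

so documents[0] is a tuple, and iterating over it passes the inner token list (here ['Hi', 'there']) to lemmatizer.lemmatize(), which triggers the TypeError shown above.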

Try replacing

documents.append((word_list, intent['tag']))

with

documents.extend((word_list, intent['tag']))

It is found under the "for intent in intents['intents']:" loop; in context, the change looks like the sketch below.
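A sketch of that loop from the question's code with the suggested replacement applied (everything else unchanged):

for intent in intents['intents']:
    for pattern in intent['patterns']:
        word_list = nltk.word_tokenize(pattern)
        words.extend(word_list)
        documents.extend((word_list, intent['tag']))  # suggested change: extend instead of append
        if intent['tag'] not in classes:
            classes.append(intent['tag'])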
