
TypeError: 'str' object is not an iterator

I am trying to run a basic CNN using Anaconda on macOS. All of my Keras packages are up to date (at least I think so, but I'm not certain).

I am able to run everything except for this line:

classifier.fit_generator('training_set',
                         steps_per_epoch = 8000,
                         epochs = 25,
                         validation_data = test_set,
                         validation_steps = 2000)

When I attempt to run that, I get the error:

TypeError: 'str' object is not an iterator

This is my code:

# Importing the Keras libraries and packages
from keras.models import Sequential
from keras.layers import Conv2D
from keras.layers import MaxPooling2D
from keras.layers import Flatten
from keras.layers import Dense

# Initialising the CNN
classifier = Sequential()

# Step 1 - Convolution
classifier.add(Conv2D(32, (3, 3), input_shape = (64, 64, 3), activation = 'relu'))

# Step 2 - Pooling
classifier.add(MaxPooling2D(pool_size = (2, 2)))

# Adding a second convolutional layer
classifier.add(Conv2D(32, (3, 3), activation = 'relu'))
classifier.add(MaxPooling2D(pool_size = (2, 2)))

# Step 3 - Flattening
classifier.add(Flatten())

# Step 4 - Full connection
classifier.add(Dense(units = 128, activation = 'relu'))
classifier.add(Dense(units = 1, activation = 'sigmoid'))

# Compiling the CNN
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])

# Part 2 - Fitting the CNN to the images

from keras.preprocessing.image import ImageDataGenerator

train_datagen = ImageDataGenerator(rescale = 1./255,
                                   shear_range = 0.2,
                                   zoom_range = 0.2,
                                   horizontal_flip = True)

test_datagen = ImageDataGenerator(rescale = 1./255)

training_set = train_datagen.flow_from_directory('/Users/Dan/Desktop/CNN/dataset/training_set',
                                                 target_size = (64, 64),
                                                 batch_size = 32,
                                                 class_mode = 'binary')

test_set = test_datagen.flow_from_directory('/Users/Dan/Desktop/CNN/dataset/test_set',
                                            target_size = (64, 64),
                                            batch_size = 32,
                                            class_mode = 'binary')

classifier.fit_generator('training_set',
                         steps_per_epoch = 8000,
                         epochs = 25,
                         validation_data = test_set,
                         validation_steps = 2000)

# Saving Weights
weights = classifier.save_weights

"""
Single Prediction
"""
import numpy as np
from keras.preprocessing import image


test_image = image.load_img(('dataset/predictions/cat_or_dog_2.jpg'), target_size=(64, 64))
test_image = image.img_to_array(test_image)
test_image = np.expand_dims(test_image, axis = 0)
result = classifier.predict(test_image)
training_set.class_indices
if result[0][0] == 1:
    prediction = 'Dog'
else:
    prediction = 'Cat'

And this is the code itself running up to the error:

from keras.models import Sequential
from keras.layers import Conv2D
from keras.layers import MaxPooling2D
from keras.layers import Flatten
from keras.layers import Dense

# Initialising the CNN
classifier = Sequential()

# Step 1 - Convolution
classifier.add(Conv2D(32, (3, 3), input_shape = (64, 64, 3), activation = 'relu'))

# Step 2 - Pooling
classifier.add(MaxPooling2D(pool_size = (2, 2)))

# Adding a second convolutional layer
classifier.add(Conv2D(32, (3, 3), activation = 'relu'))
classifier.add(MaxPooling2D(pool_size = (2, 2)))

# Step 3 - Flattening
classifier.add(Flatten())

# Step 4 - Full connection
classifier.add(Dense(units = 128, activation = 'relu'))
classifier.add(Dense(units = 1, activation = 'sigmoid'))

# Compiling the CNN
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
Using TensorFlow backend.
2019-11-25 19:39:19.093497: I tensorflow/core/platform/cpu_feature_guard.cc:145] This TensorFlow binary is optimized with Intel(R) MKL-DNN to use the following CPU instructions in performance critical operations:  SSE4.1 SSE4.2 AVX AVX2 FMA
To enable them in non-MKL-DNN operations, rebuild TensorFlow with the appropriate compiler flags.
2019-11-25 19:39:19.095093: I tensorflow/core/common_runtime/process_util.cc:115] Creating new thread pool with default inter op setting: 4. Tune using inter_op_parallelism_threads for best performance.

from keras.preprocessing.image import ImageDataGenerator

train_datagen = ImageDataGenerator(rescale = 1./255,
                                   shear_range = 0.2,
                                   zoom_range = 0.2,
                                   horizontal_flip = True)

test_datagen = ImageDataGenerator(rescale = 1./255)

training_set = train_datagen.flow_from_directory('/Users/Dan/Desktop/CNN/dataset/training_set',
                                                 target_size = (64, 64),
                                                 batch_size = 32,
                                                 class_mode = 'binary')
Found 8000 images belonging to 2 classes.

test_set = test_datagen.flow_from_directory('/Users/Dan/Desktop/CNN/dataset/test_set',
                                            target_size = (64, 64),
                                            batch_size = 32,
                                            class_mode = 'binary')
Found 2000 images belonging to 2 classes.

classifier.fit_generator('training_set',
                         steps_per_epoch = 8000,
                         epochs = 25,
                         validation_data = test_set,
                         validation_steps = 2000)
Epoch 1/25
Traceback (most recent call last):

  File "<ipython-input-7-e4696e5027ff>", line 5, in <module>
    validation_steps = 2000)

  File "/Users/Dan/opt/anaconda3/lib/python3.7/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)

  File "/Users/Dan/opt/anaconda3/lib/python3.7/site-packages/keras/engine/training.py", line 1732, in fit_generator
    initial_epoch=initial_epoch)

  File "/Users/Dan/opt/anaconda3/lib/python3.7/site-packages/keras/engine/training_generator.py", line 185, in fit_generator
    generator_output = next(output_generator)

  File "/Users/Dan/opt/anaconda3/lib/python3.7/site-packages/keras/utils/data_utils.py", line 742, in get
    six.reraise(*sys.exc_info())

  File "/Users/Dan/opt/anaconda3/lib/python3.7/site-packages/six.py", line 696, in reraise
    raise value

  File "/Users/Dan/opt/anaconda3/lib/python3.7/site-packages/keras/utils/data_utils.py", line 711, in get
    inputs = future.get(timeout=30)

  File "/Users/Dan/opt/anaconda3/lib/python3.7/multiprocessing/pool.py", line 657, in get
    raise self._value

  File "/Users/Dan/opt/anaconda3/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))

  File "/Users/Dan/opt/anaconda3/lib/python3.7/site-packages/keras/utils/data_utils.py", line 650, in next_sample
    return six.next(_SHARED_SEQUENCES[uid])

TypeError: 'str' object is not an iterator

Is there something I am missing, or a line that is wrong? Because I'm sure everything is correct.

You are passing a string as the first argument; you want to pass the training_set variable:

classifier.fit_generator(training_set,
                         steps_per_epoch = 8000,
                         epochs = 25,
                         validation_data = test_set,
                         validation_steps = 2000)
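
For reference, the traceback ends in Keras calling next() on whatever you pass as the first argument (six.next(_SHARED_SEQUENCES[uid])). A string is iterable but not an iterator, so calling next() on it reproduces the exact same error. This is a quick, plain-Python sanity check, not anything Keras-specific:

>>> next('training_set')
Traceback (most recent call last):
  ...
TypeError: 'str' object is not an iterator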

I'm not familiar with the package, but checking the documentation, it shows that training_set should be a generator:

generator: A generator or an instance of Sequence (keras.utils.Sequence) object in order to avoid duplicate data when using multiprocessing. The output of the generator must be either a tuple (inputs, targets) or a tuple (inputs, targets, sample_weights). This tuple (a single output of the generator) makes a single batch. Therefore, all arrays in this tuple must have the same length (equal to the size of this batch). Different batches may have different sizes. For example, the last batch of the epoch is commonly smaller than the others, if the size of the dataset is not divisible by the batch size. The generator is expected to loop over its data indefinitely. An epoch finishes when steps_per_epoch batches have been seen by the model.

But you are using a string with the value 'training_set'; I'm guessing you mean training_set (without quotes). https://keras.io/models/sequential/
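
To illustrate the contract quoted above, here is a minimal sketch of something fit_generator could iterate over. The function name, shapes, and random data are made up purely for illustration and are not part of the original code:

import numpy as np

# Hypothetical example only: yields (inputs, targets) batches forever,
# which is the kind of output fit_generator expects from its first argument.
def dummy_generator(batch_size=32):
    while True:
        inputs = np.random.rand(batch_size, 64, 64, 3)          # fake 64x64 RGB image batch
        targets = np.random.randint(0, 2, size=(batch_size,))   # fake binary labels
        yield inputs, targets

# classifier.fit_generator(dummy_generator(), steps_per_epoch=250, epochs=1)

The flow_from_directory call in the question already returns such an object (a DirectoryIterator, which is a keras.utils.Sequence), which is why passing the training_set variable works where the string 'training_set' does not.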
