Data Augmentation for Inception v3
I'm trying to classify images with Inception v3, but my dataset is very small (I can't get any more images), so I want to augment it with transformations such as rotation or flipping. I'm new to TF and don't know how to do this. I've read the documentation for ImageDataGenerator, which is supposed to augment my data, but when training I still get an error saying I don't have enough data. I could also use masks, but I don't know how to do that in TF. Can someone enlighten me? Any input is greatly appreciated.
Here is my code:
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications.inception_v3 import InceptionV3
from tensorflow.keras import layers
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import RMSprop

train_datagen = ImageDataGenerator(rescale=1./255.,
                                   rotation_range=180,
                                   width_shift_range=0.2,
                                   height_shift_range=0.2,
                                   zoom_range=0.2,
                                   horizontal_flip=True,
                                   vertical_flip=True)
test_datagen = ImageDataGenerator(rescale=1./255.,
                                  rotation_range=180,
                                  width_shift_range=0.2,
                                  height_shift_range=0.2,
                                  zoom_range=0.2,
                                  horizontal_flip=True,
                                  vertical_flip=True)
train_generator = train_datagen.flow_from_directory(train_dir,
                                                    batch_size=100,
                                                    class_mode='binary',
                                                    target_size=(224, 224))
validation_generator = test_datagen.flow_from_directory(validation_dir,
                                                        batch_size=100,
                                                        class_mode='binary',
                                                        target_size=(224, 224))
base_model = InceptionV3(input_shape=(224, 224, 3),
                         include_top=False,
                         weights='imagenet')
for layer in base_model.layers:
    layer.trainable = False

x = layers.Flatten()(base_model.output)
x = layers.Dense(1024, activation='relu')(x)
x = layers.Dropout(0.2)(x)
x = layers.Dense(1, activation='sigmoid')(x)
model = Model(base_model.input, x)
model.compile(optimizer=RMSprop(learning_rate=0.0001),
              loss='binary_crossentropy',
              metrics=['acc'])
callbacks = myCallback()
history = model.fit_generator(train_generator,
                              validation_data=validation_generator,
                              steps_per_epoch=100,
                              epochs=10,
                              validation_steps=10,
                              verbose=2,
                              callbacks=[callbacks])
The error:
WARNING:tensorflow:Your input ran out of data; interrupting training. Make sure that your dataset or generator can generate at least `steps_per_epoch * epochs` batches (in this case, 1000 batches). You may need to use the repeat() function when building your dataset.
When you use a generator, you have to compute the number of steps per epoch as follows:
steps_per_epoch = (data_samples / batch_size)
Alternatively, you can let the model work out how many steps cover one epoch. Have you tried running it without the steps_per_epoch parameter?
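A minimal sketch of that calculation (the sample count of 950 is hypothetical; with a Keras flow_from_directory generator you would read it from train_generator.samples instead of a literal):

```python
import math

def steps_per_epoch(num_samples, batch_size):
    """Number of batches needed to cover every sample once per epoch.

    math.ceil accounts for the final, possibly smaller, batch.
    """
    return math.ceil(num_samples / batch_size)

# Hypothetical dataset size for illustration; the post's batch_size is 100.
print(steps_per_epoch(950, 100))  # -> 10
# With only 950 images, steps_per_epoch=100 and epochs=10 demand
# 100 * 10 = 1000 batches, but the generator can only yield 10 per epoch,
# which is exactly what the "ran out of data" warning is complaining about.
```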
Let us know if the problem persists. Thanks!