Transfer learning with a ResNet fails in TensorFlow 2

I am training a segmentation model with transfer learning, using TensorFlow 2 and Keras.

I have images (single-channel grayscale) and masks (binary). The model must be able to reproduce the mask for a given image.

Both the images and the masks have shape (256, 256).

Here is a heatmap of a typical mask converted to a numpy array, to give you an idea:

[heatmap image of a typical mask]

I am using Keras data generators for data augmentation.

My code is as follows:

train_datagen = ImageDataGenerator(
    preprocessing_function=tensorflow.keras.applications.resnet_v2.preprocess_input,
    rescale=1./255,
    shear_range=0.1,
    zoom_range=0.1,
    horizontal_flip=True)

val_datagen = ImageDataGenerator(rescale=1./255)

test_datagen = ImageDataGenerator(rescale=1./255)

train_image_generator = train_datagen.flow_from_dataframe(
    dataframe=train_examples_df,
    x_col='images',
    batch_size=2,
    target_size=(256, 256),
    class_mode=None,
    shuffle=False)

train_mask_generator = train_datagen.flow_from_dataframe(
    dataframe=train_examples_df,
    x_col='masks',
    shuffle=False)

val_image_generator = val_datagen.flow_from_dataframe(
    dataframe=val_examples_df,
    x_col='images',
    shuffle=False)

val_mask_generator = val_datagen.flow_from_dataframe(
    dataframe=val_examples_df,
    x_col='masks',
    shuffle=False)

test_image_generator = test_datagen.flow_from_dataframe(
    dataframe=test_examples_df,
    x_col='images',
    batch_size=1,
    shuffle=False)

test_mask_generator = test_datagen.flow_from_dataframe(
    dataframe=test_examples_df,
    x_col='masks',
    shuffle=False)

train_gen = zip(train_image_generator, train_mask_generator)

val_gen = zip(val_image_generator, val_mask_generator)

test_gen = zip(test_image_generator, test_mask_generator)
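As an aside on the pairing step: `zip` over two Keras iterators does work in Python 3, but an explicit generator makes the intent clearer and is trivial to check in isolation. A minimal sketch (`paired_generator` is my name for it, not a Keras API; any two iterables can stand in for the Keras iterators when testing):

```python
def paired_generator(image_gen, mask_gen):
    # Yield (inputs, targets) batch pairs indefinitely,
    # the shape Keras expects from a fit generator.
    while True:
        yield next(image_gen), next(mask_gen)
```

Because the generators above were all created with `shuffle=False`, the n-th image batch lines up with the n-th mask batch; with shuffling enabled the two streams would fall out of sync unless they share a seed.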

Then I define, compile, and train the model with the following code:

base_model = InceptionResNetV2(include_top=False, input_shape=(256, 256, 3))

x = base_model.output
x = GlobalMaxPooling2D()(x)
x = Dense(256**2, activation='relu')(x)
preds = Reshape((256, 256))(x)

model = Model(inputs=base_model.input, outputs=preds)

model.compile(optimizer='Adam', loss='binary_crossentropy', metrics=['accuracy'])

for layer in model.layers[:780]:
    layer.trainable = False

NO_OF_TRAINING_IMAGES = len(os.listdir('data\\train_frames'))
NO_OF_VAL_IMAGES = len(os.listdir('data\\val_frames'))

NO_OF_EPOCHS = 10
BATCH_SIZE = 2

results = model.fit_generator(
    train_gen,
    epochs=NO_OF_EPOCHS,
    steps_per_epoch=NO_OF_TRAINING_IMAGES // BATCH_SIZE,
    validation_data=val_gen,
    validation_steps=NO_OF_VAL_IMAGES // BATCH_SIZE)

which produces the following output:

Train for 276 steps, validate for 79 steps
Epoch 1/10
276/276 [==============================] - 241s 874ms/step - loss: -0.0605 - accuracy: 0.0000e+00 - val_loss: 0.0036 - val_accuracy: 0.9852
Epoch 2/10
276/276 [==============================] - 236s 856ms/step - loss: -0.0605 - accuracy: 0.0000e+00 - val_loss: 0.0034 - val_accuracy: 0.9852
Epoch 3/10
276/276 [==============================] - 238s 861ms/step - loss: -0.0605 - accuracy: 0.0000e+00 - val_loss: 0.0033 - val_accuracy: 0.9852
Epoch 4/10
276/276 [==============================] - 247s 894ms/step - loss: -0.0605 - accuracy: 0.0000e+00 - val_loss: 0.0033 - val_accuracy: 0.9852
Epoch 5/10
276/276 [==============================] - 255s 924ms/step - loss: -0.0605 - accuracy: 0.0000e+00 - val_loss: 0.0032 - val_accuracy: 0.9852
Epoch 6/10
276/276 [==============================] - 249s 904ms/step - loss: -0.0605 - accuracy: 0.0000e+00 - val_loss: 0.0031 - val_accuracy: 0.9852
Epoch 7/10
276/276 [==============================] - 252s 912ms/step - loss: -0.0605 - accuracy: 0.0000e+00 - val_loss: 0.0031 - val_accuracy: 0.9852
Epoch 8/10
276/276 [==============================] - 243s 882ms/step - loss: -0.0605 - accuracy: 0.0000e+00 - val_loss: 0.0030 - val_accuracy: 0.9852
Epoch 9/10
276/276 [==============================] - 251s 908ms/step - loss: -0.0605 - accuracy: 0.0000e+00 - val_loss: 0.0030 - val_accuracy: 0.9852
Epoch 10/10
276/276 [==============================] - 242s 878ms/step - loss: -0.0605 - accuracy: 0.0000e+00 - val_loss: 0.0030 - val_accuracy: 0.9852

It turns out the model is unable to reproduce the masks.

Moreover, I cannot make sense of the loss and accuracy reported during training. Why is the loss negative? Why is the accuracy 0?
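For what it's worth, binary cross-entropy is only guaranteed non-negative when the targets lie in [0, 1]. If the mask pipeline delivers targets outside that range (for example, if a preprocessing function meant for the images is also applied to the masks), the per-element loss can go negative. A quick standalone numpy check, independent of the model above:

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Per-element binary cross-entropy, as computed by Keras.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

print(binary_crossentropy(1.0, 0.9))  # valid target in [0, 1]: positive loss
print(binary_crossentropy(2.0, 0.9))  # out-of-range target: negative loss
```

Note that `train_mask_generator` above is built from `train_datagen`, which applies `resnet_v2.preprocess_input` on top of `rescale`, so this is one plausible route for the masks to leave [0, 1].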
