I have the following code:
# Imports the snippet relies on (standalone Keras; tensorflow.keras has the same layers)
from keras.layers import (Input, Conv2D, Conv2DTranspose, Concatenate,
                          MaxPooling2D, Dropout)

# input_shape and start_neurons are assumed to be defined earlier in the script
# Declare the layers
inp1 = Input(shape=input_shape, name="input1")
inp2 = Input(shape=input_shape, name="input2")
# 128 -> 64
conv1_inp1 = Conv2D(start_neurons * 1, (3, 3), activation="relu", padding="same")(inp1)
conv1_inp2 = Conv2D(start_neurons * 1, (3, 3), activation="relu", padding="same")(inp2)
conv1 = Concatenate()([conv1_inp1, conv1_inp2])
conv1 = Conv2D(start_neurons * 1, (3, 3), activation="relu", padding="same")(conv1)
pool1 = MaxPooling2D((2, 2))(conv1)
pool1 = Dropout(0.25)(pool1)
# 64 -> 32
conv2 = Conv2D(start_neurons * 2, (3, 3), activation="relu", padding="same")(pool1)
conv2 = Conv2D(start_neurons * 2, (3, 3), activation="relu", padding="same")(conv2)
pool2 = MaxPooling2D((2, 2))(conv2)
pool2 = Dropout(0.5)(pool2)
# 32 -> 16
conv3 = Conv2D(start_neurons * 4, (3, 3), activation="relu", padding="same")(pool2)
conv3 = Conv2D(start_neurons * 4, (3, 3), activation="relu", padding="same")(conv3)
pool3 = MaxPooling2D((2, 2))(conv3)
pool3 = Dropout(0.5)(pool3)
# 16 -> 8
conv4 = Conv2D(start_neurons * 8, (3, 3), activation="relu", padding="same")(pool3)
conv4 = Conv2D(start_neurons * 8, (3, 3), activation="relu", padding="same")(conv4)
pool4 = MaxPooling2D((2, 2))(conv4)
pool4 = Dropout(0.5)(pool4)
# Middle
convm = Conv2D(start_neurons * 16, (3, 3), activation="relu", padding="same")(pool4)
convm = Conv2D(start_neurons * 16, (3, 3), activation="relu", padding="same")(convm)
# 8 -> 16
deconv4 = Conv2DTranspose(start_neurons * 8, (3, 3), strides=(2, 2), padding="same")(convm)
uconv4 = Concatenate()([deconv4, conv4])
uconv4 = Dropout(0.5)(uconv4)
uconv4 = Conv2D(start_neurons * 8, (3, 3), activation="relu", padding="same")(uconv4)
uconv4 = Conv2D(start_neurons * 8, (3, 3), activation="relu", padding="same")(uconv4)
which produces this error:
Graph disconnected: cannot obtain value for tensor Tensor("input_28:0", shape=(?, 128, 128, 1), dtype=float32) at layer "input_28". The following previous layers were accessed without issue: []
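
For context, "Graph disconnected" generally means the output tensor handed to Model traces back to an Input layer that was never passed in the model's inputs, often one left over from an earlier run. A minimal sketch that reproduces the error, assuming standalone Keras and a hypothetical Dense head:

from keras.layers import Input, Dense
from keras.models import Model

stale_inp = Input(shape=(4,))   # e.g. created by a previously executed cell
fresh_inp = Input(shape=(4,))   # re-created Input with a new auto-generated name
out = Dense(1)(stale_inp)       # the graph actually consumes stale_inp

# Model is told fresh_inp is the input, but `out` traces back to stale_inp,
# so Keras raises: Graph disconnected: cannot obtain value for tensor ...
model = Model(inputs=fresh_inp, outputs=out)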
The inputs have the same shape. On some forums people said the problem is that the inputs come from two different sources, which breaks the link to the earlier part of the graph.
I really don't know how to fix this. Can anyone help me?
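
One clue: the two Input layers above are explicitly named input1 and input2, yet the error complains about an auto-named input_28, which suggests Input has been instantiated many times in the same session (for example by re-running notebook cells) and some layer still references one of those older tensors instead of inp1/inp2. A minimal sketch of the usual fix, assuming standalone Keras and a hypothetical start_neurons value: rebuild the whole graph in one fresh pass and list every Input in Model.

from keras.layers import Input, Conv2D, Concatenate
from keras.models import Model

input_shape = (128, 128, 1)   # matches the shape shown in the error message
start_neurons = 16            # hypothetical value

# Build everything in one pass so every layer traces back to these two tensors
inp1 = Input(shape=input_shape, name="input1")
inp2 = Input(shape=input_shape, name="input2")
x1 = Conv2D(start_neurons, (3, 3), activation="relu", padding="same")(inp1)
x2 = Conv2D(start_neurons, (3, 3), activation="relu", padding="same")(inp2)
x = Concatenate()([x1, x2])
out = Conv2D(1, (1, 1), activation="sigmoid")(x)

# Both Input tensors must be listed here; omitting one, or referencing an
# Input created in an earlier run, raises "Graph disconnected"
model = Model(inputs=[inp1, inp2], outputs=out)
model.summary()

Restarting the kernel and running the model definition top to bottom in one go usually clears out the stale tensors.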