Here are my blob shapes and layers:
-------------------------------- Blobs
data 4096 4.10e+03 (1, 2, 1, 2048)
Convolution1 32736 3.27e+04 (1, 16, 1, 2046)
ReLU1 32736 3.27e+04 (1, 16, 1, 2046)
Convolution2 32704 3.27e+04 (1, 16, 1, 2044)
ReLU2 32704 3.27e+04 (1, 16, 1, 2044)
...
Crop4 4224 4.22e+03 (1, 16, 1, 264)
Concat4 8448 8.45e+03 (1, 32, 1, 264)
Convolution17 4192 4.19e+03 (1, 16, 1, 262)
ReLU21 4192 4.19e+03 (1, 16, 1, 262)
Convolution18 4160 4.16e+03 (1, 16, 1, 260)
unet1 4160 4.16e+03 (1, 16, 1, 260)
ampl0 4096 4.10e+03 (1, 4096)
Reshape0 4096 4.10e+03 (1, 1, 1, 4096)
conv1 65472 6.55e+04 (1, 16, 1, 4092)
conv1_conv1_0_split_0 65472 6.55e+04 (1, 16, 1, 4092)
conv1_conv1_0_split_1 65472 6.55e+04 (1, 16, 1, 4092)
Scale1 65472 6.55e+04 (1, 16, 1, 4092)
ReLU22 65472 6.55e+04 (1, 16, 1, 4092)
Scale2 65472 6.55e+04 (1, 16, 1, 4092)
...
ReLU28 517120 5.17e+05 (1, 128, 8, 505)
Scale8 517120 5.17e+05 (1, 128, 8, 505)
ReLU29 517120 5.17e+05 (1, 128, 8, 505)
crelu4 1034240 1.03e+06 (1, 128, 16, 505)
maxPool4 518144 5.18e+05 (1, 128, 16, 253)
ampl 21 2.10e+01 (1, 21)
The error I get in the loss layer:
F0416 15:43:21.957676 95620 loss_layer.cpp:19] Check failed: bottom[0]->shape(0) == bottom[1]->shape(0) (1 vs. 10) The data and label should have the same first dimension.
Note: the error appeared after I added a fully-connected layer (ampl0) plus a Reshape layer (Reshape0) in the middle of the CNN. Without them it works fine!
Thanks for your help.
Update: the fully-connected and Reshape layers are:
layer {
  name: "ampl0"
  type: "InnerProduct"
  bottom: "unet1"
  top: "ampl0"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  inner_product_param {
    num_output: 4096
    bias_term: false
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.2
    }
  }
}
layer {
  name: "Reshape0"
  type: "Reshape"
  bottom: "ampl0"
  top: "Reshape0"
  reshape_param {
    shape {
      dim: 1
      dim: 1
      dim: 1
      dim: -1
    }
  }
}
Best Answer
Your "Reshape" layer forces the first dimension (batch_size) to be 1. Consequently, as soon as you change batch_size, your net breaks.
To avoid this, your "Reshape" layer needs to copy the first dimension from its bottom blob:
layer {
  name: "reshape"
  type: "Reshape"
  bottom: "input"
  top: "output"
  reshape_param {
    shape {
      dim: 0   # copy the dimension from below <-- !!
      dim: 1   # insert singleton dimension
      dim: 1
      dim: -1  # infer it from the other dimensions
    }
  }
}
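To see why `dim: 0` keeps the net batch-size-agnostic, here is a small plain-Python sketch (an illustration of the resolution rules, not Caffe code): `0` copies the corresponding bottom axis, and `-1` is inferred from the remaining element count.

```python
def resolve_reshape(bottom_shape, dims):
    """Resolve Reshape dims against a bottom blob shape.

    0 copies the corresponding bottom axis; -1 is inferred so that the
    total element count is preserved.
    """
    total = 1
    for d in bottom_shape:
        total *= d
    out, infer_at = [], None
    for i, d in enumerate(dims):
        if d == 0:            # copy axis i from the bottom blob
            out.append(bottom_shape[i])
        elif d == -1:         # placeholder: inferred below
            infer_at = i
            out.append(1)
        else:
            out.append(d)
    if infer_at is not None:
        known = 1
        for d in out:
            known *= d
        out[infer_at] = total // known
    return tuple(out)

# With batch_size 10, the bottom of Reshape0 (the InnerProduct output)
# has shape (10, 4096). The original hard-coded shape folds the batch
# into the last axis, while dim: 0 preserves it:
print(resolve_reshape((10, 4096), [1, 1, 1, -1]))  # -> (1, 1, 1, 40960)
print(resolve_reshape((10, 4096), [0, 1, 1, -1]))  # -> (10, 1, 1, 4096)
```

The first output's leading 1 is exactly what collides with the label blob's first dimension of 10 in the loss layer's check.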
I suppose
reshape_param { shape { dim: 1 dim: 1 } num_axes: 0 axis: 1 }
might also do the trick for you.
For more information and options of the "Reshape" parameters, see caffe.proto:
// axis and num_axes control the portion of the bottom blob's shape that are
// replaced by (included in) the reshape. By default (axis == 0 and
// num_axes == -1), the entire bottom blob shape is included in the reshape,
// and hence the shape field must specify the entire output shape.
//
// axis may be non-zero to retain some portion of the beginning of the input
// shape (and may be negative to index from the end; e.g., -1 to begin the
// reshape after the last axis, including nothing in the reshape,
// -2 to include only the last axis, etc.).
//
// For example, suppose "input" is a 2D blob with shape 2 x 8.
// Then the following ReshapeLayer specifications are all equivalent,
// producing a blob "output" with shape 2 x 2 x 4:
//
// reshape_param { shape { dim: 2 dim: 2 dim: 4 } }
// reshape_param { shape { dim: 2 dim: 4 } axis: 1 }
// reshape_param { shape { dim: 2 dim: 4 } axis: -3 }
//
// num_axes specifies the extent of the reshape.
// If num_axes >= 0 (and axis >= 0), the reshape will be performed only on
// input axes in the range [axis, axis+num_axes].
// num_axes may also be -1, the default, to include all remaining axes
// (starting from axis).
//
// For example, suppose "input" is a 2D blob with shape 2 x 8.
// Then the following ReshapeLayer specifications are equivalent,
// producing a blob "output" with shape 1 x 2 x 8.
//
// reshape_param { shape { dim: 1 dim: 2 dim: 8 } }
// reshape_param { shape { dim: 1 dim: 2 } num_axes: 1 }
// reshape_param { shape { dim: 1 } num_axes: 0 }
//
// On the other hand, these would produce output blob shape 2 x 1 x 8:
//
// reshape_param { shape { dim: 2 dim: 1 dim: 8 } }
// reshape_param { shape { dim: 1 } axis: 1 num_axes: 0 }
//
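As a rough illustration of the axis/num_axes semantics described above (a plain-Python approximation, not Caffe's actual implementation), the replaced portion of the bottom shape can be modeled like this:

```python
def reshape_portion(bottom_shape, dims, axis=0, num_axes=-1):
    """Replace bottom axes [start, end) by `dims`, Caffe-Reshape style.

    A negative `axis` counts from the end (-1 starts after the last axis).
    Within `dims`, 0 copies the corresponding replaced axis and -1 is
    inferred from the remaining element count.
    """
    n = len(bottom_shape)
    start = axis if axis >= 0 else n + axis + 1
    end = n if num_axes == -1 else start + num_axes
    head, body, tail = bottom_shape[:start], bottom_shape[start:end], bottom_shape[end:]
    total = 1
    for d in body:
        total *= d
    out, infer_at = [], None
    for i, d in enumerate(dims):
        if d == 0:            # copy axis from the replaced portion
            out.append(body[i])
        elif d == -1:         # inferred below
            infer_at = i
            out.append(1)
        else:
            out.append(d)
    if infer_at is not None:
        known = 1
        for d in out:
            known *= d
        out[infer_at] = total // known
    return tuple(head) + tuple(out) + tuple(tail)

# The num_axes examples from caffe.proto, for a 2 x 8 "input":
print(reshape_portion((2, 8), [1, 2, 8]))                       # (1, 2, 8)
print(reshape_portion((2, 8), [1, 2], num_axes=1))              # (1, 2, 8)
print(reshape_portion((2, 8), [1], num_axes=0))                 # (1, 2, 8)
print(reshape_portion((2, 8), [1], axis=1, num_axes=0))         # (2, 1, 8)
# The alternative suggested above, applied to a (10, 4096) bottom:
print(reshape_portion((10, 4096), [1, 1], axis=1, num_axes=0))  # (10, 1, 1, 4096)
```

The last line shows why the `num_axes: 0 axis: 1` variant also works: it inserts two singleton axes after axis 0 without ever touching the batch dimension.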
Regarding "neural-network - Caffe loss error: Check failed: The data and label should have the same first dimension", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/49859794/