I am new to Keras and I am trying to get the layer weights in Keras. I know how to do this with TensorFlow in Python.

Code:

import numpy as np
import tensorflow as tf

# attributes and labels are defined elsewhere in the post (not shown here)
data = np.array(attributes, 'int64')
target = np.array(labels, 'int64')

feature_columns = [tf.contrib.layers.real_valued_column("", dimension=2, dtype=tf.float32)]
learningRate = 0.1
epoch = 10000

# https://www.tensorflow.org/api_docs/python/tf/metrics
validation_metrics = {
    "accuracy": tf.contrib.learn.MetricSpec(
        metric_fn=tf.contrib.metrics.streaming_accuracy,
        prediction_key=tf.contrib.learn.PredictionKey.CLASSES),
    "precision": tf.contrib.learn.MetricSpec(
        metric_fn=tf.contrib.metrics.streaming_precision,
        prediction_key=tf.contrib.learn.PredictionKey.CLASSES),
    "recall": tf.contrib.learn.MetricSpec(
        metric_fn=tf.contrib.metrics.streaming_recall,
        prediction_key=tf.contrib.learn.PredictionKey.CLASSES),
    "mean_absolute_error": tf.contrib.learn.MetricSpec(
        metric_fn=tf.contrib.metrics.streaming_mean_absolute_error,
        prediction_key=tf.contrib.learn.PredictionKey.CLASSES),
    "false_negatives": tf.contrib.learn.MetricSpec(
        metric_fn=tf.contrib.metrics.streaming_false_negatives,
        prediction_key=tf.contrib.learn.PredictionKey.CLASSES),
    "false_positives": tf.contrib.learn.MetricSpec(
        metric_fn=tf.contrib.metrics.streaming_false_positives,
        prediction_key=tf.contrib.learn.PredictionKey.CLASSES),
    "true_positives": tf.contrib.learn.MetricSpec(
        metric_fn=tf.contrib.metrics.streaming_true_positives,
        prediction_key=tf.contrib.learn.PredictionKey.CLASSES)
}

# validation monitor
validation_monitor = tf.contrib.learn.monitors.ValidationMonitor(
    data, target, every_n_steps=500,
    metrics=validation_metrics)

classifier = tf.contrib.learn.DNNClassifier(
    feature_columns=feature_columns,
    hidden_units=[3],
    activation_fn=tf.nn.sigmoid,
    optimizer=tf.train.GradientDescentOptimizer(learningRate),
    model_dir="model",
    config=tf.contrib.learn.RunConfig(save_checkpoints_secs=1)
)

classifier.fit(data, target, steps=epoch,
               monitors=[validation_monitor])

# print('Params:', classifier.get_variable_names())
'''
Params: ['dnn/binary_logistic_head/dnn/learning_rate', 'dnn/hiddenlayer_0/biases', 'dnn/hiddenlayer_0/weights', 'dnn/logits/biases', 'dnn/logits/weights', 'global_step']
'''

print('total steps:', classifier.get_variable_value("global_step"))
print('weight from input layer to hidden layer: ', classifier.get_variable_value("dnn/hiddenlayer_0/weights"))
print('weight from hidden layer to output layer: ', classifier.get_variable_value("dnn/logits/weights"))


Is there a way to get the weights in Keras, just like in TensorFlow:


the weights from the input layer to the hidden layer
the weights from the hidden layer to the output layer


This is my model in Keras:

from keras.models import Sequential
from keras.layers import Flatten, Dense, Dropout

model = Sequential()
model.add(Flatten(input_shape=(224, 224, 3)))
model.add(Dense(256, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))

Best Answer

You can access and set the weights (i.e. parameters) of a model's layers with the get_weights and set_weights methods. From the Keras documentation:


  layer.get_weights(): returns the weights of the layer as a list of
  Numpy arrays.
  layer.set_weights(weights): sets the weights of the layer from a list
  of Numpy arrays (with the same shapes as the output of get_weights).
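
As a minimal sketch (not part of the original answer) of how these two methods pair up, the following round-trips the parameters of a single Dense layer; the layer size and input shape here are arbitrary, chosen only for illustration:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(4, activation='relu', input_shape=(2,)))

# get_weights() returns [kernel, bias] as numpy arrays
kernel, bias = model.layers[0].get_weights()
print(kernel.shape, bias.shape)   # (2, 4) (4,)

# set_weights() expects a list with the same shapes as get_weights() returned
model.layers[0].set_weights([kernel, bias])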


Every Keras model also has a layers attribute, which is a list of all the layers in the model. For example, in the sample model you provided, you can get the weights of the first Dense layer by running:

model.layers[1].get_weights()


It will return a list of two numpy arrays: the first one is the kernel parameters of the Dense layer, and the second array is the bias parameters.
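
To map this back onto the two weight matrices you asked about, a sketch along these lines should work; the layer indices assume exactly the layer order in your model (Flatten is index 0, the Dense(256) hidden layer is index 1, Dropout has no weights, and the Dense(1) output layer is index 3):

# weights from the (flattened) input layer to the hidden layer
hidden_kernel, hidden_bias = model.layers[1].get_weights()
print(hidden_kernel.shape)   # (150528, 256), i.e. 224*224*3 inputs -> 256 hidden units
print(hidden_bias.shape)     # (256,)

# weights from the hidden layer to the output layer
output_kernel, output_bias = model.layers[3].get_weights()
print(output_kernel.shape)   # (256, 1)
print(output_bias.shape)     # (1,)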
