Save model weights every N batches


Problem description

I'm new to Python and Keras, and I have successfully built a neural network that saves a weights file after every epoch. However, I want more granularity (I'm visualizing layer weight distributions over time) and would like to save the weights after every N batches, rather than after every epoch.

Does anyone have any suggestions?

Recommended answer

You can create your own callback (https://keras.io/callbacks/). Something like:

from keras.callbacks import Callback

class WeightsSaver(Callback):
    def __init__(self, N):
        super().__init__()
        self.N = N      # save every N batches
        self.batch = 0  # cumulative batch counter across all epochs

    def on_batch_end(self, batch, logs=None):
        if self.batch % self.N == 0:
            # e.g. weights00000000.h5, weights00000005.h5, ...
            name = 'weights%08d.h5' % self.batch
            self.model.save_weights(name)
        self.batch += 1

I use self.batch instead of the batch argument provided, because the latter restarts at 0 at each epoch.
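To make the difference concrete, here is a small self-contained sketch (plain Python, no Keras required) comparing the two counting strategies, assuming 3 batches per epoch and N = 2:

```python
N = 2  # save every N batches

# The `batch` argument Keras passes to on_batch_end restarts at 0 each
# epoch, so `batch % N == 0` fires at the same positions in every epoch.
per_epoch_triggers = []
cumulative_triggers = []

cumulative = 0  # mimics self.batch in the callback above
for epoch in range(2):
    for batch in range(3):  # 3 batches per epoch
        if batch % N == 0:
            per_epoch_triggers.append((epoch, batch))
        if cumulative % N == 0:
            cumulative_triggers.append((epoch, batch))
        cumulative += 1

# The per-epoch counter repeats its pattern each epoch, while the
# cumulative counter keeps saves evenly spaced across epoch boundaries.
print(per_epoch_triggers)   # [(0, 0), (0, 2), (1, 0), (1, 2)]
print(cumulative_triggers)  # [(0, 0), (0, 2), (1, 1)]
```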

Then add it to your fit call. For example, to save weights every 5 batches:

model.fit(X_train, Y_train, callbacks=[WeightsSaver(5)])
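As a side note, depending on your Keras/TensorFlow version, the built-in `ModelCheckpoint` callback may cover this case directly: passing an integer to its `save_freq` parameter saves at the end of that many batches, without a custom callback. A minimal sketch (the filepath pattern is illustrative; `save_freq` as an integer counts batches, not epochs):

```python
from keras.callbacks import ModelCheckpoint

# Save weights only, every 5 batches. Note that the filepath template
# only supports {epoch} and logged metrics, so files saved within the
# same epoch will overwrite each other unless you vary the name.
checkpoint = ModelCheckpoint(
    'weights_epoch{epoch:02d}.h5',
    save_weights_only=True,
    save_freq=5,
)
# model.fit(X_train, Y_train, callbacks=[checkpoint])
```

The custom callback above remains useful if you want one distinct file per save point, since it can put the cumulative batch count in the filename.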
