Problem Description
Would you please guide me how to interpret the following results?
1) loss < validation_loss
2) loss > validation_loss
It seems that the training loss should always be less than the validation loss. But both of these cases happen when training a model.
This is really a fundamental question in machine learning.
If validation loss >> training loss, you can call it overfitting.
If validation loss > training loss, you can call it some overfitting.
If validation loss < training loss, you can call it some underfitting.
If validation loss << training loss, you can call it underfitting.
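As a concrete illustration, these rules of thumb can be written as a small helper. This is only a sketch: the big_gap threshold below is a made-up value that you would tune to your own loss scale; the original answer gives no numbers.

def diagnose(train_loss, val_loss, big_gap=0.5):
    """Label the relationship between training and validation loss.

    big_gap is a hypothetical threshold for what counts as a
    'much larger' (>>) difference; adjust it to your loss scale.
    """
    gap = val_loss - train_loss
    if gap > big_gap:
        return "overfitting (validation loss >> training loss)"
    if gap > 0:
        return "some overfitting (validation loss > training loss)"
    if gap > -big_gap:
        return "some underfitting (validation loss < training loss)"
    return "underfitting (validation loss << training loss)"

print(diagnose(0.20, 0.90))  # overfitting
print(diagnose(0.20, 0.25))  # some overfitting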
Your aim is to make the validation loss as low as possible. Some overfitting is nearly always a good thing; all that matters in the end is whether the validation loss is as low as you can get it. This often occurs when the training loss is quite a bit lower.
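One common way to act on this is to track the validation loss each epoch, keep the weights that achieve the lowest value, and stop when it no longer improves (early stopping). The following is a minimal, self-contained PyTorch sketch; the toy random data, model, and hyperparameters are arbitrary assumptions for illustration, not part of the original answer.

import copy
import torch
import torch.nn as nn

# Toy regression data, purely illustrative.
torch.manual_seed(0)
x_train, y_train = torch.randn(200, 8), torch.randn(200, 1)
x_val, y_val = torch.randn(50, 8), torch.randn(50, 1)

model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

best_val_loss = float("inf")
best_state = None
patience, bad_epochs = 5, 0  # stop after 5 epochs with no improvement

for epoch in range(100):
    # One full-batch training step per "epoch" on the toy data.
    model.train()
    optimizer.zero_grad()
    train_loss = loss_fn(model(x_train), y_train)
    train_loss.backward()
    optimizer.step()

    # The validation loss decides which weights we keep.
    model.eval()
    with torch.no_grad():
        val_loss = loss_fn(model(x_val), y_val).item()

    if val_loss < best_val_loss:
        best_val_loss = val_loss
        best_state = copy.deepcopy(model.state_dict())  # snapshot best weights
        bad_epochs = 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:  # early stopping: val loss stopped improving
            break

model.load_state_dict(best_state)  # restore the lowest-validation-loss model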
Also check how to prevent overfitting.
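Besides the early stopping shown above, common preventions include regularization such as dropout and weight decay. A minimal PyTorch-style sketch follows; the layer sizes and hyperparameter values are arbitrary examples, not recommendations from the original answer.

import torch
import torch.nn as nn

# Dropout randomly zeroes activations during training,
# discouraging co-adaptation of units.
model = nn.Sequential(
    nn.Linear(64, 128),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(128, 10),
)

# Weight decay (L2 regularization) penalizes large weights.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)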
This concludes this article on training loss and validation loss in deep learning. We hope the answer above is helpful.