This article describes how to avoid an exponential growth of const reference and rvalue reference constructor overloads, which should be a useful reference for anyone facing the same problem.

Problem Description

I am writing some templated classes for a machine learning library, and I keep running into this issue. I mostly use the policy pattern, where classes receive policies for different pieces of functionality as template arguments, for example:

template <class Loss, class Optimizer> class LinearClassifier { ... };

The problem is with the constructors. As the number of policies (template parameters) grows, the number of combinations of const references and rvalue references grows exponentially. In the previous example:

LinearClassifier(const Loss& loss, const Optimizer& optimizer) : _loss(loss), _optimizer(optimizer) {}

LinearClassifier(Loss&& loss, const Optimizer& optimizer) : _loss(std::move(loss)), _optimizer(optimizer) {}

LinearClassifier(const Loss& loss, Optimizer&& optimizer) : _loss(loss), _optimizer(std::move(optimizer)) {}

LinearClassifier(Loss&& loss, Optimizer&& optimizer) : _loss(std::move(loss)), _optimizer(std::move(optimizer)) {}

Is there some way to avoid this?

Recommended Answer

Actually, this is the precise reason why perfect forwarding was introduced. Rewrite the constructor as:

template <typename L, typename O>
LinearClassifier(L && loss, O && optimizer)
    : _loss(std::forward<L>(loss))
    , _optimizer(std::forward<O>(optimizer))
{}
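
For context, here is a minimal, self-contained sketch of how that forwarding constructor behaves; the SquaredLoss and SGD policy types are placeholders invented for this example, not part of the original library. When the caller passes an lvalue, the template parameter is deduced as an lvalue reference and std::forward copies; when the caller passes an rvalue, the plain type is deduced and std::forward moves.

#include <utility>

// Placeholder policy types, invented only to make the sketch compile.
struct SquaredLoss {};
struct SGD {};

template <class Loss, class Optimizer>
class LinearClassifier {
public:
    // One forwarding constructor replaces all four overloads above:
    // lvalue arguments are copied, rvalue arguments are moved.
    template <typename L, typename O>
    LinearClassifier(L&& loss, O&& optimizer)
        : _loss(std::forward<L>(loss))
        , _optimizer(std::forward<O>(optimizer))
    {}

private:
    Loss _loss;
    Optimizer _optimizer;
};

int main() {
    SquaredLoss loss;
    LinearClassifier<SquaredLoss, SGD> clf(loss, SGD{});  // loss is copied, SGD{} is moved
}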

But it will probably be much simpler to do what Ilya Popov suggests in his answer. To be honest, I usually do it that way myself, since moves are meant to be cheap and one extra move does not change things dramatically.
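
Ilya Popov's answer is not reproduced in this article, but the remark about one extra cheap move points at the familiar pass-by-value-and-move idiom. A minimal sketch of that approach (the class layout here is an assumption mirroring the question):

#include <utility>

template <class Loss, class Optimizer>
class LinearClassifier {
public:
    // Take each policy by value and move it into the member.
    // An lvalue argument costs one copy plus one move; an rvalue
    // argument costs two moves, which is usually negligible.
    LinearClassifier(Loss loss, Optimizer optimizer)
        : _loss(std::move(loss))
        , _optimizer(std::move(optimizer))
    {}

private:
    Loss _loss;
    Optimizer _optimizer;
};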

As Howard Hinnant has pointed out, my method can be SFINAE-unfriendly, since LinearClassifier now accepts any pair of types in its constructor. Barry's answer shows how to deal with that.
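
Barry's answer is likewise only linked from the original thread. One common way to make such a forwarding constructor SFINAE-friendly is to constrain it with std::enable_if and std::is_constructible, so that it drops out of overload resolution when the argument types cannot actually construct the policies; treat the following as an illustrative sketch rather than the answer itself:

#include <type_traits>
#include <utility>

template <class Loss, class Optimizer>
class LinearClassifier {
public:
    // The extra defaulted template parameter removes this constructor from
    // overload resolution unless Loss and Optimizer are constructible from
    // the deduced argument types, so type traits queried against
    // LinearClassifier report accurate results.
    template <typename L, typename O,
              typename = std::enable_if_t<
                  std::is_constructible<Loss, L&&>::value &&
                  std::is_constructible<Optimizer, O&&>::value>>
    LinearClassifier(L&& loss, O&& optimizer)
        : _loss(std::forward<L>(loss))
        , _optimizer(std::forward<O>(optimizer))
    {}

private:
    Loss _loss;
    Optimizer _optimizer;
};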

This concludes this article on avoiding the exponential growth of const references and rvalue references in constructors; we hope the recommended answer above is helpful.
