
Adding a regularizer in skflow

I recently switched from tensorflow to skflow. In tensorflow we added lambda * tf.nn.l2_loss(weights) to our loss. Now I have the following code in skflow:

def deep_psi(X, y):
    layers = skflow.ops.dnn(X, [5, 10, 20, 10, 5], keep_prob=0.5)
    preds, loss = skflow.models.logistic_regression(layers, y)
    return preds, loss

def exp_decay(global_step):
    return tf.train.exponential_decay(learning_rate=0.01,
                                      global_step=global_step,
                                      decay_steps=1000,
                                      decay_rate=0.005)

deep_cd = skflow.TensorFlowEstimator(model_fn=deep_psi,
                                    n_classes=2,
                                    steps=10000,
                                    batch_size=10,
                                    learning_rate=exp_decay,
                                    verbose=True,)
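For reference, `tf.train.exponential_decay` computes `learning_rate * decay_rate ** (global_step / decay_steps)` (without staircasing); a minimal pure-Python sketch of the schedule configured above, just to see the numbers it produces:

```python
def exp_decay_value(lr, step, decay_steps, decay_rate):
    """Decayed rate: lr * decay_rate ** (step / decay_steps)."""
    return lr * decay_rate ** (step / decay_steps)

# The schedule above: learning_rate=0.01, decay_steps=1000, decay_rate=0.005.
# At step 0 the rate is still 0.01; by step 1000 it has shrunk by a full
# factor of 0.005, i.e. to 0.00005 -- a very aggressive decay_rate.
for step in (0, 500, 1000):
    print(step, exp_decay_value(0.01, step, 1000, 0.005))
```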

How and where do I add the regularizer here? Illia hinted at it here, but I can't work it out.

Solution:

You can still add extra components to the loss; you just need to retrieve the weights from dnn / logistic_regression and add them to the loss:

def regularize_loss(loss, weights, lambda_):
    # `lambda` is a reserved word in Python, hence the name `lambda_`.
    for weight in weights:
        loss = loss + lambda_ * tf.nn.l2_loss(weight)
    return loss


def deep_psi(X, y):
    layers = skflow.ops.dnn(X, [5, 10, 20, 10, 5], keep_prob=0.5)
    preds, loss = skflow.models.logistic_regression(layers, y)

    weights = []
    for layer in range(5):  # the number of layers you passed to dnn
        weights.append(tf.get_variable("dnn/layer%d/linear/Matrix" % layer))
        # biases are also available at dnn/layer%d/linear/Bias
    weights.append(tf.get_variable('logistic_regression/weights'))

    lambda_ = 0.01  # regularization strength; tune this for your problem
    return preds, regularize_loss(loss, weights, lambda_)


Note that the paths to the variables can be found here.
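As a sanity check on what the regularizer contributes: `tf.nn.l2_loss(w)` is defined as `sum(w ** 2) / 2`, so each weight matrix adds `lambda * sum(w ** 2) / 2` to the loss. A minimal NumPy sketch of the same arithmetic (the weight values here are made up purely for illustration):

```python
import numpy as np

def l2_loss(w):
    """Same definition as tf.nn.l2_loss: half the sum of squared entries."""
    return np.sum(np.square(w)) / 2.0

def regularized(loss, weights, lam):
    """Add lam * l2_loss(w) for each weight, as regularize_loss does above.
    (`lambda` is a Python keyword, hence the name `lam`.)"""
    for w in weights:
        loss = loss + lam * l2_loss(w)
    return loss

# Made-up weights, just to check the arithmetic:
weights = [np.array([[1.0, 2.0], [3.0, 4.0]]),  # l2_loss = (1+4+9+16)/2 = 15.0
           np.array([2.0, 2.0])]                # l2_loss = (4+4)/2 = 4.0
print(regularized(0.5, weights, lam=0.01))      # 0.5 + 0.01 * 19.0 = 0.69
```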

Also, we want to add regularizer support to every layer that has variables (e.g. dnn, conv2d, or fully_connected), so a nightly TensorFlow build as early as next week may support something like dnn(..., regularize=tf.contrib.layers.l2_regularizer(lambda)). I will update this answer when that happens.
