TensorFlow batch normalization

青旅半醒 2022-06-13 00:29

Since TensorFlow 1.0 (February 2017), the high-level API `tf.layers.batch_normalization` has been available.

It is simple and convenient to use:

```python
# Set this to True for training and False for testing
training = tf.placeholder(tf.bool)

x = tf.layers.dense(input_x, units=100)
x = tf.layers.batch_normalization(x, training=training)
x = tf.nn.relu(x)
```

The catch is that this layer adds extra operations to the graph (to update its moving mean and variance variables), and these do not automatically become dependencies of your training op. You must run them explicitly alongside the training op:

```python
extra_update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
sess.run([train_op, extra_update_ops], ...)
```

Or manually add the update ops as dependencies of your training op, then run the training loop as usual:

```python
extra_update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(extra_update_ops):
    train_op = optimizer.minimize(loss)
...
sess.run([train_op], ...)
```
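To make clear what those update ops actually maintain, here is a minimal NumPy sketch (not TensorFlow code) of batch normalization's train-time versus test-time behavior; the function name, momentum value, and epsilon are illustrative assumptions, not part of the TF API:

```python
import numpy as np

def batch_norm(x, gamma, beta, moving_mean, moving_var,
               training, momentum=0.99, eps=1e-3):
    """Minimal batch-norm sketch: per-feature normalization over the batch axis.

    At train time, normalize with the current batch statistics and update the
    moving averages; at test time, use the moving averages instead.
    """
    if training:
        mean = x.mean(axis=0)
        var = x.var(axis=0)
        # The ops collected in tf.GraphKeys.UPDATE_OPS perform essentially
        # this exponential-moving-average update of the layer's variables.
        moving_mean = momentum * moving_mean + (1 - momentum) * mean
        moving_var = momentum * moving_var + (1 - momentum) * var
    else:
        mean, var = moving_mean, moving_var
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta, moving_mean, moving_var
```

If the update ops never run, `moving_mean` and `moving_var` stay at their initial values, which is why a model trained this way can look fine during training yet produce garbage at test time.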
