tf.set_random_seed(): random-number generation seeds in TensorFlow
Copyright notice: this is an original article by the author; reproduction without permission is prohibited. https://blog.csdn.net/qq_31878983/article/details/79495810
In TensorFlow, random-number generation seeds operate at the level of graph resources. Within a single graph, seeds can be applied to individual operations: every function in the `random` family accepts a `seed` argument in its parameter list, and setting it there is the op-level mechanism. Its counterpart is the graph-level `tf.set_random_seed()`, which governs all random operations under the same graph.
Let's walk through a short example.
```python
import tensorflow as tf

# Repeatedly running this block with the same graph will generate the same
# sequences of 'a' and 'b'.
g1 = tf.Graph()
g2 = tf.Graph()

print("Graph 1")
with g1.as_default():
    tf.set_random_seed(-1)
    a = tf.random_uniform([1])
    b = tf.random_normal([1])

    print("Session 1")
    with tf.Session() as sess1:
        print(sess1.run(a))  # generates 'A1'
        print(sess1.run(a))  # generates 'A2'
        print(sess1.run(b))  # generates 'B1'
        print(sess1.run(b))  # generates 'B2'

    print("Session 2")
    with tf.Session() as sess2:
        print(sess2.run(a))  # generates 'A1'
        print(sess2.run(a))  # generates 'A2'
        print(sess2.run(b))  # generates 'B1'
        print(sess2.run(b))  # generates 'B2'

print("--------------")
print("Graph 2")
with g2.as_default():
    a = tf.random_uniform([1])
    b = tf.random_normal([1], seed=-1)

    print("Session 3")
    with tf.Session() as sess3:
        print(sess3.run(a))  # generates 'A1'
        print(sess3.run(a))  # generates 'A2'
        print(sess3.run(b))  # generates 'B1'
        print(sess3.run(b))  # generates 'B2'

    print("Session 4")
    with tf.Session() as sess4:
        print(sess4.run(a))  # generates 'A3'
        print(sess4.run(a))  # generates 'A4'
        print(sess4.run(b))  # generates 'B1'
        print(sess4.run(b))  # generates 'B2'
```
```
# Program output
Graph 1
Session 1
[0.45231807]
[0.82921326]
[-0.90662855]
[0.52898115]
Session 2
[0.45231807]
[0.82921326]
[-0.90662855]
[0.52898115]
--------------
Graph 2
Session 3
[0.18341184]
[0.42214954]
[-0.96254766]
[-1.088825]
Session 4
[0.40388882]
[0.7478839]
[-0.96254766]
[-1.088825]
```
In Graph 1, `tf.set_random_seed()` sets the global seed for all random operations under that graph's resources, so the `random`-family functions behave consistently across different Sessions; this is the graph-level behavior.
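The graph-level effect can be mimicked with plain Python, without TensorFlow: think of each Session as a fresh generator initialized from the same graph seed, so every Session replays the same sequence from the start. This is a conceptual analogy using the stdlib `random` module, not TensorFlow's actual RNG; the helper name `make_session_rng` is invented for illustration.

```python
import random

def make_session_rng(graph_seed):
    # Analogy: each new "Session" gets a fresh generator derived
    # from the graph-level seed, so sequences restart identically.
    return random.Random(graph_seed)

sess1 = make_session_rng(-1)
sess2 = make_session_rng(-1)

seq1 = [sess1.random() for _ in range(4)]
seq2 = [sess2.random() for _ in range(4)]

# Both "sessions" reproduce the same four values, mirroring how
# Session 1 and Session 2 above both print A1, A2, B1, B2.
```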
In Graph 2, we set a seed only on tensor `b`. This is the op-level behavior: only when running tensor `b` do we see the same cross-Session consistency as in Graph 1, while `a` varies from Session to Session. Note also that `seed` takes an integer, so any integer works (overly large values are truncated in the source code), e.g. 1234 or 87654321; the effect is the same as the -1 used above.
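The op-level case can be sketched the same way: only the op that carries its own seed is pinned down, and any integer seed yields the same deterministic effect. Again a pure-Python analogy, assuming one generator per op; `run_session` is a hypothetical helper, not a TensorFlow API.

```python
import random

def run_session(op_seed):
    # Analogy for one Session in Graph 2:
    a_rng = random.Random()         # unseeded op 'a': differs per "session"
    b_rng = random.Random(op_seed)  # op-level seeded 'b': identical per "session"
    a_vals = [a_rng.random() for _ in range(2)]
    b_vals = [b_rng.random() for _ in range(2)]
    return a_vals, b_vals

# Any integer seed (1234 here instead of -1) gives the same determinism.
a3, b3 = run_session(1234)
a4, b4 = run_session(1234)

# 'b' repeats across "sessions" (like B1, B2 in Sessions 3 and 4),
# while 'a' generally does not.
```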