
First programs with the new TensorFlow 2 (personally tested, works in a notebook) — and NaN values appearing while training a network in TensorFlow

import tensorflow as tf  # import the module
tf.compat.v1.disable_eager_execution()  # use graph (v1-style) execution
x = tf.constant([[1.0, 2.0]])    # a rank-2 tensor equal to [[1.0, 2.0]]
w = tf.constant([[3.0], [4.0]])  # a rank-2 tensor equal to [[3.0], [4.0]]
y = tf.matmul(x, w)  # the matrix product x·w
print(y)  # prints the symbolic tensor, not its value
# Run the graph in a session to compute the actual value
with tf.compat.v1.Session() as sess:
    print(sess.run(y))

Output:

Tensor("MatMul_1:0", shape=(1, 1), dtype=float32)
[[11.]]
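In TensorFlow 2's default eager mode, tf.matmul(x, w) would return this value directly, with no session needed. The arithmetic itself is easy to sanity-check with plain NumPy (a minimal sketch, independent of TensorFlow):

```python
import numpy as np

x = np.array([[1.0, 2.0]])    # shape (1, 2)
w = np.array([[3.0], [4.0]])  # shape (2, 1)
y = x @ w                     # matrix product: 1*3 + 2*4 = 11
print(y)  # [[11.]]
```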
Additionally, I can no longer find where the following snippet originally came from; it was copied verbatim and simply run to see what happens.

A common example:

import numpy as np
import tensorflow.compat.v1 as tf  # use the v1 compat API under TF2

tf.disable_v2_behavior()

houses = 100   # number of samples
features = 2   # area and room count

x_data = np.zeros([houses, features])
for house in range(houses):
    x_data[house, 0] = np.round(np.random.uniform(50., 150.))  # area
    x_data[house, 1] = np.round(np.random.uniform(3., 7.))     # rooms
weights = np.array([[2.], [3.]])
y_data = np.dot(x_data, weights)   # dot product: the "true" prices

x_data_ = tf.placeholder(tf.float32, [None, features])
y_data_ = tf.placeholder(tf.float32, [None, 1])
weights_ = tf.Variable(np.ones([features, 1]), dtype=tf.float32)
y_model = tf.matmul(x_data_, weights_)
 
loss = tf.reduce_mean(tf.pow((y_model - y_data_), 2))
train_op = tf.train.AdamOptimizer(0.01).minimize(loss)
 
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
sess = tf.Session(config=config)
# sess = tf.Session()
init = tf.global_variables_initializer()
sess.run(init)
 
for _ in range(100):
    for (x, y) in zip(x_data, y_data):
        z1 = x.reshape(1, 2)
        z2 = y.reshape(1, 1)
        sess.run(train_op, feed_dict={x_data_: z1, y_data_: z2})
    print("--------------")
    print(sess.run(weights_))

The output is as follows (the weights after each epoch):

[[1.7226804]
 [1.7358012]]
--------------
[[1.995762 ]
 [2.0266747]]
--------------
[[2.0405617]
 [2.0848765]]
--------------
[[2.0428774]
 [2.101384 ]]
--------------
[[2.042211 ]
 [2.1168568]]
--------------
[[2.0414085]
 [2.1340635]]
--------------
[[2.0405357]
 [2.1529143]]
--------------
[[2.0395954]
 [2.1732705]]
--------------
[[2.0385902]
 [2.1950126]]
--------------
[[2.0375223]
 [2.2180226]]
--------------
... (later epochs omitted; the weights keep converging steadily) ...
--------------
[[2.       ]
 [2.9999995]]
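Since the labels were generated without noise, the learned weights should approach the true values [[2.], [3.]]. A quick sanity check, independent of the training loop, is the closed-form least-squares solution (a NumPy sketch on freshly generated data of the same form; not part of the original post):

```python
import numpy as np

rng = np.random.default_rng(0)
x_data = np.column_stack([np.round(rng.uniform(50., 150., 100)),  # area
                          np.round(rng.uniform(3., 7., 100))])    # rooms
y_data = x_data @ np.array([[2.], [3.]])  # noiseless labels

# Closed-form least squares recovers the generating weights exactly.
w, *_ = np.linalg.lstsq(x_data, y_data, rcond=None)
print(w)  # ≈ [[2.], [3.]]
```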


Partial reference
Problem encountered:
AttributeError: module 'tensorflow' has no attribute 'Session'

Cause:
TensorFlow 2 removed tf.Session() from the top-level API (eager execution is now the default; v1-style sessions live under tf.compat.v1).

Fix:
change tf.Session() to tf.compat.v1.Session()
————————————————
Copyright notice: this is an original article by CSDN blogger "李亮_sdu", released under the CC 4.0 BY-SA license; when reposting, please include the original source link and this notice.
Original link: https://blog.csdn.net/qq_40575024/article/details/105862550

NaN values appearing while training a network in TensorFlow:

I ran into NaN weights while using multiple linear regression for the house-price prediction problem. I first used the tf.train.GradientDescentOptimizer algorithm, and the weights became NaN during training; after switching to tf.train.AdamOptimizer the problem went away. Exactly why still deserves a closer look, but a plausible explanation is that the unnormalized features (areas around 50–150) produce very large gradients, so plain gradient descent with a fixed 0.01 step overshoots and diverges, while Adam's per-parameter adaptive step sizes keep the updates bounded; scaling the features or lowering the learning rate also avoids the NaNs.
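The divergence is easy to reproduce without TensorFlow. Below is a NumPy sketch of the same per-sample gradient-descent update (the names and the small-step comparison are mine, not from the original post): with features on the scale of 50–150, the squared-error gradients reach the order of 10^4, so a fixed step of 0.01 overshoots and the weights blow up to inf/NaN, while a much smaller step stays finite:

```python
import numpy as np

rng = np.random.default_rng(0)
x_data = np.column_stack([rng.uniform(50., 150., 100),  # area
                          rng.uniform(3., 7., 100)])    # rooms
y_data = x_data @ np.array([[2.], [3.]])

def sgd(lr, epochs=20):
    """Per-sample gradient descent on the squared error."""
    w = np.ones((2, 1))
    with np.errstate(all="ignore"):  # silence overflow warnings
        for _ in range(epochs):
            for xi, yi in zip(x_data, y_data):
                xi = xi.reshape(1, 2)
                grad = 2.0 * xi.T @ (xi @ w - yi.reshape(1, 1))
                w = w - lr * grad
    return w

w_big = sgd(lr=0.01)    # same step size as the post: diverges to inf/NaN
w_small = sgd(lr=1e-6)  # tiny step size: stays finite
print(np.isfinite(w_big).all(), np.isfinite(w_small).all())  # False True
```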

Case analysis: see the figure below.
————————————————
Copyright notice: this is an original article by CSDN blogger "The_Thinker_QChen", released under the CC 4.0 BY-SA license; when reposting, please include the original source link and this notice.
Original link: https://blog.csdn.net/qq_33254870/article/details/81428441


Reposted from: https://blog.csdn.net/dongbao520/article/details/117479851