AI is becoming ever more democratized, and AI programming will sooner or later be an essential skill for programmers, so there is no reason to hesitate. Google's TensorFlow is arguably the most promising framework today; how hard is it to learn? Let's find out. This post introduces TensorFlow's basic concepts.
1. Basic elements
1.1 constant
The signature of tf.constant is tf.constant(value, dtype=None, shape=None, name='Const', verify_shape=False); the value can be a scalar, vector, matrix, and so on. Examples:
import tensorflow as tf


def const_literal():
    a = tf.constant(2, name='a')
    b = tf.constant(3, name='b')
    x = tf.add(a, b, name='add')
    with tf.Session() as sess:
        writer = tf.summary.FileWriter('./graphs', sess.graph)
        print(sess.run(x))
        writer.close()


def const_tensor():
    a = tf.constant([2, 2], name='a')
    b = tf.constant([[0, 1], [2, 3]], name='b')
    x = tf.add(a, b, name='add')
    y = tf.multiply(a, b, name='mul')  # element-wise multiply
    with tf.Session() as sess:
        x, y = sess.run([x, y])
        print('x:')
        print(x)
        print('y:')
        print(y)


def const_zeros():
    """tf.zeros and tf.ones have the same API"""
    a = tf.zeros([2, 3], tf.int32)
    b = tf.zeros_like(a, tf.float32)
    with tf.Session() as sess:
        print(sess.run(a))
        print(sess.run(b))


def const_fill(val):
    """fill the tensor with a value"""
    a = tf.fill([2, 3], val)
    with tf.Session() as sess:
        print(sess.run(a))


def const_linear(start, stop, num):
    """linearly spaced numbers in [start, stop]; only float32 and float64 are permitted"""
    a = tf.linspace(start, stop, num)
    b = tf.range(start, stop, 1.0)
    with tf.Session() as sess:
        print(sess.run(a))
        print(sess.run(b))


def const_random():
    """
    tf.random_normal(shape, mean=0.0, stddev=1.0, dtype=tf.float32, seed=None, name=None)
    tf.truncated_normal(shape, mean=0.0, stddev=1.0, dtype=tf.float32, seed=None, name=None)
    tf.random_uniform(shape, minval=0, maxval=None, dtype=tf.float32, seed=None, name=None)
    tf.random_shuffle(value, seed=None, name=None)
    tf.random_crop(value, size, seed=None, name=None)
    tf.multinomial(logits, num_samples, seed=None, name=None)
    tf.random_gamma(shape, alpha, beta=None, dtype=tf.float32, seed=None, name=None)
    """
    pass


def const_graph():
    """no need to run my_const explicitly -- it is already part of the compute graph"""
    my_const = tf.constant([1.0, 2.0], name='my_const')
    with tf.Session() as sess:
        print(sess.graph.as_graph_def())


if __name__ == '__main__':
    # const_linear(0.0, 99.0, 99)
    const_graph()
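The add and multiply in const_tensor broadcast the shape-[2] vector a across the rows of the 2x2 matrix b; TensorFlow follows NumPy-style broadcasting here, so the expected output can be sanity-checked with NumPy alone (a sketch independent of TensorFlow):

```python
import numpy as np

# Same values as const_tensor above; NumPy broadcasting mirrors TensorFlow's.
a = np.array([2, 2])
b = np.array([[0, 1], [2, 3]])

x = a + b  # the vector is broadcast across each row of the matrix
y = a * b  # element-wise multiply, not matrix multiplication

print(x)  # [[2 3] [4 5]]
print(y)  # [[0 2] [4 6]]
```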
TensorBoard
View the graph with TensorBoard.
Variable
A constant is an operation, defined when the graph is built; Variable is a class that represents a mutable variable. Constants live inside the graph definition itself, while Variables can live on a parameter server.
A Variable must be explicitly initialized before use; otherwise an uninitialized-variable error is raised.
You can also evaluate with eval(); note that only tensors have an eval() method (operations have run() instead), and Tensor.eval() is equivalent to tf.get_default_session().run(t).
Every Variable has an initializer; a Variable can only be eval()'d after it has been initialized or successfully assigned.
import tensorflow as tf


def test_eval():
    W = tf.constant(10)
    with tf.Session():
        print(W.eval())  # 10


def test_eval_Variable():
    W = tf.Variable(10)
    with tf.Session() as sess:
        print(sess.run(W.initializer))  # None -- running an op returns no value
        print(W.eval())  # 10


def test_eval_Variable_all():
    W = tf.Variable(10)
    with tf.Session():
        print(W.initializer.eval())  # error: 'Operation' object has no attribute 'eval'
        print(W.eval())


def initialize_properly():
    W = tf.Variable(10)
    with tf.Session() as sess:
        # this way
        tf.global_variables_initializer().run()
        print(W.eval())
        print(sess.run(W))


def run_multiple_times():
    W = tf.Variable(10)
    a_times_two = W.assign(2 * W)
    with tf.Session():
        tf.global_variables_initializer().run()
        print(W.eval())  # 10
        print(a_times_two.eval())  # 20
        print(a_times_two.eval())  # 40


if __name__ == '__main__':
    test_eval()
    test_eval_Variable()
    test_eval_Variable_all()
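The "use before initialization" error mentioned above is a FailedPreconditionError. A minimal sketch of triggering and then fixing it, written against the tf.compat.v1 shim so it also runs on TF 2.x (an assumption about your environment; on TF 1.x plain `import tensorflow as tf` works the same way):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

W = tf.Variable(10)
caught = False
with tf.Session() as sess:
    try:
        sess.run(W)  # W.initializer has not run yet
    except tf.errors.FailedPreconditionError:
        caught = True  # "Attempting to use uninitialized value ..."
    sess.run(W.initializer)
    value = sess.run(W)

print(caught, value)  # True 10
```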
Placeholders
In ordinary programming terms a placeholder and a Variable look similar, but in TensorFlow a placeholder represents input/output data (roughly the I/O of C/C++), while a Variable represents parameters that can be updated, iterated on, and stored during training, which is closer to a variable in the usual sense. The concrete differences:
- A Variable must be initialized with a Tensor; a placeholder needs no initialization and cannot be initialized
- A Variable's value can be updated during training
- Variables can be shared, and can be marked non-trainable
- A Variable's learned parameters can be saved to disk
- Creating a Variable automatically creates three ops: the variable op, the initializer op, and ops for the initial value
- Variable is a class; placeholder is a function
- In a distributed setting, Variables live on the parameter server and are shared across workers
- A Variable must be initialized before use and its shape is fixed afterwards; a placeholder must be fed data at run time.
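The differences above can be made concrete with a minimal placeholder sketch: the placeholder fixes only dtype and shape, and the actual data arrives through feed_dict at run time (written with the tf.compat.v1 shim, an assumption so the snippet also runs under TF 2.x):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# dtype and shape are declared up front; no value, no initializer.
a = tf.placeholder(tf.float32, shape=[3], name='a')
b = tf.constant([5.0, 5.0, 5.0], name='b')
c = a + b

with tf.Session() as sess:
    # Data is supplied per run; running c without feeding a raises an error.
    result = sess.run(c, feed_dict={a: [1.0, 2.0, 3.0]})

print(result)  # [6. 7. 8.]
```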
Session
import tensorflow as tf

x = tf.Variable(3, name='x')
y = tf.Variable(4, name='y')
f = x*x*y + y + 2

with tf.Session() as sess:
    x.initializer.run()
    y.initializer.run()
    # result = f.eval()
    # result = sess.run(f)
    result = tf.get_default_session().run(f)

tf.reset_default_graph()
print(result)
result = None
- This code shows three ways to evaluate f: f.eval(), sess.run(f), and tf.get_default_session().run(f)
- InteractiveSession creates a Session and installs it as the default session, so no with block is needed
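A short sketch of the InteractiveSession behavior just described (again via the tf.compat.v1 shim, assumed so it also runs on TF 2.x):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

sess = tf.InteractiveSession()  # installs itself as the default session

x = tf.Variable(3, name='x')
x.initializer.run()  # no `with` block: run()/eval() find the default session
result = x.eval()

sess.close()
print(result)  # 3
```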
Graph operations
x1 = tf.Variable(1)
x1.graph is tf.get_default_graph()  # True

graph = tf.Graph()
with graph.as_default():
    x2 = tf.Variable(2)

x2.graph is graph  # True
x2.graph is tf.get_default_graph()  # False
- Any node you create is automatically placed in the default graph
- You can also assign a node to a specific graph, which matters when a program uses multiple graphs.
Node lifetime
w = tf.constant(3)
x = w + 2
y = x + 5
z = x * 3

with tf.Session() as sess:
    print(y.eval())
    print(z.eval())
- In this code, evaluating y requires computing x and w first; evaluating z also needs x and w, but the second run cannot reuse the results of the first
- After a run of the graph, all values except variables are dropped; a variable's lifetime starts at initialization and ends when the session is closed.
- To make evaluation more efficient, compute y and z in a single run:
with tf.Session() as sess:
    y_val, z_val = sess.run([y, z])
    print(y_val)
    print(z_val)