# Preface

TensorFlow is a neural-network library named after flowing tensors, so to help people grasp more intuitively what "flowing tensors" means, Google built TensorBoard, which lets us see visually how the graph we wrote actually "flows" (that's just my own take). Joking aside, how does Google itself describe TensorBoard?

> The computations you'll use TensorFlow for - like training a massive deep neural network - can be complex and confusing. To make it easier to understand, debug, and optimize TensorFlow programs, we've included a suite of visualization tools called TensorBoard. You can use TensorBoard to visualize your TensorFlow graph, plot quantitative metrics about the execution of your graph, and show additional data like images that pass through it.

Tip: the browsers officially supported by TensorBoard are Chrome and Firefox, so it is best to open TensorBoard in one of those two.

# Implementing graphs

```python
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt

def add_layer(layoutname, inputs, in_size, out_size, activation_function=None):
    Weights = tf.Variable(tf.random_normal([in_size, out_size]))
    biases = tf.Variable(tf.zeros([1, out_size]) + 0.1)
    Wx_plus_b = tf.matmul(inputs, Weights) + biases
    if activation_function is None:
        outputs = Wx_plus_b
    else:
        outputs = activation_function(Wx_plus_b)
    return outputs

# Training data: y = x^2 - 0.5 plus Gaussian noise
x_data = np.linspace(-1, 1, 300)[:, np.newaxis]
noise = np.random.normal(0, 0.05, x_data.shape)
y_data = np.square(x_data) - 0.5 + noise

xs = tf.placeholder(tf.float32, [None, 1])
ys = tf.placeholder(tf.float32, [None, 1])

l1 = add_layer('first_layer', xs, 1, 10, activation_function=tf.nn.relu)
prediction = add_layer('second_layer', l1, 10, 1, activation_function=None)

loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction), reduction_indices=[1]))
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

init = tf.global_variables_initializer()
with tf.Session() as sess:
    fig = plt.figure()
    ax = fig.add_subplot(1, 1, 1)
    ax.scatter(x_data, y_data)
    plt.show(block=False)
    sess.run(init)
    for train in range(1000):
        sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
        if train % 50 == 0:
            try:
                # Remove the previous fitted line before drawing a new one
                ax.lines.remove(lines[0])
            except Exception:
                pass
            print(train, sess.run(loss, feed_dict={xs: x_data, ys: y_data}))
            prediction_value = sess.run(prediction, feed_dict={xs: x_data})
            lines = ax.plot(x_data, prediction_value, 'r-', lw=5)
            plt.pause(0.1)
```
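The loss above is a mean squared error: the squared residual is summed along axis 1 (each sample's output dimensions) and the result is averaged over the batch. As a quick sanity check, the same quantity can be computed in plain NumPy (a small sketch; the array values here are made up for illustration):

```python
import numpy as np

# Toy batch: 3 samples, 1 output dimension each (values made up for illustration)
ys = np.array([[0.0], [1.0], [2.0]])
prediction = np.array([[0.5], [1.0], [1.0]])

# Equivalent to tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction), reduction_indices=[1]))
per_sample = np.sum(np.square(ys - prediction), axis=1)  # squared error per sample
loss = np.mean(per_sample)                               # average over the batch

print(per_sample)  # -> [0.25 0.   1.  ]
print(loss)        # -> 0.4166666666666667
```

With one output dimension the inner sum is trivial, but the same expression works unchanged for multi-output networks.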

```python
with tf.name_scope('inputs'):
    xs = tf.placeholder(tf.float32, [None, 1], name="x_input")
    ys = tf.placeholder(tf.float32, [None, 1], name="y_input")
```

To get the x_input and y_input labels shown above, we only need to append name='display name' when declaring each placeholder.

```python
def add_layer(layoutname, inputs, in_size, out_size, activation_function=None):
    with tf.name_scope(layoutname):
        with tf.name_scope('weights'):
            Weights = tf.Variable(tf.random_normal([in_size, out_size]), name='W')
        with tf.name_scope('biases'):
            biases = tf.Variable(tf.zeros([1, out_size]) + 0.1, name='b')
        with tf.name_scope('Wx_plus_b'):
            Wx_plus_b = tf.add(tf.matmul(inputs, Weights), biases)
        if activation_function is None:
            outputs = Wx_plus_b
        else:
            outputs = activation_function(Wx_plus_b)
        return outputs
```

Inside the layer node there are also weights, biases, Wx_plus_b, and so on. These are sub-scopes, so we define each of them separately as well, and additionally give the weights and biases a name value. The activation-function node is generated automatically, so we don't need to handle it ourselves.

loss and train are handled similarly. After writing the tf.name_scope('...') blocks, we still need to write the graph out to a file. TensorFlow has packaged this up for us too: we just call writer = tf.summary.FileWriter("logs/", sess.graph), where "logs/" is the path the file is saved under. The complete code:

```python
import tensorflow as tf
import numpy as np

def add_layer(layoutname, inputs, in_size, out_size, activation_function=None):
    with tf.name_scope(layoutname):
        with tf.name_scope('weights'):
            Weights = tf.Variable(tf.random_normal([in_size, out_size]), name='W')
        with tf.name_scope('biases'):
            biases = tf.Variable(tf.zeros([1, out_size]) + 0.1, name='b')
        with tf.name_scope('Wx_plus_b'):
            Wx_plus_b = tf.add(tf.matmul(inputs, Weights), biases)
        if activation_function is None:
            outputs = Wx_plus_b
        else:
            outputs = activation_function(Wx_plus_b)
        return outputs

x_data = np.linspace(-1, 1, 300)[:, np.newaxis]
noise = np.random.normal(0, 0.09, x_data.shape)
y_data = np.square(x_data) - 0.05 + noise

with tf.name_scope('inputs'):
    xs = tf.placeholder(tf.float32, [None, 1], name="x_in")
    ys = tf.placeholder(tf.float32, [None, 1], name="y_in")

l1 = add_layer("first_layer", xs, 1, 10, activation_function=tf.nn.relu)
prediction = add_layer('second_layout', l1, 10, 1, activation_function=None)

with tf.name_scope('loss'):
    loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction), reduction_indices=[1]))
with tf.name_scope('train'):
    train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

init = tf.global_variables_initializer()
with tf.Session() as sess:
    # Write the graph definition to logs/ so TensorBoard can display it
    writer = tf.summary.FileWriter("logs/", sess.graph)
    sess.run(init)
```

Then launch TensorBoard with either of the following commands:

```shell
python tensorflow/tensorboard/tensorboard.py --logdir=path/to/log-directory

tensorboard --logdir=/path/to/log-directory
```

# Afterword

Building our network with TensorFlow plus TensorBoard's Graphs really is very handy, and Google obviously did not give TensorBoard just this one feature: it can also monitor the training process, letting us watch visually how parameters change during training, even how images and audio change, and it can show distribution plots of the training data. So TensorBoard and TensorFlow run deep; keep learning step by step!
PS: Thanks to Morvan Zhou (周莫煩) for his [machine learning series]