
How to do it...

This section covers how to visualize TensorFlow models and their output in TensorBoard.

  1. To visualize summaries and graphs, data from TensorFlow can be exported using the FileWriter command from the summary module. A default session graph can be added using the following command:
# Create Writer Obj for log
log_writer = tf$summary$FileWriter('c:/log', sess$graph)

The graph for logistic regression developed using the preceding code is shown in the following screenshot:

Visualization of the logistic regression graph in TensorBoard
Details about symbol descriptions on TensorBoard can be found at https://www.tensorflow.org/get_started/graph_viz.
  2. Similarly, other variable summaries can be added to TensorBoard using the appropriate summary operations, as shown in the following code:
# Add histogram summaries for the weight and bias variables
w_hist = tf$summary$histogram("weights", W)
b_hist = tf$summary$histogram("biases", b)

Summaries are a useful way to monitor how the model is performing. For example, in the preceding case, the cost function on the training and test sets can be compared to assess optimization progress and convergence.

  3. Create a cross-entropy evaluation for the test set. An example script that builds the cross-entropy cost function for the test data is shown in the following command:
# Set up cross entropy for the test dataset
nRowt <- nrow(occupancy_test)
xt <- tf$constant(unlist(occupancy_test[, xFeatures]), shape=c(nRowt, nFeatures), dtype=np$float32)
logits_t <- tf$matmul(xt, W) + b
ypredt <- tf$nn$sigmoid(logits_t)
yt_ <- tf$constant(unlist(occupancy_test[, yFeatures]), dtype="float32", shape=c(nRowt, 1L))
# Note: sigmoid_cross_entropy_with_logits expects raw logits, not sigmoid output
cross_entropy_tst <- tf$reduce_mean(tf$nn$sigmoid_cross_entropy_with_logits(labels=yt_, logits=logits_t, name="cross_entropy_tst"))

The preceding code mirrors the training cross-entropy computation, applied to a different dataset. The duplication can be reduced by writing a function that returns the required tensor objects for a given dataset.
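As a sketch, such a helper might look as follows. The function name and signature are illustrative (not from the original recipe), and it assumes the same W, b, xFeatures, and yFeatures objects defined earlier:

```r
# Hypothetical helper: builds the prediction and cross-entropy tensors
# for an arbitrary dataset, so train and test share one definition
build_cross_entropy <- function(dataset, W, b, xFeatures, yFeatures, name) {
  nRows <- nrow(dataset)
  x  <- tf$constant(unlist(dataset[, xFeatures]),
                    shape=c(nRows, length(xFeatures)), dtype=np$float32)
  y_ <- tf$constant(unlist(dataset[, yFeatures]),
                    dtype="float32", shape=c(nRows, 1L))
  logits <- tf$matmul(x, W) + b
  cross_entropy <- tf$reduce_mean(
    tf$nn$sigmoid_cross_entropy_with_logits(labels=y_, logits=logits, name=name))
  # Return both the sigmoid predictions and the cost tensor
  list(ypred=tf$nn$sigmoid(logits), cross_entropy=cross_entropy)
}

# Usage: build tensors for the train and test sets from one definition
# train_t <- build_cross_entropy(occupancy_train, W, b, xFeatures, yFeatures, "ce_train")
# test_t  <- build_cross_entropy(occupancy_test,  W, b, xFeatures, yFeatures, "ce_test")
```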

  4. Add the summary variables to be collected:
# Add summary ops to collect data
w_hist = tf$summary$histogram("weights", W)
b_hist = tf$summary$histogram("biases", b)
crossEntropySummary <- tf$summary$scalar("costFunction", cross_entropy)
crossEntropyTstSummary <- tf$summary$scalar("costFunction_test", cross_entropy_tst)

The script defines the summary events to be logged in the file.

  5. Open the writer object, log_writer. It writes the default graph to the location c:/log:
# Create Writer Obj for log
log_writer = tf$summary$FileWriter('c:/log', sess$graph)
  6. Run the optimization and collect the summaries:
for (step in 1:2500) {
  sess$run(optimizer)

  # Evaluate performance on training and test data every 50 iterations
  if (step %% 50 == 0) {
    ### Performance on train data
    ypred <- sess$run(tf$nn$sigmoid(tf$matmul(x, W) + b))
    roc_obj <- roc(occupancy_train[, yFeatures], as.numeric(ypred))

    ### Performance on test data
    ypredt <- sess$run(tf$nn$sigmoid(tf$matmul(xt, W) + b))
    roc_objt <- roc(occupancy_test[, yFeatures], as.numeric(ypredt))
    cat("Train AUC: ", auc(roc_obj), " Test AUC: ", auc(roc_objt), "\n")

    # Save summaries of the bias and weight histograms and cost functions
    log_writer$add_summary(sess$run(b_hist), global_step=step)
    log_writer$add_summary(sess$run(w_hist), global_step=step)
    log_writer$add_summary(sess$run(crossEntropySummary), global_step=step)
    log_writer$add_summary(sess$run(crossEntropyTstSummary), global_step=step)
  }
}
  7. Collect all the summaries into a single tensor using the merge_all command from the summary module:
summary = tf$summary$merge_all()
  8. Write the summaries to the log file using the log_writer object:
log_writer = tf$summary$FileWriter('c:/log', sess$graph)
summary_str = sess$run(summary)
log_writer$add_summary(summary_str, step)
log_writer$close()
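Once the log directory is populated, the results can be inspected by starting TensorBoard and pointing it at the log location. This assumes the tensorboard command-line tool that ships with TensorFlow is available on the PATH:

```shell
# Launch TensorBoard against the log directory used above,
# then open http://localhost:6006 in a browser
tensorboard --logdir=c:/log
```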