
keras (version 0.3.5)

callback_tensorboard: TensorBoard basic visualizations

Description

This callback writes a log for TensorBoard, which allows you to visualize dynamic graphs of your training and test metrics, as well as activation histograms for the different layers in your model.

Usage

callback_tensorboard(log_dir = "./logs", histogram_freq = 0,
  write_graph = TRUE, write_images = FALSE, embeddings_freq = 0,
  embeddings_layer_names = NULL, embeddings_metadata = NULL)

Arguments

log_dir

the path of the directory where the log files to be parsed by TensorBoard will be saved.

histogram_freq

frequency (in epochs) at which to compute activation histograms for the layers of the model. If set to 0, histograms won't be computed.

write_graph

whether to visualize the graph in TensorBoard. The log file can become quite large when write_graph is set to TRUE.

write_images

whether to write model weights so they can be visualized as an image in TensorBoard.

embeddings_freq

frequency (in epochs) at which selected embedding layers will be saved.

embeddings_layer_names

a list of names of layers to keep an eye on. If NULL or an empty list, all embedding layers will be watched.

embeddings_metadata

a named list mapping layer names to file names in which metadata for each embedding layer is saved. See the TensorFlow documentation for details about the metadata file format. If the same metadata file is to be used for all embedding layers, a single string can be passed.

Details

TensorBoard is a visualization tool provided with TensorFlow.

If you have installed TensorFlow with pip, you should be able to launch TensorBoard from the command line:

tensorboard --logdir=/full_path_to_your_logs
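If you are working from R rather than the shell, the tensorboard() helper exported by the tensorflow R package (and re-exported by keras) can launch the same viewer. A minimal sketch, assuming that helper is available in your installed version:

# assumes the tensorflow R package, which provides tensorboard(), is installed
library(tensorflow)
tensorboard(log_dir = "logs")  # start TensorBoard pointed at the log directory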

You can find more information about TensorBoard in the TensorFlow documentation (https://www.tensorflow.org/tensorboard).

See Also

Other callbacks: callback_csv_logger, callback_early_stopping, callback_lambda, callback_learning_rate_scheduler, callback_model_checkpoint, callback_progbar_logger, callback_reduce_lr_on_plateau, callback_remote_monitor
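
Examples

A minimal sketch of attaching the callback during training. This is not taken from the package examples; the model architecture and the x_train / y_train objects are illustrative placeholders.

library(keras)

# illustrative model; replace with your own architecture
model <- keras_model_sequential() %>%
  layer_dense(units = 32, activation = "relu", input_shape = c(100)) %>%
  layer_dense(units = 10, activation = "softmax")

model %>% compile(
  loss = "categorical_crossentropy",
  optimizer = "rmsprop",
  metrics = c("accuracy")
)

# x_train / y_train are placeholder training data
model %>% fit(
  x_train, y_train,
  epochs = 10,
  validation_split = 0.2,
  callbacks = list(
    callback_tensorboard(log_dir = "logs/run_1", histogram_freq = 1)
  )
)

After training starts, launch TensorBoard as shown in Details, pointing --logdir at "logs/run_1".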