sae_encode: Stacked Autoencoder - Encode
Description
Creates a deep-learning stacked autoencoder to encode a sequence of observations.
The autoencoder layers are based on the DAL Toolbox Vanilla Autoencoder.
It wraps the PyTorch library.
Usage
sae_encode(
input_size,
encoding_size,
batch_size = 32,
num_epochs = 1000,
learning_rate = 0.001,
k = 3
)
Value
A sae_encode_decode object.
Arguments
- input_size
size of the input layer (number of features per observation)
- encoding_size
size of the encoded (latent) representation
- batch_size
batch size for mini-batch learning
- num_epochs
number of epochs for training
- learning_rate
learning rate for the optimizer
- k
number of autoencoder (AE) layers in the stack
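The arguments above can be sketched in a short usage example. This is a minimal, illustrative sketch: it assumes the package is loaded as daltoolbox, that sae_encode objects follow the fit()/transform() convention used by other DAL Toolbox transforms, and the toy data are invented for illustration.

```r
# Minimal sketch (assumption: sae_encode objects follow the DAL Toolbox
# fit()/transform() convention; the data below are illustrative only).
library(daltoolbox)

# toy data: 100 observations with 5 features
data <- matrix(rnorm(500), nrow = 100, ncol = 5)

# stacked autoencoder: 5 inputs compressed to a 3-dimensional encoding,
# built from k = 3 autoencoder layers
auto <- sae_encode(input_size = 5, encoding_size = 3,
                   num_epochs = 300, k = 3)

# fit the model, then encode the observations
auto <- fit(auto, data)
encoding <- transform(auto, data)

# encoding should have encoding_size (here 3) columns
dim(encoding)
```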
Examples
See a complete example at https://nbviewer.org/github/cefet-rj-dal/daltoolbox-examples