Ruta

Software for unsupervised deep architectures


Get straightforward access to unsupervised deep neural networks, from building their architecture to training and evaluating them

Get started

How to install

To develop Ruta models, you will need to install its dependencies first and then get the package from CRAN.

Dependencies

Ruta is based on the well-known open source deep learning library Keras and its R interface, which is integrated into TensorFlow. The easiest way to install them is the keras::install_keras() function. Depending on whether you want to use the system installation, a Conda environment or a virtualenv, you may also need to call use_condaenv() or use_virtualenv() from reticulate.
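
A minimal sketch of that setup from an R session (the environment names below are illustrative examples, not requirements):

# Install the R interface if needed, then let it set up the Python backend
install.packages("keras")
keras::install_keras()

# If you manage the Python environment yourself, point reticulate at it
# before Keras is loaded (environment names here are hypothetical):
# reticulate::use_condaenv("r-keras", required = TRUE)
# reticulate::use_virtualenv("~/.virtualenvs/r-keras", required = TRUE)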

Another straightforward way to install these dependencies is a system-wide (sudo pip install) or user-wide (pip install --user) installation with pip. This is generally not recommended unless you are sure you will not need alternative versions or risk clashes with other packages. The following shell command installs all libraries expected by Keras:

$ pip install --user tensorflow tensorflow-hub tensorflow-datasets scipy requests pyyaml Pillow h5py pandas pydot

Otherwise, you can follow the official installation guides for TensorFlow and Keras.

Check whether Keras is accessible from R by running:

keras::is_keras_available() # should return TRUE

Ruta package

From an R interpreter such as the R REPL or the RStudio console, run one of the following commands to get the Ruta package:

# Get Ruta from CRAN
install.packages("ruta")

# Or get the latest development version from GitHub
devtools::install_github("fdavidcl/ruta")

All R dependencies will be automatically installed. These include the Keras R interface and purrr.

First steps

The easiest way to start working with Ruta is the autoencode() function. It lets you select a type of autoencoder and transform the feature space of a data set into another one with desirable properties that depend on the chosen type.

library(ruta)
iris[, 1:4] |> as.matrix() |> autoencode(2, type = "denoising")
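
Since autoencode() returns the encoded data directly, the result can be stored and inspected. A minimal sketch along those lines, assuming ruta is attached as above (variable names are illustrative):

# Encode the four iris features into two dimensions with a denoising autoencoder
encoded <- iris[, 1:4] |> as.matrix() |> autoencode(2, type = "denoising")
dim(encoded)  # expected: 150 x 2, one row per observation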

You can learn more about different variants of autoencoders by reading "A practical tutorial on autoencoders for nonlinear feature fusion".

Ruta provides the functionality to build diverse neural architectures (see autoencoder()), train them as autoencoders (see train()) and perform different tasks with the resulting models (see reconstruct()), including evaluation (see evaluate_mean_squared_error()). The following is a basic example of a natural pipeline with an autoencoder:

library(ruta)

# Shuffle rows and normalize the dataset
x <- iris[sample(nrow(iris)), 1:4] |> as.matrix() |> scale()
x_train <- x[1:100, ]
x_test <- x[101:150, ]

autoencoder(
  input() + dense(256) + dense(36, "tanh") + dense(256) + output("sigmoid"),
  loss = "mean_squared_error"
) |>
  make_contractive(weight = 1e-4) |>
  train(x_train, epochs = 40) |>
  evaluate_mean_squared_error(x_test)
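
Because train() returns the fitted learner, the model can also be kept in a variable and queried afterwards with encode() and reconstruct(). A minimal sketch assuming the x_train and x_test objects defined above (variable names are illustrative):

# Keep the trained model instead of piping it straight into evaluation
model <- autoencoder(
  input() + dense(2, "tanh") + output("sigmoid"),
  loss = "mean_squared_error"
) |>
  train(x_train, epochs = 40)

codes  <- encode(model, x_test)       # low-dimensional representations
recons <- reconstruct(model, x_test)  # reconstructions in the original space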

For more details, see other examples and the documentation.

Install: install.packages('ruta')

Monthly Downloads: 40

Version: 1.2.0

License: GPL (>= 3) | file LICENSE

Maintainer: David Charte

Last Published: January 8th, 2023

Functions in ruta (1.2.0)

autoencoder_sparse: Sparse autoencoder
generate.ruta_autoencoder_variational: Generate samples from a generative model
make_sparse: Add sparsity regularization to an autoencoder
dropout: Dropout layer
dense: Create a fully-connected neural layer
input: Create an input layer
evaluate_mean_squared_error: Evaluation metrics
configure: Configure a learner object with the associated Keras objects
evaluation_metric: Custom evaluation metrics
make_denoising: Add denoising behavior to any autoencoder
new_autoencoder: Create an autoencoder learner
decode: Retrieve decoding of encoded data
correntropy: Correntropy loss
is_robust: Detect whether an autoencoder is robust
output: Create an output layer
plot.ruta_network: Draw a neural network
make_robust: Add robust behavior to any autoencoder
make_contractive: Add contractive behavior to any autoencoder
loss_variational: Variational loss
is_sparse: Detect whether an autoencoder is sparse
noise_gaussian: Additive Gaussian noise
noise_ones: Filter to add ones noise
reconstruct: Retrieve reconstructions for input data
print.ruta_autoencoder: Inspect Ruta objects
to_keras.ruta_autoencoder: Extract Keras models from an autoencoder wrapper
noise: Noise generator
is_trained: Detect trained models
is_variational: Detect whether an autoencoder is variational
encoding_index: Get the index of the encoding
encode: Retrieve encoding of data
to_keras.ruta_loss_contraction: Obtain a Keras loss
to_keras.ruta_network: Build a Keras network
weight_decay: Weight decay
is_contractive: Detect whether an autoencoder is contractive
to_keras.ruta_filter: Get a Keras generator from a data filter
new_layer: Layer wrapper constructor
to_keras.ruta_layer_input: Convert Ruta layers onto Keras layers
noise_zeros: Filter to add zero noise
new_network: Sequential network constructor
noise_saltpepper: Filter to add salt-and-pepper noise
to_keras.ruta_layer_variational: Obtain a Keras block of layers for the variational autoencoder
train.ruta_autoencoder: Train a learner object with data
variational_block: Create a variational block of layers
+.ruta_network: Add layers to a network/Join networks
is_denoising: Detect whether an autoencoder is denoising
layer_keras: Custom layer from Keras
noise_cauchy: Additive Cauchy noise
sparsity: Sparsity regularization
save_as: Save and load Ruta models
[.ruta_network: Access subnetworks of a network
to_keras: Convert a Ruta object onto Keras objects and functions
to_keras.ruta_weight_decay: Obtain a Keras weight decay
to_keras.ruta_sparsity: Translate sparsity regularization to Keras regularizer
autoencoder_robust: Create a robust autoencoder
autoencode: Automatically compute an encoding of a data matrix
autoencoder: Create an autoencoder learner
autoencoder_variational: Build a variational autoencoder
contraction: Contractive loss
autoencoder_denoising: Create a denoising autoencoder
apply_filter.ruta_noise_zeros: Apply filters
add_weight_decay: Add weight decay to any autoencoder
as_network: Coercion to ruta_network
as_loss: Coercion to ruta_loss
autoencoder_contractive: Create a contractive autoencoder
conv: Create a convolutional layer