darch (version 0.12.0)

maxoutWeightUpdate: Updates the weights on maxout layers

Description

On maxout layers, only the weights of active units are altered; additionally, all weights within a pool must be the same.
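
To make the constraint concrete, the following is a minimal R sketch, not the darch internals; the pool layout (consecutive columns of the layer's weight matrix) and the use of averaging to re-tie the pool weights are assumptions made for illustration only.

# Force all columns belonging to the same maxout pool to be identical.
tiePoolWeights <- function(weights, poolSize = 2) {
  for (start in seq(1, ncol(weights), by = poolSize)) {
    pool <- start:min(start + poolSize - 1, ncol(weights))
    # every unit in the pool receives the same incoming weights
    weights[, pool] <- rowMeans(weights[, pool, drop = FALSE])
  }
  weights
}

set.seed(1)
W <- matrix(rnorm(12), nrow = 3)            # 3 inputs, 4 units => 2 pools of size 2
Winc <- matrix(rnorm(12, sd = 0.01), 3)     # scheduled weight increments
W <- tiePoolWeights(W + Winc, poolSize = 2)
all(W[, 1] == W[, 2])                       # TRUE: columns of a pool are equal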

Usage

maxoutWeightUpdate(darch, layerIndex, weightsInc, biasesInc, ...,
  weightDecay = getParameter(".darch.weightDecay", 0, darch),
  poolSize = getParameter(".darch.maxout.poolSize", 2, darch))

Arguments

darch

DArch instance.

layerIndex

Layer index within the network.

weightsInc

Matrix containing scheduled weight updates from the fine-tuning algorithm.

biasesInc

Bias weight updates.

...

Additional parameters, not used.

weightDecay

Weights are multiplied by (1 - weightDecay) before each update (see the short sketch after this argument list). Corresponds to the darch.weightDecay parameter of darch.default.

poolSize

Size of maxout pools, see parameter darch.maxout.poolSize of darch.
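
The decay factor referenced above is a simple scaling; a minimal sketch with made-up values (only the order of operations, decay before the increment, follows the description; everything else is illustrative):

weightDecay <- 0.001
W    <- matrix(rnorm(6), nrow = 2)             # current weights (toy values)
Winc <- matrix(rnorm(6, sd = 0.01), nrow = 2)  # scheduled increments
W <- W * (1 - weightDecay) + Winc              # shrink first, then add the update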

Value

The updated weights.

References

Goodfellow, Ian J., David Warde-Farley, Mehdi Mirza, Aaron C. Courville, and Yoshua Bengio (2013). "Maxout Networks". In: Proceedings of the 30th International Conference on Machine Learning, ICML 2013, Atlanta, GA, USA, 16-21 June 2013, pp. 1319-1327. URL: http://jmlr.org/proceedings/papers/v28/goodfellow13.html

See Also

Other weight update functions: weightDecayWeightUpdate

Examples

library(darch)
data(iris)
# Layer sizes of 0 are filled in from the data; the 50-unit maxout layer is
# split into 10 pools of 5 units each.
model <- darch(Species ~ ., iris, c(0, 50, 0),
  darch.unitFunction = c("maxoutUnit", "softmaxUnit"),
  darch.maxout.poolSize = 5, darch.maxout.unitFunction = "sigmoidUnit",
  darch.weightUpdateFunction = c("weightDecayWeightUpdate", "maxoutWeightUpdate"))
