gradDescent (version 3.0)

MGD: Momentum Gradient Descent (MGD) Method Learning Function

Description

A function to build a prediction model using the Momentum Gradient Descent (MGD) method.

Usage

MGD(dataTrain, alpha = 0.1, maxIter = 10, momentum = 0.9, seed = NULL)

Arguments

dataTrain

a data.frame representing the training data (\(m \times n\)), where \(m\) is the number of instances and \(n\) is the number of variables; the last column is the output variable. dataTrain must have at least two columns and ten rows of data that contain only numbers (integer or float).

alpha

a float value representing the learning rate. Default value is 0.1.

maxIter

the maximum number of iterations. Default value is 10.

momentum

a float value representing the momentum term, which adds a fraction of the previous update to the current one to give the learning process a constant speed-up. Default value is 0.9.

seed

an integer value used to seed the random number generator. Default value is NULL, which means the function will not set a fixed seed.

Value

a vector matrix of theta (coefficients) for the linear model.

Details

This function is based on SGD, with an optimization that speeds up learning by adding a momentum term: a fraction of the previous parameter update is carried over into the current one, so the parameters keep moving in a consistent direction across iterations.
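The update described above can be sketched in plain R as follows. Note that `mgd_sketch` and its full-batch gradient are illustrative assumptions, not the package's implementation: the package's MGD works SGD-style on sampled instances, while this sketch uses the whole training set each iteration for simplicity.

```r
# A minimal sketch of the momentum update rule (not the package's internal
# code): the velocity v accumulates a decaying sum of past gradients, and
# theta moves by v instead of by the raw gradient alone.
mgd_sketch <- function(X, y, alpha = 0.1, maxIter = 10, momentum = 0.9) {
  X <- cbind(1, X)                          # prepend an intercept column
  theta <- matrix(0, nrow = 1, ncol = ncol(X))
  v <- matrix(0, nrow = 1, ncol = ncol(X))  # velocity, initially zero
  for (i in seq_len(maxIter)) {
    error <- X %*% t(theta) - y             # residuals of the linear model
    grad <- t(error) %*% X / nrow(X)        # mean gradient of squared error
    v <- momentum * v - alpha * grad        # blend old velocity with new step
    theta <- theta + v                      # move along the velocity
  }
  theta
}
```

With momentum = 0, the sketch reduces to plain gradient descent; values near 0.9 let successive updates reinforce each other along consistent gradient directions.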

References

N. Qian, "On the momentum term in gradient descent learning algorithms," Neural Networks: The Official Journal of the International Neural Network Society, pp. 145-151 (1999).

See Also

AGD

Examples

##################################
## Learning and Build Model with MGD
## load R Package data
data(gradDescentRData)
## get z-factor data
dataSet <- gradDescentRData$CompressilbilityFactor
## split dataset
splitedDataSet <- splitData(dataSet)
## build model with MGD
MGDmodel <- MGD(splitedDataSet$dataTrain)
## show result
print(MGDmodel)

