The regression problem Friedman 2 as described in Friedman (1991) and Breiman (1996). Inputs are 4 independent variables uniformly distributed over the ranges
$$0 \le x_1 \le 100, \quad 40\pi \le x_2 \le 560\pi, \quad 0 \le x_3 \le 1, \quad 1 \le x_4 \le 11.$$
The outputs are created according to the formula
$$y = \sqrt{x_1^2 + \left(x_2 x_3 - \frac{1}{x_2 x_4}\right)^2} + e,$$
where \(e \sim N(0, \mathrm{sd}^2)\).
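As a concrete illustration of the formula above, here is a minimal base-R sketch of the data-generating process. It is written independently of the package's internal implementation, and the function name friedman2_sketch is purely illustrative.

## Sketch: draw the four inputs uniformly over the stated ranges and
## apply the Friedman 2 formula, adding N(0, sd^2) noise.
friedman2_sketch <- function(n, sd = 125) {
  x1 <- runif(n, 0, 100)
  x2 <- runif(n, 40 * pi, 560 * pi)
  x3 <- runif(n, 0, 1)
  x4 <- runif(n, 1, 11)
  signal <- sqrt(x1^2 + (x2 * x3 - 1 / (x2 * x4))^2)
  list(x = cbind(x1, x2, x3, x4), y = signal + rnorm(n, mean = 0, sd = sd))
}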
sim_Friedman2(n, sd = 125)

Arguments:

n: number of data points to create

sd: standard deviation of noise. The default value of 125 gives a signal-to-noise ratio (i.e., the ratio of the standard deviations) of 3:1; thus, the variance of the function itself (without noise) accounts for 90% of the total variance.

Value:

Returns a list with components

x: input values (independent variables)

y: output values (dependent variable)
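A short usage sketch follows, assuming the bark package is installed; the component names x and y follow the Value description above, and the final check assumes the noise is purely additive as described, so that calling the function with sd = 0 returns the noise-free function values.

library(bark)
set.seed(42)
d <- sim_Friedman2(n = 500)   # default sd = 125
str(d$x)   # inputs: the four independent variables
str(d$y)   # outputs: noisy responses

## Rough check of the 3:1 signal-to-noise claim: with sd = 0 the returned
## y is the noise-free signal, whose standard deviation should be roughly
## three times the default noise level of 125.
s <- sim_Friedman2(n = 1e5, sd = 0)
sd(s$y) / 125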
Breiman, Leo (1996). Bagging predictors. Machine Learning 24, pages 123-140.

Friedman, Jerome H. (1991). Multivariate adaptive regression splines. The Annals of Statistics 19(1), pages 1-67.
Other bark simulation functions: sim_Friedman1(), sim_Friedman3(), sim_circle()
Other bark functions: bark(), bark-package, bark-package-deprecated, sim_Friedman1(), sim_Friedman3(), sim_circle()