
fastR (version 1.1)

Coefficients of many simple linear regressions

Description

Computes the coefficients of the simple linear regression of y on each column of x.

Usage

allbetas(y, x, pvalue = FALSE)

Arguments

y
A numerical vector with the response variable. If y contains proportions or percentages, i.e. values between 0 and 1, the logit transformation is applied first and the transformed data are used.
x
A matrix with the data, where the rows denote the observations and the columns contain the independent variables.
pvalue
If set to TRUE, a hypothesis test that each slope (beta coefficient) is equal to zero is performed. The correlations between y and x are also produced.

Value

A matrix with the constant (alpha) and the slope (beta) for each simple linear regression.
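Each simple regression has a closed form: for column x_j, the slope is beta_j = cov(x_j, y) / var(x_j) and the constant is alpha_j = mean(y) - beta_j * mean(x_j), which is why all of them can be computed at once with vectorized operations. A minimal NumPy sketch of this idea (an illustration of the formula, not the package's implementation):

```python
import numpy as np

# Closed-form coefficients of many simple linear regressions:
# beta_j  = cov(x_j, y) / var(x_j)
# alpha_j = mean(y) - beta_j * mean(x_j)
rng = np.random.default_rng(0)
n, p = 100, 50
x = rng.normal(size=(n, p))   # p independent variables in columns
y = rng.normal(size=n)        # response variable

xm = x.mean(axis=0)
ym = y.mean()
# One pass over the centred data gives every slope at once
beta = (x - xm).T @ (y - ym) / ((x - xm) ** 2).sum(axis=0)
alpha = ym - beta * xm

# Cross-check one column against an explicit degree-1 least-squares fit
slope0, intercept0 = np.polyfit(x[:, 0], y, 1)
assert np.allclose([intercept0, slope0], [alpha[0], beta[0]])
```

Fitting each regression separately in a loop gives the same numbers; the vectorized form simply replaces p matrix factorizations with a handful of array operations, which is the source of the speed-up shown in the Examples below.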

See Also

correls, univglms, cova, cora

Examples

x <- matrix( rnorm(100 * 10000), ncol = 10000 )
y <- rnorm(100)
r <- cor(y, x)  ## correlation of y with each column of x
a <- allbetas(y, x)  ## coefficients of the simple linear regression of y on each column of x
## the same coefficients computed one regression at a time
b <- matrix(nrow = 10000, ncol = 2)
for (i in 1:10000)  b[i, ] <- coef( lm.fit( cbind(1, x[, i]), y ) )

## compare the running times of the two approaches
x <- matrix( rnorm(100 * 10000), ncol = 10000 )
y <- rnorm(100)
system.time( allbetas(y, x) )
system.time( for (i in 1:10000)  b[i, ] <- coef( lm.fit( cbind(1, x[, i]), y ) ) )
