This convenience function calculates A3 results specifically for linear regressions. Because it is built on R's glm function, it also supports logistic regressions and other link functions via the family argument. For other types of models, use the more general a3 function.
Usage
a3.lm(formula, data, family = gaussian, ...)
Arguments
formula
the regression formula.
data
a data frame containing the data to be used in the model fit.
family
the regression family. Typically 'gaussian' for linear regressions.
...
additional arguments passed to a3.base.
Value
An S3 A3 object; see a3.base for details.
Examples
## Standard linear regression results:
summary(lm(rating ~ ., attitude))
## A3 linear regression results:
# In practice, p.acc should be <= 0.01 in order
# to obtain fine-grained p values.
a3.lm(rating ~ ., attitude, p.acc = 0.1)
# This is equivalent both to:
a3(rating ~ ., attitude, glm, model.args = list(family = gaussian), p.acc = 0.1)
# and also to:
a3(rating ~ ., attitude, lm, p.acc = 0.1)
Results
R version 3.3.1 (2016-06-21) -- "Bug in Your Hair"
Copyright (C) 2016 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)
> library(A3)
Loading required package: xtable
Loading required package: pbapply
> ### Name: a3.lm
> ### Title: A3 for Linear Regressions
> ### Aliases: a3.lm
>
> ### ** Examples
>
> ## Standard linear regression results:
>
> summary(lm(rating ~ ., attitude))
Call:
lm(formula = rating ~ ., data = attitude)
Residuals:
     Min       1Q   Median       3Q      Max 
-10.9418  -4.3555   0.3158   5.5425  11.5990 
Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept) 10.78708   11.58926   0.931 0.361634    
complaints   0.61319    0.16098   3.809 0.000903 ***
privileges  -0.07305    0.13572  -0.538 0.595594    
learning     0.32033    0.16852   1.901 0.069925 .  
raises       0.08173    0.22148   0.369 0.715480    
critical     0.03838    0.14700   0.261 0.796334    
advance     -0.21706    0.17821  -1.218 0.235577    
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
Residual standard error: 7.068 on 23 degrees of freedom
Multiple R-squared: 0.7326, Adjusted R-squared: 0.6628
F-statistic: 10.5 on 6 and 23 DF, p-value: 1.24e-05
>
> ## A3 linear regression results:
>
> # In practice, p.acc should be <= 0.01 in order
> # to obtain fine grained p values.
>
> a3.lm(rating ~ ., attitude, p.acc = 0.1)
             Average Slope    CV R^2  p value
-Full Model-                  57.5 %    < 0.1
(Intercept)        10.7871   - 4.5 %      0.7
complaints          0.6132  + 21.9 %    < 0.1
privileges         -0.0731   - 3.7 %      0.7
learning            0.3203   + 8.6 %      0.1
raises              0.0817   - 1.6 %      0.2
critical            0.0384   - 4.1 %      0.4
advance            -0.2171   + 2.8 %    < 0.1
>
> # This is equivalent both to:
>
> a3(rating ~ ., attitude, glm, model.args = list(family = gaussian), p.acc = 0.1)
             Average Slope    CV R^2  p value
-Full Model-                  57.4 %    < 0.1
(Intercept)        10.7871   - 3.9 %      0.8
complaints          0.6132  + 21.5 %    < 0.1
privileges         -0.0731   - 2.4 %      0.5
learning            0.3203   + 3.3 %      0.1
raises              0.0817   - 2.2 %      0.7
critical            0.0384   - 4.3 %      1.0
advance            -0.2171   - 0.6 %      0.2
>
> # and also to:
>
> a3(rating ~ ., attitude, lm, p.acc = 0.1)
             Average Slope    CV R^2  p value
-Full Model-                  50.1 %    < 0.1
(Intercept)        10.7871   - 7.3 %      1.0
complaints          0.6132  + 18.3 %    < 0.1
privileges         -0.0731   - 7.1 %      0.8
learning            0.3203   + 4.2 %      0.2
raises              0.0817   - 4.8 %      0.9
critical            0.0384   - 5.2 %      1.0
advance            -0.2171   - 1.6 %      0.4
>