linear.hypothesis {car}        R Documentation
Description

Generic function for testing a linear hypothesis, and methods for linear
models, generalized linear models, and other models that have methods for
coef and vcov.
Usage

linear.hypothesis(model, ...)

lht(model, ...)

## Default S3 method:
linear.hypothesis(model, hypothesis.matrix, rhs=NULL,
    test=c("Chisq", "F"), vcov.=NULL, verbose=FALSE, ...)

## S3 method for class 'lm':
linear.hypothesis(model, hypothesis.matrix, rhs=NULL,
    test=c("F", "Chisq"), vcov.=NULL, white.adjust=FALSE, ...)

## S3 method for class 'glm':
linear.hypothesis(model, ...)
Arguments

model
    fitted model object. The default method works for models for which the
    estimated parameters can be retrieved by coef and the corresponding
    estimated covariance matrix by vcov. See the Details for more information.

hypothesis.matrix
    matrix (or vector) giving linear combinations of coefficients by rows, or
    a character vector giving the hypothesis in symbolic form (see Details).

rhs
    right-hand-side vector for the hypothesis, with as many entries as rows in
    the hypothesis matrix; can be omitted, in which case it defaults to a
    vector of zeroes.

test
    character specifying whether to compute the finite-sample F statistic
    (with approximate F distribution) or the large-sample Chi-squared
    statistic (with asymptotic Chi-squared distribution).

vcov.
    a function for estimating the covariance matrix of the regression
    coefficients, e.g., hccm, or an estimated covariance matrix for model.
    See also white.adjust.

white.adjust
    logical or character. Convenience interface to hccm (instead of using the
    argument vcov.). Can be set either to a character specifying the type
    argument of hccm, or to TRUE, in which case "hc3" is used implicitly.
    Retained for backwards compatibility.

verbose
    if TRUE, the hypothesis matrix and right-hand-side vector are printed to
    standard output; if FALSE (the default), the hypothesis is only printed
    in symbolic form.

...
    arguments to pass down.
Details

Computes either a finite-sample F statistic or an asymptotic Chi-squared
statistic for carrying out a Wald-test-based comparison between a model and a
linearly restricted model. The default method works with any model object for
which the coefficient vector can be retrieved by coef and the
coefficient-covariance matrix by vcov (otherwise the argument vcov. has to be
set explicitly). For computing the F statistic (but not the Chi-squared
statistic) a df.residual method needs to be available. If a formula method
exists, it is used for pretty printing.
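As a brief illustration (a sketch using the cars data from base R, not part of
the package's own examples): the covariance matrix can be supplied explicitly
through vcov., and test = "Chisq" requests the large-sample statistic, which
does not require a df.residual method.

    ## sketch: explicit covariance matrix and large-sample test
    mod.cars <- lm(dist ~ speed, data = cars)
    linear.hypothesis(mod.cars, "speed = 3",
                      vcov. = vcov(mod.cars),  # an estimated covariance matrix
                      test = "Chisq")          # asymptotic Chi-squared statistic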
The method for "lm"
objects calls the default method, but it
changes the default test to "F"
, supports the convenience argument
white.adjust
(for backwards compatibility), and enhances the output
by residual sums of squares. For "glm"
objects just the default
method is called (bypassing the "lm"
method).
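For instance (a sketch with a hypothetical logistic regression on the base-R
mtcars data, not taken from the package examples), a "glm" model is handed to
the default method, so the large-sample Chi-squared test is reported:

    ## sketch: Wald test for a glm; the default method is used,
    ## so "Chisq" is the default test
    mod.mtcars <- glm(am ~ hp + wt, family = binomial, data = mtcars)
    linear.hypothesis(mod.mtcars, "hp = 0")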
The function lht also dispatches to linear.hypothesis.
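For example (a sketch reusing mod.davis, which is defined in the Examples
below), the two calls are interchangeable:

    ## lht() is simply a short-hand for linear.hypothesis()
    lht(mod.davis, "repwt = 1")
    linear.hypothesis(mod.davis, "repwt = 1")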
The hypothesis matrix can be supplied as a numeric matrix (or vector), the rows of which specify linear combinations of the model coefficients. These linear combinations are tested against the corresponding entries of the right-hand-side vector, which defaults to a vector of zeroes.
Alternatively, the hypothesis can be specified symbolically as a character vector with one or more elements, each of which gives either a linear combination of coefficients, or a linear equation in the coefficients (i.e., with both a left and right side separated by an equals sign). Components of a linear expression or linear equation can consist of numeric constants, or numeric constants multiplying coefficient names (in which case the number precedes the coefficient, and may be separated from it by spaces or an asterisk); constants of 1 or -1 may be omitted. Spaces are always optional. Components are separated by positive or negative signs. See the examples below.
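To make the correspondence concrete (a sketch using mod.duncan, defined in the
Examples below, whose coefficients are (Intercept), income, and education),
the single restriction income = education can be written either as a one-row
numeric hypothesis matrix or in symbolic form:

    ## numeric form: 0*(Intercept) + 1*income - 1*education = 0
    linear.hypothesis(mod.duncan, matrix(c(0, 1, -1), nrow = 1), rhs = 0)
    ## equivalent symbolic form
    linear.hypothesis(mod.duncan, "income = education")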
Value

An object of class "anova" which contains the residual degrees of freedom in
the model, the difference in degrees of freedom, the Wald statistic (either
"F" or "Chisq"), and the corresponding p value.
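Because the result is an "anova" data frame, the statistic and p value can be
extracted by ordinary indexing (a sketch reusing mod.davis from the Examples
below; the column names shown assume the F test of the "lm" method):

    h <- linear.hypothesis(mod.davis, "repwt = 1")
    h$F[2]          # the F statistic for the restriction
    h[2, "Pr(>F)"]  # the corresponding p value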
Author(s)

Achim Zeileis and John Fox jfox@mcmaster.ca
References

Fox, J. (1997) Applied Regression, Linear Models, and Related Methods. Sage.
See Also

anova, Anova, waldtest, hccm, vcovHC, vcovHAC, coef, vcov
Examples

mod.davis <- lm(weight ~ repwt, data=Davis)

## the following are equivalent:
linear.hypothesis(mod.davis, diag(2), c(0,1))
linear.hypothesis(mod.davis, c("(Intercept) = 0", "repwt = 1"))
linear.hypothesis(mod.davis, c("(Intercept)", "repwt"), c(0,1))
linear.hypothesis(mod.davis, c("(Intercept)", "repwt = 1"))

## use asymptotic Chi-squared statistic
linear.hypothesis(mod.davis, c("(Intercept) = 0", "repwt = 1"), test = "Chisq")

## the following are equivalent:
## use HC3 standard errors via white.adjust option
linear.hypothesis(mod.davis, c("(Intercept) = 0", "repwt = 1"),
    white.adjust = TRUE)
## covariance matrix *function*
linear.hypothesis(mod.davis, c("(Intercept) = 0", "repwt = 1"), vcov = hccm)
## covariance matrix *estimate*
linear.hypothesis(mod.davis, c("(Intercept) = 0", "repwt = 1"),
    vcov = hccm(mod.davis, type = "hc3"))

mod.duncan <- lm(prestige ~ income + education, data=Duncan)

## the following are all equivalent:
linear.hypothesis(mod.duncan, "1*income - 1*education = 0")
linear.hypothesis(mod.duncan, "income = education")
linear.hypothesis(mod.duncan, "income - education")
linear.hypothesis(mod.duncan, "1income - 1education = 0")
linear.hypothesis(mod.duncan, "0 = 1*income - 1*education")
linear.hypothesis(mod.duncan, "income-education=0")
linear.hypothesis(mod.duncan, "1*income - 1*education + 1 = 1")
linear.hypothesis(mod.duncan, "2income = 2*education")

mod.duncan.2 <- lm(prestige ~ type*(income + education), data=Duncan)
coefs <- names(coef(mod.duncan.2))

## test against the null model (i.e., only the intercept is not set to 0)
linear.hypothesis(mod.duncan.2, coefs[-1])

## test all interaction coefficients equal to 0
linear.hypothesis(mod.duncan.2, coefs[grep(":", coefs)], verbose=TRUE)