Linear model

Linear models and linear comparisons are statistical methods for comparing how well different models match a given set of data. A linear model is usually written as:

:<math>Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_n X_n + \varepsilon</math>

In the comparison, two different models are matched to the data; <math>\varepsilon</math> measures the error, that is, how poorly the model and the data match. The larger <math>\varepsilon</math> is, the worse the match. However, models must be penalized for the number of free parameters (<math>\beta</math>) that they possess. A theoretical linear model with an infinite number of parameters can perfectly explain any data set, but this is not a valuable model. Usually the linear model a statistician is interested in is compared against the [[null hypothesis]] linear model, which has fewer free parameters; the more complicated model must therefore have a smaller <math>\varepsilon</math> in proportion to its extra free parameters to be [[statistically significant]]. The count of free parameters is referred to as the [[degrees of freedom]].
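A common way to carry out such a comparison is an F-test, which weighs the reduction in error against the extra free parameters the fuller model uses. The sketch below, in Python with NumPy and SciPy and with data invented purely for illustration, compares a one-predictor linear model against the intercept-only null model:

<pre>
# Minimal sketch (hypothetical data): compare a one-predictor linear model
# against the null (intercept-only) model with an F-test, which penalizes
# the fuller model for its extra free parameters (degrees of freedom).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical data: y depends linearly on x plus noise.
n = 50
x = rng.uniform(0.0, 10.0, n)
y = 2.0 + 1.5 * x + rng.normal(0.0, 2.0, n)

# Null model: intercept only (1 free parameter).
rss_null = np.sum((y - y.mean()) ** 2)
df_null = n - 1

# Fuller model: intercept + slope (2 free parameters), fit by least squares.
X = np.column_stack([np.ones(n), x])
beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
rss_full = np.sum((y - X @ beta) ** 2)
df_full = n - 2

# F statistic: improvement in fit per extra parameter, relative to the
# remaining error of the fuller model.
f_stat = ((rss_null - rss_full) / (df_null - df_full)) / (rss_full / df_full)
p_value = stats.f.sf(f_stat, df_null - df_full, df_full)

print(f"beta = {beta}, F = {f_stat:.2f}, p = {p_value:.4g}")
</pre>

A small p-value indicates that the reduction in error is larger than would be expected from the extra degree of freedom alone, so the fuller model is statistically significant.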
 
[[category:Probability and Statistics]]
