# Bayes Factor

The Bayes factor of model class M1 to model class M2 for a set of observations X is the ratio of their associated marginal likelihoods. The marginal likelihood of the observations under a model class is obtained by integrating the joint probability distribution of the observations and the model class parameters over the parameters θ, treating them as nuisance parameters:

$m_{\theta}(X|M) = \int_\theta p(X,\theta|M) \, d\theta = \int_\theta p(X|\theta , M) \, p(\theta|M) \, d\theta$
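As a concrete illustration of this integral, the sketch below approximates the marginal likelihood of a sequence of Bernoulli observations under a Beta prior on θ using a simple midpoint Riemann sum, and checks it against the known closed form $m = B(k+\alpha,\, n-k+\beta)/B(\alpha,\beta)$. The function names and the choice of a Beta-Bernoulli model are illustrative assumptions, not part of the original text.

```python
import math

def marginal_likelihood(successes, trials, alpha, beta, grid=10_000):
    """Approximate m(X|M) = ∫ p(X|θ,M) p(θ|M) dθ for a Bernoulli sequence
    under a Beta(alpha, beta) prior, via a midpoint Riemann sum over θ."""
    # Normalizing constant of the Beta prior: 1 / B(alpha, beta).
    prior_norm = math.gamma(alpha + beta) / (math.gamma(alpha) * math.gamma(beta))
    total = 0.0
    for i in range(grid):
        theta = (i + 0.5) / grid  # midpoint of the i-th subinterval of [0, 1]
        likelihood = theta ** successes * (1 - theta) ** (trials - successes)
        prior = prior_norm * theta ** (alpha - 1) * (1 - theta) ** (beta - 1)
        total += likelihood * prior
    return total / grid  # multiply by the subinterval width 1/grid

def beta_fn(a, b):
    """Beta function B(a, b) = Γ(a)Γ(b)/Γ(a+b), for the closed-form check."""
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

# 7 successes in 10 trials, uniform Beta(1, 1) prior.
approx = marginal_likelihood(7, 10, 1.0, 1.0)
exact = beta_fn(7 + 1, 10 - 7 + 1) / beta_fn(1.0, 1.0)
```

With a uniform prior the two agree to many decimal places, confirming that the integral form and the conjugate closed form compute the same quantity.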

The second form of the integral follows from the product rule of probability, showing that the marginal likelihood may be computed either directly from the joint distribution or from the conditional likelihood weighted by the parameter prior. The Bayes factor for model class M1 with parameters θ1 against model class M2 with parameters θ2 is then

$BF(X,M_1,M_2) = \frac{m_{\theta_1}(X|M_1)}{m_{\theta_2}(X|M_2)}$

By integrating out the model class parameters, the Bayes factor compares the prior-weighted predictive strengths of the two model classes with respect to the observed data. In particular, the model class with the largest posterior odds ratio (of which the Bayes factor is a principal component) against all competitors is the one that best explains the data. It then remains to infer the values of that model's parameters via parameter estimation.
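The comparison described above can be sketched end to end for a simple pair of model classes: M1, a Bernoulli model with θ free under a uniform Beta(1, 1) prior (whose marginal likelihood has the closed form $B(k+1,\, n-k+1)$), and M2, a point model that fixes θ = 1/2 and so has no parameters to integrate out. The specific models, data, and equal prior odds are assumptions chosen for illustration.

```python
import math

def beta_fn(a, b):
    """Beta function B(a, b) = Γ(a)Γ(b)/Γ(a+b)."""
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

def bernoulli_marginal(successes, trials, alpha, beta):
    """Closed-form marginal likelihood of a Bernoulli sequence under a
    Beta(alpha, beta) prior: B(k+α, n−k+β) / B(α, β)."""
    return (beta_fn(successes + alpha, trials - successes + beta)
            / beta_fn(alpha, beta))

# Observed data: 9 successes in 10 trials.
k, n = 9, 10

# M1: θ free, uniform Beta(1, 1) prior -> integrate θ out.
m1 = bernoulli_marginal(k, n, 1.0, 1.0)

# M2: θ fixed at 0.5 -> no free parameter, marginal likelihood is the
# likelihood itself.
m2 = 0.5 ** n

# Bayes factor of M1 to M2.
bf = m1 / m2

# Posterior odds = Bayes factor × prior odds; with equal prior odds the
# Bayes factor alone decides which model class better explains the data.
posterior_odds = bf * 1.0
```

Here the Bayes factor comes out at roughly 9.3 in favour of M1, after which parameter estimation (here, the Beta posterior over θ within M1) would pin down the parameter values.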