The likelihood function (often simply called the likelihood) is the joint probability mass (or probability density) of observed data, viewed as a function of the parameters of a statistical model.[1][2][3] That is, the likelihood function L(θ | x), which gives the likelihood of a parameter vector θ given the observed data x, is numerically equal to the probability function P(x | θ), which gives the probability of the data x under the assumption that the parameter vector θ is true: L(θ | x) = P(x | θ).
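The point that the likelihood and the probability are the same number read in two directions can be sketched with a coin-flip example (the data, 7 heads in 10 flips, and the parameter values are illustrative assumptions, not taken from the text):

```python
import math

def binomial_pmf(k: int, n: int, p: float) -> float:
    """Probability of k heads in n flips of a coin with heads-probability p."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

# Illustrative observed data: 7 heads in 10 flips.
k, n = 7, 10

# Read as a probability function: the parameter p is fixed, the data vary.
prob_of_data = binomial_pmf(k, n, 0.5)

# Read as a likelihood function: the data are fixed, the parameter varies.
def likelihood(p: float) -> float:
    return binomial_pmf(k, n, p)

# Evaluated at the same point, the two readings give the same number.
assert likelihood(0.5) == prob_of_data
```

The difference is purely one of interpretation: `binomial_pmf` sums to 1 over all data outcomes for a fixed `p`, but `likelihood` generally does not integrate to 1 over `p`, which is why a likelihood is not itself a probability distribution over the parameter.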
In maximum likelihood estimation, the arg max over the parameter θ of the likelihood function serves as a point estimate θ̂ for θ, while the Fisher information (often approximated by the negative Hessian of the log-likelihood at θ̂) indicates the estimate's precision.
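A minimal sketch of the arg-max step, reusing the illustrative coin-flip data above (7 heads in 10 flips); the grid search stands in for a proper optimizer, and the closed-form answer k/n is a known property of the binomial model:

```python
import math

k, n = 7, 10  # illustrative data: 7 heads in 10 flips

def log_likelihood(p: float) -> float:
    """Binomial log-likelihood of p given the fixed data (constant term omitted)."""
    return k * math.log(p) + (n - k) * math.log(1 - p)

# Maximum likelihood estimate via grid search over the open interval (0, 1).
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=log_likelihood)

# For the binomial model the arg max has the closed form k / n = 0.7,
# and the grid search recovers it.
assert abs(p_hat - k / n) < 1e-12
```

Maximizing the log-likelihood rather than the likelihood itself is the usual practice: the logarithm is monotone, so the arg max is unchanged, while products of densities become numerically stable sums.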
In contrast, in Bayesian statistics, parameter estimates are derived from the converse of the likelihood, the posterior probability of the parameter given the observed data, calculated via Bayes' rule.[4]
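The Bayesian route can be sketched with a discrete grid approximation, again using the illustrative coin-flip data and a uniform prior (both assumptions for the example): the posterior is the likelihood times the prior, normalized by the evidence, exactly as in Bayes' rule.

```python
import math

def binomial_pmf(k: int, n: int, p: float) -> float:
    """Probability of k heads in n flips of a coin with heads-probability p."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

k, n = 7, 10                                 # illustrative data
grid = [i / 100 for i in range(1, 100)]      # grid over the parameter p
prior = [1 / len(grid)] * len(grid)          # uniform prior (an assumption)

# Posterior ∝ likelihood × prior; the evidence is the normalizing constant.
unnorm = [binomial_pmf(k, n, p) * w for p, w in zip(grid, prior)]
evidence = sum(unnorm)
posterior = [u / evidence for u in unnorm]

# The posterior is a proper probability distribution over the parameter.
assert abs(sum(posterior) - 1.0) < 1e-9
```

With a uniform prior the posterior mode coincides with the maximum likelihood estimate; a non-uniform prior would pull the estimate toward the prior's mass, which is exactly the sense in which Bayesian estimates differ from the arg max of the likelihood alone.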