Category: Bayesian statistics

Bayes' theorem
In probability theory and statistics, Bayes' theorem (alternatively Bayes' law or Bayes' rule), named after Thomas Bayes, describes the probability of an event, based on prior knowledge of conditions that might be related to the event.
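A minimal numeric sketch of the theorem, using illustrative disease-screening numbers (the prevalence and test accuracies below are assumptions, not real data):

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_disease = 0.01            # prior P(A), assumed prevalence
p_pos_given_disease = 0.95  # likelihood P(B|A), assumed sensitivity
p_pos_given_healthy = 0.05  # assumed false-positive rate

# Total probability of a positive test, P(B)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior P(A|B): probability of disease given a positive test
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)  # ~0.16, despite the 95% sensitivity
```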
Bayesian model reduction
Bayesian model reduction is a method for computing the evidence and posterior over the parameters of Bayesian models that differ in their priors. A full model is fitted to data using standard approaches; the evidence and posteriors of reduced models, which differ only in their priors, can then be derived from the posterior of the full model without refitting.
Ensemble Kalman filter
The ensemble Kalman filter (EnKF) is a recursive filter suitable for problems with a large number of variables, such as discretizations of partial differential equations in geophysical models. The EnKF originated as a version of the Kalman filter for large problems, in which the covariance matrix is replaced by the sample covariance of an ensemble of states.
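A minimal sketch of a single analysis step of the stochastic EnKF, assuming a toy 3-dimensional state, a linear observation operator H, and illustrative noise levels; the ensemble sample covariance stands in for the full covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 3, 100                      # state dimension, ensemble size
H = np.array([[1.0, 0.0, 0.0]])    # observe the first state component
R = np.array([[0.5]])              # observation error covariance

ensemble = rng.normal(0.0, 1.0, size=(n, N))  # forecast ensemble
y = np.array([1.2])                           # the observation

# Ensemble mean, anomalies, and sample covariance
xbar = ensemble.mean(axis=1, keepdims=True)
A = ensemble - xbar
P = A @ A.T / (N - 1)

# Kalman gain computed from the ensemble covariance
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)

# Update each member against a perturbed copy of the observation
perturbed = y[:, None] + rng.multivariate_normal(np.zeros(1), R, size=N).T
analysis = ensemble + K @ (perturbed - H @ ensemble)
```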
Conjugate prior
In Bayesian probability theory, if the posterior distribution is in the same probability distribution family as the prior probability distribution, the prior and posterior are then called conjugate distributions, and the prior is called a conjugate prior for the likelihood function.
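A short sketch of conjugacy for the Beta-Binomial pair (hyperparameters and counts below are illustrative): a Beta(a, b) prior combined with a binomial likelihood yields a Beta posterior, so updating reduces to adding counts.

```python
a, b = 2.0, 2.0   # assumed prior hyperparameters
k, n = 7, 10      # observed successes and trials

# Conjugate update: Beta(a, b) prior + k successes in n trials
# -> Beta(a + k, b + n - k) posterior
a_post, b_post = a + k, b + (n - k)
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, posterior_mean)  # 9.0 5.0 ~0.643
```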
G-prior
In statistics, the g-prior is an objective prior for the regression coefficients of a multiple regression. It was introduced by Arnold Zellner. It is a key tool in Bayes and empirical Bayes variable selection.
Chain rule (probability)
In probability theory, the chain rule (also called the general product rule) permits the calculation of any member of the joint distribution of a set of random variables using only conditional probabilities.
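In symbols, for random variables X_1, ..., X_n:

$$P(X_1,\dots,X_n) = \prod_{i=1}^{n} P(X_i \mid X_1,\dots,X_{i-1}),$$

so that, for example, $P(A,B,C) = P(A)\,P(B \mid A)\,P(C \mid A,B)$.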
Prior probability
In Bayesian statistical inference, a prior probability distribution, often simply called the prior, of an uncertain quantity is the probability distribution that would express one's beliefs about this quantity before some evidence is taken into account.
Neural network Gaussian process
Bayesian networks are a modeling tool for assigning probabilities to events, and thereby characterizing the uncertainty in a model's predictions. Deep learning and artificial neural networks are approaches used in machine learning to build computational models which learn from training examples. A neural network Gaussian process (NNGP) is a Gaussian process obtained as the limit of a Bayesian neural network as the width of its layers becomes infinite.
Bayesian structural time series
Bayesian structural time series (BSTS) model is a statistical technique used for feature selection, time series forecasting, nowcasting, inferring causal impact and other applications. The model is designed to work with time series data.
Probabilistic soft logic
Probabilistic Soft Logic (PSL) is a statistical relational learning (SRL) framework for modeling probabilistic and relational domains. It is applicable to a variety of machine learning problems, such as collective classification, entity resolution, and link prediction.
Bayesian regret
In stochastic game theory, Bayesian regret is the expected difference ("regret") between the utility of a Bayesian strategy and that of the optimal strategy (the one with the highest expected payoff).
Credence (statistics)
Credence is a statistical term that expresses how much a person believes that a proposition is true. As an example, a reasonable person will believe with 50% credence that a fair coin will land on heads the next time it is flipped.
Generalised likelihood uncertainty estimation
Generalized likelihood uncertainty estimation (GLUE) is a statistical method used in hydrology for quantifying the uncertainty of model predictions. The method was introduced by Keith Beven and Andrew Binley in 1992.
Bayesian efficiency
Bayesian efficiency is an analog of Pareto efficiency for situations in which there is incomplete information. Under Pareto efficiency, an allocation of a resource is Pareto efficient if there is no other allocation that makes at least one agent better off while making no other agent worse off.
A priori probability
An a priori probability is a probability that is derived purely by deductive reasoning. One way of deriving a priori probabilities is the principle of indifference, which has the character of saying that, if there are N mutually exclusive and exhaustive events and they are equally likely, then the probability of a given event occurring is 1/N.
Bayesian epistemology
Bayesian epistemology is a formal approach to various topics in epistemology that has its roots in Thomas Bayes' work in the field of probability theory. One advantage of its formal method, in contrast to traditional epistemology, is that its concepts and theorems can be defined with a high degree of precision.
Bayes classifier
In statistical classification, the Bayes classifier minimizes the probability of misclassification.
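Formally, the Bayes classifier assigns each observation to the class with the highest posterior probability:

$$C^{\mathrm{Bayes}}(x) = \underset{r}{\arg\max}\; P(Y = r \mid X = x).$$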
Cromwell's rule
Cromwell's rule, named by statistician Dennis Lindley, states that the use of prior probabilities of 1 ("the event will definitely occur") or 0 ("the event will definitely not occur") should be avoided, except when applied to statements that are logically true or false.
Base rate
In probability and statistics, the base rate (also known as prior probabilities) is the class of probabilities unconditional on "featural evidence" (likelihoods). For example, if 1% of the population were medical professionals and the remaining 99% were not, then the base rate of medical professionals would be 1%.
Naive Bayes classifier
In statistics, naive Bayes classifiers are a family of simple "probabilistic classifiers" based on applying Bayes' theorem with strong (naive) independence assumptions between the features (see Bayes classifier).
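A minimal Gaussian naive Bayes sketch on toy data; the helper names (log_gauss, predict) and the numbers are assumptions for illustration. Each class gets a prior plus independent per-feature Gaussian likelihoods:

```python
import numpy as np

X = np.array([[1.0, 2.1], [0.9, 1.9], [3.0, 3.9], [3.2, 4.1]])  # toy data
y = np.array([0, 0, 1, 1])

classes = np.unique(y)
priors = {c: np.mean(y == c) for c in classes}
means = {c: X[y == c].mean(axis=0) for c in classes}
stds = {c: X[y == c].std(axis=0) + 1e-9 for c in classes}

def log_gauss(x, mu, sd):
    # log density of a univariate Gaussian, applied feature-wise
    return -0.5 * np.log(2 * np.pi * sd**2) - (x - mu) ** 2 / (2 * sd**2)

def predict(x):
    # naive Bayes rule: argmax over classes of
    # log prior + sum of per-feature log likelihoods
    scores = {c: np.log(priors[c]) + log_gauss(x, means[c], stds[c]).sum()
              for c in classes}
    return max(scores, key=scores.get)

print(predict(np.array([1.0, 2.0])))  # -> 0
```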
Subjective logic
Subjective logic is a type of probabilistic logic that explicitly takes epistemic uncertainty and source trust into account. In general, subjective logic is suitable for modeling and analysing situations involving uncertainty and relatively unreliable sources.
Deviance information criterion
The deviance information criterion (DIC) is a hierarchical modeling generalization of the Akaike information criterion (AIC). It is particularly useful in Bayesian model selection problems where the posterior distributions of the models have been obtained by Markov chain Monte Carlo (MCMC) simulation.
Expectation propagation
Expectation propagation (EP) is a technique in Bayesian machine learning. EP finds approximations to a probability distribution. It uses an iterative approach that uses the factorization structure of the target distribution.
De Finetti's theorem
In probability theory, de Finetti's theorem states that exchangeable observations are conditionally independent relative to some latent variable. An epistemic probability distribution could then be assigned to this variable.
Spike-and-slab regression
Spike-and-slab regression is a type of Bayesian linear regression in which a particular hierarchical prior distribution for the regression coefficients is chosen such that only a subset of the possible regressors is retained.
Bayesian statistics
Bayesian statistics is a theory in the field of statistics based on the Bayesian interpretation of probability where probability expresses a degree of belief in an event. The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event.
Reference class problem
In statistics, the reference class problem is the problem of deciding what class to use when calculating the probability applicable to a particular case. For example, to estimate the probability of an aircraft crashing, we could refer to the frequency of crashes among various different sets of aircraft: all aircraft, this make of aircraft, or aircraft flown by this company in the last ten years.
Strong prior
In Bayesian statistics, a strong prior is a preceding assumption, theory, concept or idea upon which, after taking account of new information, a current assumption, theory, concept or idea is founded. In practice, a strong prior is one informative enough that moderate amounts of new data change the resulting posterior only slightly.
Posterior probability
The posterior probability is a type of conditional probability that results from updating the prior probability with information summarized by the likelihood, through an application of Bayes' theorem.
Robust Bayesian analysis
In statistics, robust Bayesian analysis, also called Bayesian sensitivity analysis, is a type of sensitivity analysis applied to the outcome from Bayesian inference or Bayesian optimal decisions.
WinBUGS
WinBUGS is statistical software for Bayesian analysis using Markov chain Monte Carlo (MCMC) methods. It is based on the BUGS (Bayesian inference Using Gibbs Sampling) project started in 1989. It runs under Microsoft Windows.
Bayesian program synthesis
In programming languages and machine learning, Bayesian program synthesis (BPS) is a program synthesis technique where Bayesian probabilistic programs automatically construct new Bayesian probabilistic programs.
Gaussian process emulator
In statistics, Gaussian process emulator is one name for a general type of statistical model that has been used in contexts where the problem is to make maximum use of the outputs of a complicated (often expensive to run) computer model.
Bayesian history matching
Bayesian history matching is a statistical method for calibrating complex computer models. The equations inside many scientific computer models contain parameters which have a true value, but that true value is typically not known precisely.
Almost sure hypothesis testing
In statistics, almost sure hypothesis testing or a.s. hypothesis testing utilizes almost sure convergence in order to determine the validity of a statistical hypothesis with probability one.
Lewandowski-Kurowicka-Joe distribution
In probability theory and Bayesian statistics, the Lewandowski-Kurowicka-Joe distribution, often referred to as the LKJ distribution, is a continuous probability distribution over symmetric positive definite matrices with unit diagonal, i.e. correlation matrices. It is commonly used as a prior for correlation matrices in Bayesian hierarchical models.
Bayesian search theory
Bayesian search theory is the application of Bayesian statistics to the search for lost objects. It has been used several times to find lost sea vessels, for example USS Scorpion, and has played a key role in the recovery of the flight recorders of Air France Flight 447.
Cochran–Mantel–Haenszel statistics
In statistics, the Cochran–Mantel–Haenszel test (CMH) is a test used in the analysis of stratified or matched categorical data. It allows an investigator to test the association between a binary predictor or treatment and a binary outcome while taking the stratification into account.
Markov logic network
A Markov logic network (MLN) is a probabilistic logic which applies the ideas of a Markov network to first-order logic, enabling uncertain inference. Markov logic networks generalize first-order logic, in the sense that, as the weights tend to infinity, the network recovers a classical first-order knowledge base.
Watanabe–Akaike information criterion
In statistics, the widely applicable information criterion (WAIC), also known as Watanabe–Akaike information criterion, is the generalized version of the Akaike information criterion (AIC) onto singular statistical models.
Coherence (statistics)
In probability theory and statistics, coherence can have several different meanings. Coherence in statistics is an indication of the quality of the information, either within a single data set, or between similar but not identical data sets.
Posterior predictive distribution
In Bayesian statistics, the posterior predictive distribution is the distribution of possible unobserved values conditional on the observed values. Given a set of N i.i.d. observations, the distribution of a new value is obtained by averaging the sampling distribution over the posterior distribution of the model parameters.
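A small closed-form sketch for the Beta-Binomial case (prior and counts are illustrative): with a Beta(a, b) prior and k successes in n trials, the posterior predictive probability that the next trial succeeds is the posterior mean.

```python
a, b = 1.0, 1.0   # assumed flat prior
k, n = 7, 10      # observed successes and trials

# Posterior is Beta(a + k, b + n - k); averaging the Bernoulli
# sampling distribution over it gives the predictive probability
p_next_success = (a + k) / (a + b + n)
print(p_next_success)  # 8/12 ~ 0.667
```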
Admissible decision rule
In statistical decision theory, an admissible decision rule is a rule for making a decision such that there is no other rule that is always "better" than it (or at least sometimes better and never worse).
Approximate Bayesian computation
Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics that can be used to estimate the posterior distributions of model parameters. In all model-based statistical inference the likelihood function is of central importance; ABC bypasses its evaluation, widening the realm of models for which inference is feasible.
Continuous Individualized Risk Index
Continuous Individualized Risk Index (CIRI) (initialism pronounced /ˈsɪri/) refers to a set of probabilistic risk models utilizing Bayesian statistics for integrating diverse cancer biomarkers over time in order to estimate an individual patient's risk.
Information field theory
Information field theory (IFT) is a Bayesian statistical field theory relating to signal reconstruction, cosmography, and other related areas. IFT summarizes the information available on a physical field using Bayesian probabilities.
Hyperprior
In Bayesian statistics, a hyperprior is a prior distribution on a hyperparameter, that is, on a parameter of a prior distribution. As with the term hyperparameter, the use of hyper is to distinguish it from a prior distribution of a parameter of the model for the underlying system.
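A small generative sketch of a hyperprior, assuming an illustrative hierarchy: a Gamma hyperprior on the rate of an exponential prior, which in turn governs the rate of a Poisson likelihood.

```python
import numpy as np

rng = np.random.default_rng(1)

lam = rng.gamma(shape=2.0, scale=1.0)      # hyperparameter ~ hyperprior
theta = rng.exponential(scale=1.0 / lam)   # parameter ~ prior(. | lam)
data = rng.poisson(lam=theta, size=10)     # data ~ likelihood(. | theta)
print(lam, theta, data)
```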
Bayesian programming
Bayesian programming is a formalism and a methodology for specifying probabilistic models and solving problems when less than the necessary information is available. Edwin T. Jaynes proposed that probability could be considered as an alternative and an extension of logic for rational reasoning with incomplete and uncertain information.
Bayes error rate
In statistical classification, Bayes error rate is the lowest possible error rate for any classifier of a random outcome (into, for example, one of two categories) and is analogous to the irreducible error.
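A numerical sketch for two equally likely Gaussian classes N(0, 1) and N(2, 1) (an assumed toy setup): the Bayes error is the integral of the pointwise minimum of the class-weighted densities.

```python
import numpy as np
from scipy.stats import norm

x = np.linspace(-10.0, 12.0, 200001)
dx = x[1] - x[0]
p0 = 0.5 * norm.pdf(x, loc=0.0, scale=1.0)  # class 0, prior 1/2
p1 = 0.5 * norm.pdf(x, loc=2.0, scale=1.0)  # class 1, prior 1/2

bayes_error = np.minimum(p0, p1).sum() * dx
print(bayes_error)  # ~0.1587, i.e. Phi(-1) for this symmetric setup
```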
Bayes linear statistics
Bayes linear statistics is a subjectivist statistical methodology and framework. Traditional subjective Bayesian analysis is based upon fully specified probability distributions, which are very difficult to specify in practice; Bayes linear statistics instead takes expectation as primitive and requires only the specification of means, variances and covariances.
Lindley's paradox
Lindley's paradox is a counterintuitive situation in statistics in which the Bayesian and frequentist approaches to a hypothesis testing problem give different results for certain choices of the prior distribution.
Calibrated probability assessment
Calibrated probability assessments are subjective probabilities assigned by individuals who have been trained to assess probabilities in a way that historically represents their uncertainty. For example, a calibrated assessor who says "80% confident" of each of a large number of predictions should be correct about 80% of the time.
Bayesian vector autoregression
In statistics and econometrics, Bayesian vector autoregression (BVAR) uses Bayesian methods to estimate a vector autoregression (VAR) model. BVAR differs from standard VAR models in that the model parameters are treated as random variables, with prior probabilities assigned to them.
Radical probabilism
Radical probabilism is a hypothesis in philosophy, in particular epistemology, and probability theory that holds that no facts are known for certain. That view has profound implications for statistical inference.
Jeffreys prior
In Bayesian probability, the Jeffreys prior, named after Sir Harold Jeffreys, is a non-informative (objective) prior distribution for a parameter space; its density function is proportional to the square root of the determinant of the Fisher information matrix.
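A symbolic sketch for the Bernoulli parameter p, assuming sympy is available: the Fisher information works out to 1/(p(1-p)), so the Jeffreys prior is proportional to p^(-1/2) (1-p)^(-1/2), a Beta(1/2, 1/2) density.

```python
import sympy as sp

p, x = sp.symbols('p x', positive=True)
loglik = x * sp.log(p) + (1 - x) * sp.log(1 - p)  # Bernoulli log-likelihood

# Fisher information I(p) = -E[d^2 loglik / dp^2]; since the second
# derivative is linear in x, substituting E[x] = p takes the expectation
d2 = sp.diff(loglik, p, 2)
fisher = sp.simplify(-d2.subs(x, p))
print(fisher)            # 1/(p*(1 - p))
print(sp.sqrt(fisher))   # Jeffreys prior, up to normalization
```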
Aumann's agreement theorem
Aumann's agreement theorem was stated and proved by Robert Aumann in a paper titled "Agreeing to Disagree", which introduced the set theoretic description of common knowledge. The theorem concerns agents who share a common prior and whose posterior probabilities for an event are common knowledge: it states that those posteriors must be equal, so the agents cannot "agree to disagree".
International Society for Bayesian Analysis
The International Society for Bayesian Analysis (ISBA) is a society with the goal of promoting Bayesian analysis for solving problems in the sciences and government. It was formally incorporated as a non-profit society in 1992.
Graph cuts in computer vision
As applied in the field of computer vision, graph cut optimization can be employed to efficiently solve a wide variety of low-level computer vision problems (early vision), such as image smoothing, the stereo correspondence problem, image segmentation, and other problems that can be formulated in terms of energy minimization.
Variational Bayesian methods
Variational Bayesian methods are a family of techniques for approximating intractable integrals arising in Bayesian inference and machine learning. They are typically used in complex statistical models consisting of observed variables as well as unknown parameters and latent variables.
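The central identity behind these methods decomposes the log evidence into an evidence lower bound (ELBO) plus a Kullback-Leibler divergence, so maximizing the ELBO over a tractable family q(z) minimizes the divergence from the true posterior:

$$\log p(x) = \mathbb{E}_{q(z)}\big[\log p(x, z) - \log q(z)\big] + \mathrm{KL}\big(q(z) \,\|\, p(z \mid x)\big).$$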
Indian buffet process
In the mathematical theory of probability, the Indian buffet process (IBP) is a stochastic process defining a probability distribution over sparse binary matrices with a finite number of rows and an infinite number of columns.
Quantum Bayesianism
In physics and the philosophy of physics, quantum Bayesianism is a collection of related approaches to the interpretation of quantum mechanics, of which the most prominent is QBism (pronounced "cubism").
Dependent Dirichlet process
In the mathematical theory of probability, the dependent Dirichlet process (DDP) provides a non-parametric prior over evolving mixture models. One construction of the DDP is built on a Poisson point process.
Probability of direction
In Bayesian statistics, the Probability of Direction (pd) is a measure of effect existence representing the certainty with which an effect is positive or negative. This index is numerically similar to the frequentist one-sided p-value.
Precision (statistics)
In statistics, the precision matrix or concentration matrix is the matrix inverse of the covariance matrix or dispersion matrix. For univariate distributions, the precision matrix degenerates into a scalar precision, defined as the reciprocal of the variance.
Subjectivism
Subjectivism is the doctrine that "our own mental activity is the only unquestionable fact of our experience", instead of shared or communal, and that there is no external or objective truth. The success of this position is historically attributed to Descartes and his methodic doubt.
Graphical model
A graphical model or probabilistic graphical model (PGM) or structured probabilistic model is a probabilistic model for which a graph expresses the conditional dependence structure between random variables.
Speed prior
The speed prior is a complexity measure similar to Kolmogorov complexity, except that it is based on computation speed as well as program length: the speed prior of a program reflects both its size in bits and the time it needs to compute its output.
Bayesian interpretation of kernel regularization
Within Bayesian statistics for machine learning, kernel methods arise from the assumption of an inner product space or similarity structure on inputs. For some such methods, such as support vector machines, the original formulation and its regularization were not Bayesian in nature, but they can be understood from a Bayesian perspective.
Solomonoff's theory of inductive inference
Solomonoff's theory of inductive inference is a mathematical proof that if a universe is generated by an algorithm, then observations of that universe, encoded as a dataset, are best predicted by the smallest executable archive of that dataset.
Bayesian survival analysis
Survival analysis is normally carried out using parametric, semi-parametric, or non-parametric models to estimate the survival rate in clinical research. Recently, however, Bayesian models have also been used to estimate the survival rate.
Evidence under Bayes' theorem
The use of evidence under Bayes' theorem relates to the probability of finding evidence in relation to the accused, where Bayes' theorem concerns the probability of an event and its inverse.
Prosecutor's fallacy
The prosecutor's fallacy is a fallacy of statistical reasoning involving a test for an occurrence, such as a DNA match. A positive result in the test may paradoxically be more likely to be an erroneous result than the actual occurrence, even if the test is very accurate.
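A toy calculation of the fallacy (all numbers assumed): even with a one-in-a-million random-match probability, the posterior probability of guilt given a match can be far from certainty once the pool of alternative suspects is taken into account.

```python
p_match_given_innocent = 1e-6   # assumed random-match probability
n_population = 1_000_000        # assumed pool of possible sources

p_guilty = 1 / n_population     # flat prior over the pool

# Bayes: a guilty person matches with probability 1
p_match = 1.0 * p_guilty + p_match_given_innocent * (1 - p_guilty)
p_guilty_given_match = 1.0 * p_guilty / p_match
print(p_guilty_given_match)  # ~0.5, not 0.999999
```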
Principle of maximum caliber
The principle of maximum caliber (MaxCal) or maximum path entropy principle, suggested by E. T. Jaynes, can be considered as a generalization of the principle of maximum entropy. It postulates that the most unbiased probability distribution of paths is the one that maximizes their Shannon entropy.
Bayesian approaches to brain function
Bayesian approaches to brain function investigate the capacity of the nervous system to operate in situations of uncertainty in a fashion that is close to the optimal prescribed by Bayesian statistics.
Bayesian econometrics
Bayesian econometrics is a branch of econometrics which applies Bayesian principles to economic modelling. Bayesianism is based on a degree-of-belief interpretation of probability, as opposed to a relative-frequency interpretation.
Cross-species transmission
Cross-species transmission (CST), also called interspecies transmission, host jump, or spillover, is the transmission of an infectious pathogen, such as a virus, between hosts belonging to different species.
Principle of maximum entropy
The principle of maximum entropy states that the probability distribution which best represents the current state of knowledge about a system is the one with largest entropy, in the context of precisely stated prior data (such as a proposition that expresses testable information).
Likelihood function
The likelihood function (often simply called the likelihood) is the joint probability of the observed data viewed as a function of the parameters of the chosen statistical model. To emphasize that the likelihood is a function of the parameters while the data are held fixed, it is often written L(θ | x).
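A minimal sketch with assumed Bernoulli data, 7 successes in 10 trials: the expression p^k (1-p)^(n-k), read as a function of p with the data held fixed, is the likelihood, and its maximizer is the maximum-likelihood estimate.

```python
import numpy as np

k, n = 7, 10
p = np.linspace(0.001, 0.999, 999)                 # grid over the parameter
log_lik = k * np.log(p) + (n - k) * np.log(1 - p)  # log-likelihood
print(p[np.argmax(log_lik)])                       # ~0.7 = k/n
```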
Bayesian probability
Bayesian probability is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation representing a state of knowledge or as quantification of a personal belief.
German tank problem
In the statistical theory of estimation, the German tank problem consists of estimating the maximum of a discrete uniform distribution from sampling without replacement. In simple terms, suppose there exists an unknown number of items sequentially numbered from 1 to N; a random sample of them is observed, and the task is to estimate N from the observed serial numbers.
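A sketch of the classical (frequentist) minimum-variance unbiased estimator for this problem, with assumed serial numbers; the Bayesian treatment instead places a prior on N and computes a posterior.

```python
# MVUE: N_hat = m * (1 + 1/k) - 1, where m is the largest observed
# serial number and k is the sample size
serials = [19, 40, 42, 60]      # assumed observations
m, k = max(serials), len(serials)
n_hat = m * (1 + 1 / k) - 1
print(n_hat)  # 74.0
```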
Hyperparameter
In Bayesian statistics, a hyperparameter is a parameter of a prior distribution; the term is used to distinguish them from parameters of the model for the underlying system under analysis. For example, if a beta distribution is used as a prior for the parameter p of a Bernoulli distribution, then p is a parameter of the underlying system and α and β are hyperparameters.
Marginal likelihood
A marginal likelihood is a likelihood function that has been integrated over the parameter space. In Bayesian statistics, it represents the probability of generating the observed sample from a prior and is therefore often referred to as model evidence or simply evidence.
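A closed-form sketch for the Beta-Binomial model (prior and counts assumed): integrating the binomial likelihood against a Beta(a, b) prior gives the evidence C(n, k) B(k + a, n - k + b) / B(a, b).

```python
from math import comb, exp, lgamma

def betaln(a, b):
    # log of the Beta function via log-gamma
    return lgamma(a) + lgamma(b) - lgamma(a + b)

a, b = 1.0, 1.0   # assumed flat prior
k, n = 7, 10
evidence = comb(n, k) * exp(betaln(k + a, n - k + b) - betaln(a, b))
print(evidence)   # 1/11: under a flat prior every k is equally likely
```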
Sparse binary polynomial hashing
Sparse binary polynomial hashing (SBPH) is a generalization of Bayesian spam filtering that can match mutating phrases as well as single words. SBPH is a way of generating a large number of features from an incoming text, which can then be used by a statistical classifier.
Bayesian game
In game theory, a Bayesian game is a game that models the outcome of player interactions using aspects of Bayesian probability. Bayesian games are notable because they allowed, for the first time in game theory, the analysis of games with incomplete information.
Bayesian experimental design
Bayesian experimental design provides a general probability-theoretical framework from which other theories on experimental design can be derived. It is based on Bayesian inference to interpret the observations/data acquired during the experiment.
Inverse probability
In probability theory, inverse probability is an obsolete term for the probability distribution of an unobserved variable. Today, the problem of determining an unobserved variable (by whatever method) is called inferential statistics.
Kernel (statistics)
The term kernel is used in statistical analysis to refer to a window function. The term "kernel" has several distinct meanings in different branches of statistics.
Extrapolation domain analysis
Extrapolation domain analysis (EDA) is a methodology for identifying geographical areas that seem suitable for adoption of innovative ecosystem management practices on the basis of sites exhibiting similar characteristics.
Sunrise problem
The sunrise problem can be expressed as follows: "What is the probability that the sun will rise tomorrow?" The sunrise problem illustrates the difficulty of using probability theory when evaluating the plausibility of statements or beliefs.
Odds ratio
An odds ratio (OR) is a statistic that quantifies the strength of the association between two events, A and B. The odds ratio is defined as the ratio of the odds of A in the presence of B and the odds of A in the absence of B.
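A tiny worked example from an assumed 2x2 contingency table:

```python
#             outcome   no outcome
# exposed        a=20        b=80
# unexposed      c=10        d=90
a_, b_, c_, d_ = 20, 80, 10, 90

odds_exposed = a_ / b_          # odds of the outcome given exposure
odds_unexposed = c_ / d_        # odds of the outcome without exposure
odds_ratio = odds_exposed / odds_unexposed   # equals (a*d) / (b*c)
print(odds_ratio)  # 2.25
```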
Data assimilation
Data assimilation is a mathematical discipline that seeks to optimally combine theory (usually in the form of a numerical model) with observations. There may be a number of different goals sought, for example, to determine the optimal state estimate of a system, or to determine initial conditions for a numerical forecast model.
Nested sampling algorithm
The nested sampling algorithm is a computational approach to the Bayesian statistics problems of comparing models and generating samples from posterior distributions. It was developed in 2004 by physicist John Skilling.
Abductive reasoning
Abductive reasoning (also called abduction, abductive inference, or retroduction) is a form of logical inference formulated and advanced by American philosopher Charles Sanders Peirce beginning in the latter half of the 19th century.
Variational autoencoder
In machine learning, a variational autoencoder (VAE) is an artificial neural network architecture introduced by Diederik P. Kingma and Max Welling, belonging to the families of probabilistic graphical models and variational Bayesian methods.