Category: Statistical principles

Sparsity-of-effects principle
In the statistical analysis of the results from factorial experiments, the sparsity-of-effects principle states that a system is usually dominated by main effects and low-order interactions. Thus it is most likely that main (single-factor) effects and two-factor interactions are the most significant responses in a factorial experiment, while higher-order interactions can usually be neglected.
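As an illustration (a minimal Python sketch, not from the source, with made-up coefficients), the snippet below builds a 2^3 full factorial whose response depends only on two main effects and one two-factor interaction; the estimated three-factor effect then comes out near zero, which is the pattern the principle predicts.

import itertools
import numpy as np

rng = np.random.default_rng(0)
runs = np.array(list(itertools.product([-1, 1], repeat=3)))  # 8 runs, factors A, B, C
A, B, C = runs.T
y = 5 + 3 * A + 2 * B + 1.5 * A * B + rng.normal(0, 0.2, size=8)  # C is inert

def effect(col):
    # Effect estimate for a +/-1 contrast column: mean at +1 minus mean at -1.
    return y[col == 1].mean() - y[col == -1].mean()

for name, col in [("A", A), ("B", B), ("C", C),
                  ("AB", A * B), ("AC", A * C), ("BC", B * C), ("ABC", A * B * C)]:
    print(name, round(effect(col), 2))
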
Pareto principle
The Pareto principle states that for many outcomes, roughly 80% of consequences come from 20% of causes (the "vital few"). Other names for this principle are the 80/20 rule, the law of the vital few, and the principle of factor sparsity.
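As a sketch (not from the source), heavy-tailed toy data from a classical Pareto distribution with shape parameter about 1.16, the value for which the 80/20 split holds exactly, can be used to check what share of "causes" accounts for 80% of the total; with a finite sample the answer fluctuates around 20%.

import numpy as np

rng = np.random.default_rng(1)
# Generator.pareto draws Lomax samples; adding 1 gives the classical Pareto I with x_m = 1.
outcomes = rng.pareto(1.16, size=10_000) + 1
order = np.sort(outcomes)[::-1]                  # largest causes first
cum_share = np.cumsum(order) / order.sum()
k = int(np.searchsorted(cum_share, 0.80)) + 1    # causes needed to cover 80% of the total
print(f"{k / len(order):.0%} of causes account for 80% of the outcome")
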
Principle of indifference
The principle of indifference (also called principle of insufficient reason) is a rule for assigning epistemic probabilities. The principle of indifference states that in the absence of any relevant evidence, agents should distribute their credence (or "degrees of belief") equally among all the possible outcomes under consideration.
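As a trivial Python sketch (not from the source): applied to a die about which nothing else is known, the principle simply assigns each face the same credence.

faces = [1, 2, 3, 4, 5, 6]
prior = {face: 1 / len(faces) for face in faces}   # 1/6 each, absent any other evidence
print(prior)
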
Cromwell's rule
Cromwell's rule, named by statistician Dennis Lindley, states that the use of prior probabilities of 1 ("the event will definitely occur") or 0 ("the event will definitely not occur") should be avoided, except when applied to statements that are logically true or false.
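A minimal sketch (not from the source, with made-up likelihoods) of why the rule matters: under Bayes' theorem a prior of exactly 0 stays at 0 no matter how much supporting evidence arrives, whereas even a tiny positive prior can be driven toward 1.

def bayes_update(prior, lik_if_true, lik_if_false):
    # Posterior probability of the hypothesis after one piece of evidence.
    num = prior * lik_if_true
    return num / (num + (1 - prior) * lik_if_false)

for prior in (0.0, 1e-6):
    p = prior
    for _ in range(10):            # ten strong, independent pieces of supporting evidence
        p = bayes_update(p, 0.99, 0.01)
    print(f"prior={prior}: posterior after 10 updates = {p:.6f}")
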
Conditionality principle
The conditionality principle is a Fisherian principle of statistical inference that Allan Birnbaum formally defined and studied in his 1962 JASA article. Informally, the conditionality principle can be taken as the claim that experiments which were not actually performed are statistically irrelevant. In Cox's classic illustration, where one of two measuring instruments is selected by a coin flip, the principle says that inference about the parameter should be made conditional on the instrument actually used.
1% rule
In Internet culture, the 1% rule is a general rule of thumb pertaining to participation in an internet community, stating that only 1% of the users of a website actively create new content, while the other 99% of the participants only lurk.
Coherence (statistics)
In probability theory and statistics, coherence can have several different meanings. Coherence in statistics is an indication of the quality of the information, either within a single data set, or between similar but not identical data sets. In subjective (Bayesian) probability, coherence instead refers to probability assessments that cannot be turned into a Dutch book, that is, a combination of bets that guarantees the assessor a sure loss.
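A minimal sketch (not from the source) of the subjective-probability sense: probabilities announced for an exhaustive set of mutually exclusive events are coherent only if each lies in [0, 1] and they sum to 1; otherwise a Dutch book can be constructed against them.

def is_coherent(probs, tol=1e-9):
    # Coherence check for probabilities assigned to a partition of the sample space.
    return all(0.0 <= p <= 1.0 for p in probs) and abs(sum(probs) - 1.0) < tol

print(is_coherent([0.5, 0.3, 0.2]))   # True
print(is_coherent([0.6, 0.5, 0.2]))   # False: these odds admit a sure-loss combination of bets
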
Principle of maximum entropy
The principle of maximum entropy states that the probability distribution which best represents the current state of knowledge about a system is the one with largest entropy, in the context of precisely stated prior data (such as a proposition that expresses testable information).
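As an illustration (a minimal sketch, not from the source), the classic "Brandeis dice" setup: among all distributions on the faces 1..6 with a prescribed mean of 4.5, the maximum-entropy one has the exponential form p_i proportional to exp(lam * i), and a simple bisection on lam recovers it.

import numpy as np

x = np.arange(1, 7)
target_mean = 4.5

def mean_for(lam):
    # Mean of the exponentially tilted distribution p_i proportional to exp(lam * i).
    w = np.exp(lam * x)
    p = w / w.sum()
    return (p * x).sum()

lo, hi = -10.0, 10.0
for _ in range(100):                             # bisection: mean_for is increasing in lam
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if mean_for(mid) < target_mean else (lo, mid)

lam = (lo + hi) / 2
p = np.exp(lam * x) / np.exp(lam * x).sum()
print(np.round(p, 4), "entropy =", round(float(-(p * np.log(p)).sum()), 4))
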
Sufficient statistic
In statistics, a statistic is sufficient with respect to a statistical model and its associated unknown parameter if "no other statistic that can be calculated from the same sample provides any additional information as to the value of the parameter".
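As a sketch (not from the source), for i.i.d. Bernoulli(p) observations the sample sum is sufficient for p: conditional on the sum, the arrangement of 0s and 1s has the same (uniform) distribution whatever p is, so the arrangement carries no further information about p. A quick simulation makes this visible.

from collections import Counter
import numpy as np

rng = np.random.default_rng(2)
for p in (0.3, 0.7):
    samples = rng.binomial(1, p, size=(100_000, 3))
    given_sum_2 = samples[samples.sum(axis=1) == 2]      # condition on the sufficient statistic
    counts = Counter(map(tuple, given_sum_2))
    total = sum(counts.values())
    print(p, {k: round(v / total, 3) for k, v in sorted(counts.items())})
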
Likelihood principle
In statistics, the likelihood principle is the proposition that, given a statistical model, all the evidence in a sample relevant to model parameters is contained in the likelihood function. A likelihood function arises from a probability density (or mass) function when it is regarded as a function of the parameters rather than of the observed data.
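The standard illustration (a minimal sketch, not from the source): observing 3 successes in 12 Bernoulli trials under a fixed-sample binomial design or under a "sample until 3 successes" negative-binomial design yields likelihood functions that are proportional in p, so by the likelihood principle the two designs support identical inferences about p.

from math import comb
import numpy as np

p = np.linspace(0.05, 0.95, 5)
binom_lik = comb(12, 3) * p**3 * (1 - p)**9     # binomial: 3 successes in n = 12 fixed trials
negbin_lik = comb(11, 2) * p**3 * (1 - p)**9    # negative binomial: 12th trial is the 3rd success
print(np.round(binom_lik / negbin_lik, 6))      # constant ratio (= 4), independent of p
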
Principle of marginality
In statistics, the principle of marginality is the fact that the average (or main) effects of variables in an analysis are marginal to their interaction effect—that is, the main effect of one explanatory variable is not meaningful in isolation when the model also involves its interaction with another variable, so a model that contains an interaction term should also contain the corresponding main effects.
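As a sketch (not from the source, and assuming the patsy formula library is available), the usual model-formula convention respects marginality: the product term x1*x2 expands to both main effects plus the interaction, so the interaction never enters a model without them.

import pandas as pd
import patsy

df = pd.DataFrame({"x1": [0.0, 1.0, 2.0, 3.0], "x2": [1.0, 0.0, 1.0, 0.0]})
dm = patsy.dmatrix("x1 * x2", df)      # expands to x1 + x2 + x1:x2
print(dm.design_info.column_names)     # ['Intercept', 'x1', 'x2', 'x1:x2']
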
Principle of maximum caliber
The principle of maximum caliber (MaxCal) or maximum path entropy principle, suggested by E. T. Jaynes, can be considered as a generalization of the principle of maximum entropy. It postulates that the most unbiased probability distribution of paths is the one that maximizes their Shannon entropy.
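In symbols (a sketch in assumed notation, not from the source), for a discrete set of paths Gamma the MaxCal prescription maximizes the path entropy subject to normalization and to constraints on path-averaged observables A_k:

\max_{\{P_\Gamma\}} \; -\sum_{\Gamma} P_\Gamma \ln P_\Gamma
\quad \text{subject to} \quad
\sum_{\Gamma} P_\Gamma = 1, \qquad
\sum_{\Gamma} P_\Gamma \, A_k(\Gamma) = \langle A_k \rangle .
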
Orthogonality principle
In statistics and signal processing, the orthogonality principle is a necessary and sufficient condition for the optimality of a Bayesian estimator. Loosely stated, the orthogonality principle says that the error vector of the optimal estimator (in a mean-square-error sense) is orthogonal to any possible estimator.
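A minimal numerical sketch (not from the source, with a made-up scalar signal model): the linear minimum-mean-square-error coefficient makes the estimation error orthogonal to the data, so the sample average of error times observation comes out essentially zero.

import numpy as np

rng = np.random.default_rng(3)
n = 200_000
x = rng.normal(size=n)                        # zero-mean signal
y = 2.0 * x + rng.normal(scale=0.5, size=n)   # noisy observation of the signal

w = np.mean(x * y) / np.mean(y * y)           # linear MMSE coefficient for the zero-mean case
error = w * y - x
print(round(float(np.mean(error * y)), 6))    # ~ 0: the error is orthogonal to the data
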
Craps principle
In probability theory, the craps principle is a theorem about event probabilities under repeated iid trials. Let E1 and E2 denote two mutually exclusive events which might occur on a given trial. Then the probability that E1 occurs before E2 equals the conditional probability that E1 occurs given that E1 or E2 occurs on a trial; in symbols, P(E1 before E2) = P(E1) / (P(E1) + P(E2)).
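A quick Monte Carlo sketch (not from the source, with made-up per-trial probabilities P(E1) = 0.2 and P(E2) = 0.3): the fraction of simulated games in which E1 happens first should approach 0.2 / (0.2 + 0.3) = 0.4.

import random

random.seed(0)
p1, p2, wins, n = 0.2, 0.3, 0, 100_000
for _ in range(n):
    while True:
        u = random.random()
        if u < p1:                 # E1 occurs on this trial
            wins += 1
            break
        if u < p1 + p2:            # E2 occurs on this trial, so E1 did not happen first
            break
print(wins / n, "vs", p1 / (p1 + p2))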