Category: Measures of complexity

Effective complexity
Effective complexity is a measure of complexity defined in a 1996 paper by Murray Gell-Mann and Seth Lloyd that attempts to measure the amount of non-random information in a system. It has been criticized as depending on a subjective judgement of which regularities of the system should be considered meaningful.
Forecasting complexity
Forecasting complexity is a measure of complexity originally put forward, under a different name, by the physicist Peter Grassberger. It was later renamed "statistical complexity" by James P. Crutchfield and Karl Young.
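In Crutchfield and Young's computational-mechanics formulation, statistical complexity is usually presented as the Shannon entropy of the distribution over a process's causal states; the notation below is an illustrative sketch rather than a quotation of the original papers:

    C_\mu \;=\; H[\mathcal{S}] \;=\; -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma)\,\log_2 \Pr(\sigma)

where \mathcal{S} is the set of causal states, obtained by grouping past histories that predict the same distribution over futures.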
Natarajan dimension
In the theory of probably approximately correct (PAC) machine learning, the Natarajan dimension characterizes the complexity of learning a set of functions, generalizing the Vapnik–Chervonenkis dimension from Boolean-valued functions to multi-class classifiers.
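As a hedged sketch of the standard definition: a set C of inputs is N-shattered by a multi-class hypothesis class H if there are two labelings f_0, f_1 that disagree on every point of C and every way of mixing them is realized by some hypothesis,

    \exists\, f_0, f_1 : C \to Y \ \text{with}\ f_0(x) \neq f_1(x)\ \forall x \in C, \quad \text{s.t.}\ \forall B \subseteq C\ \exists h \in H :\ h|_B = f_0|_B \ \text{and}\ h|_{C \setminus B} = f_1|_{C \setminus B},

and the Natarajan dimension of H is the largest cardinality of an N-shattered set; with two labels this reduces to the VC dimension.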
Information fluctuation complexity
Information fluctuation complexity is an information-theoretic quantity defined as the fluctuation of information about entropy. It is derivable from fluctuations in the predominance of order and chaos in a dynamic system.
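A minimal sketch of the quantity for a discrete distribution, assuming the usual formulation as the standard deviation of the state information content -log2(p_i) about its mean, the Shannon entropy:

    import math

    def fluctuation_complexity(probs):
        """Standard deviation of the information content -log2(p_i)
        about the Shannon entropy of the distribution."""
        info = [-math.log2(p) for p in probs]
        entropy = sum(p * g for p, g in zip(probs, info))
        variance = sum(p * (g - entropy) ** 2 for p, g in zip(probs, info))
        return math.sqrt(variance)

    print(fluctuation_complexity([0.25] * 4))                 # uniform: 0.0
    print(fluctuation_complexity([0.5, 0.25, 0.125, 0.125]))  # nonuniform: > 0

A uniform distribution is maximally disordered yet has zero fluctuation, which is what lets the quantity separate complexity from plain entropy.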
Sophistication (complexity theory)
In algorithmic information theory, sophistication is a measure of complexity related to algorithmic entropy. When K is the Kolmogorov complexity and c is a constant, the sophistication of x can be defined as the complexity of the simplest model of x whose two-part description of x has total length within c bits of the minimal description length K(x).
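One common set-based (two-part-code) formulation, along the lines of work by Koppel and by Vitányi and given here only as a sketch, is

    \operatorname{soph}_c(x) \;=\; \min\{\, K(S) \;:\; x \in S,\ \ K(S) + \log_2 |S| \,\le\, K(x) + c \,\},

i.e. the complexity of the simplest finite set (model) containing x whose two-part description of x is within c bits of optimal.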
Kolmogorov complexity
In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity of an object, such as a piece of text, is the length of a shortest computer program (in a predetermined programming language) that produces the object as output.
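Kolmogorov complexity is uncomputable, but compressed length gives a crude, computable upper bound that illustrates the idea; the sketch below uses zlib purely as such a proxy:

    import random
    import zlib

    def compressed_length(s: str) -> int:
        """Bytes used by zlib at maximum compression: an upper-bound
        proxy for the (uncomputable) Kolmogorov complexity of s."""
        return len(zlib.compress(s.encode("utf-8"), 9))

    regular = "ab" * 500  # short description: "repeat 'ab' 500 times"
    random.seed(0)
    noisy = "".join(random.choice("ab") for _ in range(1000))  # no obvious pattern

    print(compressed_length(regular), compressed_length(noisy))

The highly regular string compresses to far fewer bytes than the pseudorandom string of the same length, mirroring the gap in their descriptive complexity.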
Diameter (group theory)
In the area of abstract algebra known as group theory, the diameter of a finite group is a measure of its complexity. Consider a finite group G and any set of generators S. Define D_S(G) to be the graph diameter of the Cayley graph of G with respect to S. The diameter of G is then the largest value of D_S(G) taken over all generating sets S.
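A minimal sketch for a concrete case, computing D_S(G) by breadth-first search on the Cayley graph of the symmetric group S_3; the group, the generating set, and the permutation-tuple representation are illustrative choices:

    from collections import deque

    def inverse(p):
        inv = [0] * len(p)
        for i, pi in enumerate(p):
            inv[pi] = i
        return tuple(inv)

    def compose(p, q):
        # (p o q)(i) = p[q[i]], with permutations written as tuples
        return tuple(p[q[i]] for i in range(len(q)))

    def cayley_diameter(generators, n):
        """Graph diameter of the Cayley graph of a permutation group on n
        points with respect to the given generators. Cayley graphs are
        vertex-transitive, so this equals the eccentricity of the identity."""
        identity = tuple(range(n))
        gens = set(generators) | {inverse(g) for g in generators}  # undirected edges
        dist = {identity: 0}
        queue = deque([identity])
        while queue:
            g = queue.popleft()
            for s in gens:
                h = compose(s, g)
                if h not in dist:
                    dist[h] = dist[g] + 1
                    queue.append(h)
        return max(dist.values())

    # S_3 generated by a transposition and a 3-cycle; prints 2
    print(cayley_diameter([(1, 0, 2), (1, 2, 0)], 3))

The diameter of the group itself is then the maximum of D_S(G) over all generating sets S, which for small groups can be found by enumerating generating sets.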
Logical depth
Logical depth is a measure of complexity for individual strings devised by Charles H. Bennett based on the computational complexity of an algorithm that can recreate a given piece of information. It differs from Kolmogorov complexity in that it considers the computation time of a near-minimal program for the string rather than that program's length.
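A frequently cited form of Bennett's definition, given here only as a hedged sketch (U a universal machine, T(p) the running time of program p, K the Kolmogorov complexity, and s a significance parameter in bits), is

    \operatorname{depth}_s(x) \;=\; \min\{\, T(p) \;:\; U(p) = x,\ \ |p| \le K(x) + s \,\},

so a string is deep when even its near-shortest descriptions take a long time to print it.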
Self-dissimilarity
Self-dissimilarity is a measure of complexity defined in a series of papers by David Wolpert and William Macready. The degrees of self-dissimilarity between the patterns of a system observed at various scales (e.g. the average matter density of a physical body measured over volumes of different sizes) constitute a complexity "signature" of that system.
Complexity measure
No description available.
Vapnik–Chervonenkis dimension
In Vapnik–Chervonenkis theory, the Vapnik–Chervonenkis (VC) dimension is a measure of the capacity (complexity, expressive power, richness, or flexibility) of a set of functions that can be learned by a statistical binary classification algorithm. It is defined as the cardinality of the largest set of points that the algorithm can shatter.
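A minimal sketch that estimates the VC dimension of a toy hypothesis class (one-sided thresholds on the real line) by brute-force shattering checks; the class, domain, and helper names are illustrative:

    from itertools import combinations

    def shatters(points, hypotheses):
        """True if the class realizes every {0,1}-labelling of `points`."""
        labellings = {tuple(h(x) for x in points) for h in hypotheses}
        return len(labellings) == 2 ** len(points)

    def vc_dimension(domain, hypotheses):
        """Largest k such that some k-subset of `domain` is shattered."""
        d = 0
        for k in range(1, len(domain) + 1):
            if any(shatters(c, hypotheses) for c in combinations(domain, k)):
                d = k
            else:
                break
        return d

    # thresholds h_t(x) = 1 if x >= t: any single point is shattered,
    # but no pair can be labelled (1, 0), so the VC dimension is 1
    thresholds = [lambda x, t=t: int(x >= t) for t in (-0.5, 0.5, 1.5, 2.5, 3.5, 4.5)]
    print(vc_dimension([0, 1, 2, 3, 4], thresholds))  # prints 1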
Rademacher complexity
In computational learning theory (machine learning and theory of computation), Rademacher complexity, named after Hans Rademacher, measures the richness of a class of real-valued functions with respect to a probability distribution.
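A minimal Monte Carlo sketch of the empirical Rademacher complexity of a finite function class on a fixed sample; the function class, sample, and trial count are illustrative:

    import random

    def empirical_rademacher(functions, sample, n_trials=2000, seed=0):
        """Monte Carlo estimate of E_sigma[ sup_f (1/m) sum_i sigma_i f(x_i) ],
        where the sigma_i are independent uniform +/-1 (Rademacher) variables."""
        rng = random.Random(seed)
        m = len(sample)
        values = [[f(x) for x in sample] for f in functions]  # precompute f(x_i)
        total = 0.0
        for _ in range(n_trials):
            sigma = [rng.choice((-1, 1)) for _ in range(m)]
            total += max(sum(s * v for s, v in zip(sigma, vals)) / m
                         for vals in values)
        return total / n_trials

    sample = [i / 10 for i in range(10)]
    thresholds = [lambda x, t=t: 1.0 if x >= t else 0.0
                  for t in (0.0, 0.25, 0.5, 0.75, 1.0)]
    print(empirical_rademacher(thresholds, sample))

Richer classes can match more of the random sign patterns and therefore score higher, which is what makes the quantity useful in generalization bounds.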
Complexity class
In computational complexity theory, a complexity class is a set of computational problems of related resource-based complexity. The two most commonly analyzed resources are time and memory. In general, a complexity class is defined in terms of a type of computational problem, a model of computation, and a bounded resource such as time or memory.
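For example, the class P fits this template, with deterministic Turing machines as the model and polynomially bounded running time as the resource:

    \mathsf{P} \;=\; \bigcup_{k \ge 1} \mathsf{DTIME}\!\left(n^{k}\right),

the decision problems solvable by a deterministic Turing machine in time bounded by some polynomial in the input size n.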
Growth function
The growth function, also called the shatter coefficient or the shattering number, measures the richness of a set family. It is especially used in the context of statistical learning theory, where it measures the complexity of a hypothesis class.
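A minimal brute-force sketch that computes the growth function of a toy hypothesis class (one-sided thresholds on a finite domain) by counting distinct labellings; the names and the chosen class are illustrative:

    from itertools import combinations

    def growth_function(m, domain, hypotheses):
        """Shatter coefficient Pi_H(m): the maximum number of distinct
        labellings the class induces on any m points of the domain."""
        best = 0
        for points in combinations(domain, m):
            labellings = {tuple(h(x) for x in points) for h in hypotheses}
            best = max(best, len(labellings))
        return best

    # one-sided thresholds induce m + 1 labellings on m points
    thresholds = [lambda x, t=t: int(x >= t) for t in (-0.5, 0.5, 1.5, 2.5, 3.5, 4.5)]
    print(growth_function(3, [0, 1, 2, 3, 4], thresholds))  # prints 4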