metric entropy

math. метрическая энтропия

Большой англо-русский и русско-английский словарь (Comprehensive English–Russian and Russian–English Dictionary), 2001.
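In mathematics, metric entropy usually refers to the Kolmogorov–Sinai entropy of a measure-preserving dynamical system. A sketch of the standard definition, assuming a measure-preserving transformation T of a probability space (X, B, μ) and writing H_μ for the entropy of a finite measurable partition:

    H_\mu(\eta) = -\sum_{A \in \eta} \mu(A) \log \mu(A),
    \qquad
    h_\mu(T) = \sup_{\xi} \lim_{n \to \infty} \frac{1}{n}\, H_\mu\!\left( \bigvee_{i=0}^{n-1} T^{-i}\xi \right).

The supremum runs over all finite measurable partitions ξ of X, and the limit exists by subadditivity of H_μ. (In approximation theory the same term is also used for Kolmogorov's ε-entropy, the logarithm of the minimal number of ε-balls needed to cover a set.)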


See what "metric entropy" is in other dictionaries:

  • Entropy (disambiguation) — Additional relevant articles may be found in the following categories: Thermodynamic entropy Entropy and information Quantum mechanical entropy Entropy, in thermodynamics, is a measure of the energy in a thermodynamic system not available to do… …   Wikipedia

  • Topological entropy — In mathematics, the topological entropy of a topological dynamical system is a nonnegative real number that measures the complexity of the system. Topological entropy was first introduced in 1965 by Adler, Konheim and McAndrew. Their definition… …   Wikipedia

  • Quantum relative entropy — In quantum information theory, quantum relative entropy is a measure of distinguishability between two quantum states. It is the quantum mechanical analog of relative entropy. Motivation For simplicity, it will be assumed that all objects in the… …   Wikipedia

  • Volume entropy — Among the various notions of entropy found in dynamical systems, differential geometry, and geometric group theory, an important role is played by the volume entropy. Let (M, g) be a closed surface with a Riemannian metric g. Denote by (\tilde{M},… …   Wikipedia

  • Income inequality metrics — The concept of inequality is distinct from that of poverty[1] and fairness. Income inequality metrics or income distribution metrics are used by social scientists to measure the distribution of income and economic inequality among the… …   Wikipedia

  • Mutual information — Individual (H(X),H(Y)), joint (H(X,Y)), and conditional entropies for a pair of correlated subsystems X,Y with mutual information I(X; Y). In probability theory and information theory, the mutual information (sometimes known by the archaic term… …   Wikipedia

  • Kullback–Leibler divergence — In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P …   Wikipedia

  • Black hole — For other uses, see Black hole (disambiguation). Simulated view of a black hole (center) in front of the Large Magellanic Cloud. Note the gravitat …   Wikipedia

  • Systolic geometry — In mathematics, systolic geometry is the study of systolic invariants of manifolds and polyhedra, as initially conceived by Charles Loewner, and developed by Mikhail Gromov and others, in its arithmetic, ergodic, and topological manifestations.… …   Wikipedia

  • Conversion of units — is the conversion between different units of measurement for the same quantity, typically through multiplicative conversion factors. …   Wikipedia

  • Degree of anonymity — In anonymity networks (e.g. Tor, Crowds, Mixmaster, Tarzan, etc.) it is important to be able to measure quantitatively the guarantee that is given to the system. The degree of anonymity d is a device that was proposed at the 2002 Privacy… …   Wikipedia

