generalized entropy
See what "generalized entropy" means in other dictionaries:
Entropy — This article is about entropy in thermodynamics. For entropy in information theory, see Entropy (information theory). For a comparison of entropy in information theory with entropy in thermodynamics, see Entropy in thermodynamics and information theory. (Wikipedia)
Entropy of mixing — The entropy of mixing is the change in the configurational entropy, an extensive thermodynamic quantity, when two different chemical substances or components are mixed. This entropy change must be positive, since there is more uncertainty about the location of each molecule after mixing. (Wikipedia)
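For reference, a worked form of this definition (the standard result for ideal mixing, supplied here since the entry is truncated): for n total moles of ideal components with mole fractions x_i,
\[ \Delta S_{\text{mix}} = -nR \sum_i x_i \ln x_i \; > \; 0 , \]
which is positive whenever more than one component is present, since each \( \ln x_i < 0 \).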
Generalized Helmholtz theorem — The generalized Helmholtz theorem is the multi-dimensional generalization of the Helmholtz theorem, which is valid only in one dimension. The generalized Helmholtz theorem reads as follows. Let \( \mathbf{p} = (p_1, p_2, \dots, p_s) \) and \( \mathbf{q} = (q_1, q_2, \dots, q_s) \) … (Wikipedia)
Generalized inverse Gaussian distribution — A continuous probability distribution with parameters a > 0, b > 0, and real p, supported on x > 0, with probability density function
\[ f(x) = \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})} \, x^{p-1} e^{-(ax + b/x)/2}, \qquad x > 0, \]
where \( K_p \) is a modified Bessel function of the second kind, and with mean \( \frac{\sqrt{b}\, K_{p+1}(\sqrt{ab})}{\sqrt{a}\, K_p(\sqrt{ab})} \). (Wikipedia)
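The normalizing constant above comes from a standard Bessel integral identity (noted here for completeness, not part of the original entry):
\[ \int_0^\infty x^{p-1} e^{-(ax + b/x)/2} \, dx = 2\, (b/a)^{p/2} K_p(\sqrt{ab}) . \]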
Generalized extreme value distribution — A family of continuous probability distributions with location parameter \( \mu \in \mathbb{R} \), scale parameter \( \sigma > 0 \), and shape parameter \( \xi \in \mathbb{R} \). Its support is \( x > \mu - \sigma/\xi \) for \( \xi > 0 \), \( x < \mu - \sigma/\xi \) for \( \xi < 0 \), and all real \( x \) for \( \xi = 0 \). (Wikipedia)
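For reference, the cumulative distribution function of this family (a standard result, supplied since the entry is truncated) is
\[ F(x; \mu, \sigma, \xi) = \exp\!\left\{ -\left[ 1 + \xi \, \frac{x - \mu}{\sigma} \right]^{-1/\xi} \right\} \]
for \( 1 + \xi (x - \mu)/\sigma > 0 \), with the \( \xi = 0 \) case read as the limit \( \exp\{ -e^{-(x-\mu)/\sigma} \} \) (the Gumbel distribution).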
Rényi entropy — In information theory, the Rényi entropy, a generalisation of Shannon entropy, is one of a family of functionals for quantifying the diversity, uncertainty or randomness of a system. It is named after Alfréd Rényi. The Rényi entropy of order α, where α ≥ 0 and α ≠ 1, is defined as
\[ H_\alpha(X) = \frac{1}{1 - \alpha} \log\!\left( \sum_{i=1}^{n} p_i^{\alpha} \right) . \]
(Wikipedia)
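In the limit α → 1 the Rényi entropy recovers Shannon entropy (a standard fact, noted here for context):
\[ \lim_{\alpha \to 1} H_\alpha(X) = -\sum_{i=1}^{n} p_i \log p_i . \]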
Maximum entropy probability distribution — In statistics and information theory, a maximum entropy probability distribution is a probability distribution whose entropy is at least as great as that of all other members of a specified class of distributions. According to the principle of maximum entropy, if nothing is known about a distribution except that it belongs to a certain class, the distribution with the largest entropy should be chosen as the least informative default. (Wikipedia)
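A worked example of this principle (a standard result, added for illustration): among all real-valued distributions with fixed mean μ and variance σ², the normal distribution \( \mathcal{N}(\mu, \sigma^2) \) maximizes the differential entropy, attaining
\[ h = \tfrac{1}{2} \ln\!\left( 2 \pi e \sigma^2 \right) . \]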
Differential entropy — Differential entropy (also referred to as continuous entropy) is a concept in information theory that extends the idea of (Shannon) entropy, a measure of the average surprisal of a random variable, to continuous probability distributions. (Wikipedia)
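For a random variable X with density f, the standard definition (supplied since the entry is truncated) is
\[ h(X) = -\int f(x) \log f(x) \, dx , \]
taken over the support of f; unlike Shannon entropy, h(X) can be negative.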
Joint entropy — The joint entropy is an entropy measure used in information theory. The joint entropy measures how much entropy is contained in a joint system of two random variables. If the random variables are X and Y, the joint entropy is written H(X,Y). Like other entropies, it is measured in bits, nats, or hartleys, depending on the base of the logarithm. (Wikipedia)
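Explicitly, for discrete X and Y with joint distribution p(x,y) (the standard definition, added for completeness):
\[ H(X,Y) = -\sum_{x} \sum_{y} p(x,y) \log p(x,y) . \]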
Conditional entropy — [Figure: individual entropies H(X), H(Y), joint entropy H(X,Y), and conditional entropies for a pair of correlated subsystems X, Y with mutual information I(X; Y).] In information theory, the conditional entropy (or equivocation) quantifies the remaining entropy (i.e. uncertainty) of a random variable Y given that the value of a second random variable X is known. (Wikipedia)
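The standard definition and its chain-rule form (known identities, noted for completeness):
\[ H(Y \mid X) = -\sum_{x,y} p(x,y) \log p(y \mid x) = H(X,Y) - H(X) . \]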
Atkinson-Maß — The Atkinson index (named after Anthony Atkinson [* 1944]) is a measure of inequality with which, for example, the income or wealth inequality in a society can be computed. (German Wikipedia)
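For reference, the standard definition of the Atkinson index (added for illustration; not part of the truncated entry): for incomes \( y_1, \dots, y_n \) with mean \( \bar{y} \) and inequality-aversion parameter \( \varepsilon \ge 0 \),
\[ A_\varepsilon = 1 - \frac{1}{\bar{y}} \left( \frac{1}{n} \sum_{i=1}^{n} y_i^{\,1-\varepsilon} \right)^{1/(1-\varepsilon)}, \qquad \varepsilon \neq 1, \]
with the \( \varepsilon = 1 \) case given by the geometric mean, \( A_1 = 1 - \frac{1}{\bar{y}} \left( \prod_{i=1}^{n} y_i \right)^{1/n} \). The Atkinson family is closely related to the generalized entropy class of inequality indices, which is why it appears under this headword.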