logarithmic criterion

logarithmic criterion
math. логарифмический критерий (logarithmic criterion)

Большой англо-русский и русско-английский словарь (Comprehensive English-Russian and Russian-English Dictionary). 2001.


See also "logarithmic criterion" in other dictionaries:

  • Logarithmic norm — In mathematics, a logarithmic norm or Lozinskiĭ measure is a scalar quantity associated with a complex square matrix and an induced matrix norm. It was independently introduced by Germund Dahlquist and Sergei Lozinskiĭ in 1958. [Torsten Ström, On …   Wikipedia
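For the 2-norm, the logarithmic norm reduces to the largest eigenvalue of the symmetric part of the matrix. A minimal sketch for a real 2×2 matrix (my own illustration, not from the dictionary entry; the function name and example matrix are hypothetical):

```python
import math

# mu_2(A) = largest eigenvalue of the symmetric part S = (A + A^T)/2,
# computed here via the closed-form 2x2 eigenvalue formula.
def log_norm_2x2(a, b, c, d):
    """Logarithmic 2-norm of the real matrix [[a, b], [c, d]]."""
    off = (b + c) / 2.0          # off-diagonal entry of the symmetric part
    mean = (a + d) / 2.0         # mean of the diagonal
    return mean + math.sqrt(((a - d) / 2.0) ** 2 + off ** 2)

# Unlike a matrix norm, mu can be negative; conversely it can be positive
# even when every eigenvalue of A is negative, as here:
print(log_norm_2x2(-1.0, 4.0, 0.0, -2.0))  # ≈ 0.56 > 0
```

This sign behavior is what makes the logarithmic norm useful for stability estimates of differential equations, which plain matrix norms (always nonnegative) cannot capture.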

  • Kelly criterion — In probability theory, the Kelly criterion, or Kelly strategy or Kelly formula, or Kelly bet, is a formula used to determine the optimal size of a series of bets. Under some simplifying assumptions, the Kelly strategy will do better than any… …   Wikipedia
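For a simple binary bet, the Kelly fraction is f* = (bp − q)/b, where b is the net odds received, p the win probability, and q = 1 − p. A hedged sketch (my own illustration; the function name is hypothetical):

```python
# Kelly fraction for a binary bet paying b-to-1 with win probability p.
def kelly_fraction(p: float, b: float) -> float:
    """Fraction of the bankroll to stake; a negative result means
    the bet has negative edge and should not be taken."""
    q = 1.0 - p
    return (b * p - q) / b

# Example: a 60% coin at even money (b = 1) -> stake 20% of the bankroll.
print(kelly_fraction(0.6, 1.0))  # ≈ 0.2
print(kelly_fraction(0.4, 1.0))  # negative: do not bet
```

The formula maximizes the expected logarithm of wealth, which is presumably why a bilingual dictionary files "logarithmic criterion" next to it.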

  • Peter Balazs (mathematician) — Peter Balazs (born 11 December 1970 in Tulln an der Donau) is an Austrian mathematician at the Acoustics Research Institute (Institut für Schallforschung) of the Austrian Academy of Sciences in Vienna. …   Deutsch Wikipedia

  • solids, mechanics of — (physics) Science concerned with the stressing (stress), deformation (deformation and flow), and failure of solid materials and structures. What, then, is a solid? Any material, fluid or solid, can support normal forces.… …   Universalium

  • Bode plot — the straight-line approximations are labeled Bode pole; phase varies from 90° at low frequencies (due to the contribution of the numerator, which is 90° at all frequencies) to 0° at high frequencies (where the phase contribution of the… …   Wikipedia
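The 90° → 0° phase behavior the snippet describes can be reproduced for a hypothetical first-order high-pass transfer function H(s) = s/(1 + s/ωc), which I assume here purely for illustration (it is not stated in the entry):

```python
import math

# Phase of H(jw) = jw / (1 + jw/wc): the numerator jw contributes +90°
# at every frequency, the pole at wc contributes -atan(w/wc).
def phase_deg(w: float, wc: float = 1.0) -> float:
    return 90.0 - math.degrees(math.atan(w / wc))

print(phase_deg(0.01))   # ≈ 90° at low frequency
print(phase_deg(100.0))  # ≈ 0° at high frequency
```

On a Bode plot this phase curve is usually drawn against log-frequency alongside the magnitude in decibels.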

  • List of mathematics articles (L) — NOTOC L L (complexity) L-BFGS L² cohomology L-function L game L-notation L-system L-theory L'Analyse des Infiniment Petits pour l'Intelligence des Lignes Courbes L'Hôpital's rule L(R) La Géométrie Labeled graph Labelled enumeration theorem Lack… …   Wikipedia

  • Kullback–Leibler divergence — In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P …   Wikipedia
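For discrete distributions, D_KL(P ‖ Q) = Σᵢ pᵢ log(pᵢ/qᵢ), and the asymmetry mentioned in the entry is easy to observe directly. A minimal sketch, assuming both distributions share the same finite support:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i); terms with p_i = 0
    contribute nothing (0 * log 0 is taken as 0)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # positive
print(kl_divergence(q, p))  # a different value: D_KL is not symmetric
```

Note that D_KL(P ‖ P) = 0 and the divergence is nonnegative (Gibbs' inequality), but because it is asymmetric and violates the triangle inequality it is not a metric.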

  • List of statistics topics — Please add any Wikipedia articles related to statistics that are not already on this list. The Related changes link in the margin of this page (below search) leads to a list of the most recent changes to the articles listed below. To see the most… …   Wikipedia

  • Auxiliary function — In mathematics, auxiliary functions are an important construction in transcendental number theory. They are functions which appear in most proofs in this area of mathematics and that have specific, desirable properties, such as taking the value… …   Wikipedia

  • Argument principle — In complex analysis, the argument principle (or Cauchy's argument principle) states that if f(z) is a meromorphic function inside and on some closed contour C, with f having no zeros or poles on C, then the following formula holds: \oint_C… …   Wikipedia
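The formula the snippet truncates counts zeros minus poles inside C via (1/2πi) ∮_C f′(z)/f(z) dz. A numerical sketch over the unit circle (my own illustration; the function and sample f are hypothetical):

```python
import cmath
import math

# Approximate (1 / 2*pi*i) * contour integral of f'/f over the unit
# circle, using the rectangle rule, which converges fast for periodic
# integrands. The result should be (zeros - poles) inside |z| = 1.
def winding_count(f, df, n=2000):
    total = 0j
    for k in range(n):
        z = cmath.exp(2j * math.pi * k / n)       # point on the circle
        dz = 1j * z * (2 * math.pi / n)           # tangent step dz
        total += df(z) / f(z) * dz
    return total / (2j * math.pi)

f = lambda z: (z - 0.5) * (z + 0.3)   # two zeros inside the unit circle
df = lambda z: 2 * z - 0.2            # its derivative
print(round(winding_count(f, df).real))  # 2
```

The same integral underlies the "logarithmic criterion" family of terms, since f′/f is the derivative of log f.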

  • Riemann hypothesis — The real part (red) and imaginary part (blue) of the Riemann zeta function along the critical line Re(s) = 1/2. The first non-trivial zeros can be seen at Im(s) = ±14.135, ±21.022 and ±25.011 …   Wikipedia

