- logarithmic capacity
- мат. логарифмическая емкость
Большой англо-русский и русско-английский словарь. 2001.
Большой англо-русский и русско-английский словарь. 2001.
Heat capacity — Thermodynamics … Wikipedia
Channel capacity — In electrical engineering, computer science and information theory, channel capacity is the tightest upper bound on the amount of information that can be reliably transmitted over a communications channel. By the noisy channel coding theorem, the … Wikipedia
Conformal radius — In mathematics, the conformal radius is a way to measure the size of a simply connected planar domain D viewed from a point z in it. As opposed to notions using Euclidean distance (say, the radius of the largest inscribed disk with center z),… … Wikipedia
Nuclear weapon yield — Logarithmic scatterplot comparing the yield (in kilotons) and weight (in kilograms) of all nuclear weapons developed by the United States. The explosive yield of a nuclear weapon is the amount of energy discharged when a nuclear weapon is… … Wikipedia
Logarithm — The graph of the logarithm to base 2 crosses the x axis (horizontal axis) at 1 and passes through the points with coordinates (2, 1), (4, 2), and (8, 3) … Wikipedia
Information theory — Not to be confused with Information science. Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental… … Wikipedia
global warming — an increase in the earth s average atmospheric temperature that causes corresponding changes in climate and that may result from the greenhouse effect. [1975 80] * * * Potential increase in global average surface temperatures resulting from… … Universalium
History of computing hardware — Computing hardware is a platform for information processing (block diagram) The history of computing hardware is the record of the ongoing effort to make computer hardware faster, cheaper, and capable of storing more data. Computing hardware… … Wikipedia
Shannon–Hartley theorem — In information theory, the Shannon–Hartley theorem tells the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. It is an application of the noisy channel coding… … Wikipedia
Moore's law — Plot of CPU transistor counts against dates of introduction. Note the logarithmic vertical scale; the line corresponds to exponential growth with transistor count doubling every two years … Wikipedia
Physical information — In physics, physical information refers generally to the information that is contained in a physical system. Its usage in quantum mechanics (i.e. quantum information) is important, for example in the concept of quantum entanglement to describe… … Wikipedia