- smoothed distribution
- math. сглаженное распределение (Russian for "smoothed distribution")
Большой англо-русский и русско-английский словарь (Comprehensive English-Russian and Russian-English Dictionary), 2001.
Smoothed Particle Hydrodynamics — (SPH; German: geglättete Teilchen-Hydrodynamik) is a numerical method for solving the hydrodynamic equations. It is used, among other fields, in astrophysics, ballistics, and tsunami calculations. SPH is a… … Deutsch Wikipedia
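For context beyond the truncated snippet above: the core of SPH is a kernel-weighted sum over neighbouring particles. A standard formulation (not quoted from this entry), with smoothing kernel W, smoothing length h, particle masses m_j and densities ρ_j, is

    A(\mathbf{r}) \approx \sum_j \frac{m_j}{\rho_j}\, A_j\, W(|\mathbf{r}-\mathbf{r}_j|, h),
    \qquad
    \rho(\mathbf{r}) \approx \sum_j m_j\, W(|\mathbf{r}-\mathbf{r}_j|, h)

i.e. any field A is reconstructed from particle values A_j smoothed by the kernel, which is where the "smoothed" in the name comes from.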
Smoothed analysis — is a way of measuring the complexity of an algorithm. It gives a more realistic analysis of the practical performance of the algorithm, such as its running time, than using worst-case or average-case scenarios. For instance, the simplex algorithm… … Wikipedia
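As a rough sketch of what "smoothed" means here (one common formulation, not taken from the entry): the smoothed complexity of an algorithm A with running time T_A, under Gaussian perturbations of size σ, is the worst case over inputs of the expected running time over small random perturbations of that input,

    C_A(n, \sigma) = \max_{x \in \mathbb{R}^{n},\, \|x\| \le 1}\;
    \mathbb{E}_{g \sim \mathcal{N}(0,\, \sigma^2 I)}\big[ T_A(x + g) \big]

so an algorithm has polynomial smoothed complexity if this quantity grows polynomially in n and 1/σ.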
distribution curve — noun: a graph of the frequencies of different values of a variable in a statistical distribution. * * * Statistics: the curve or line of a graph in which cumulative frequencies are plotted as ordinates and values of the variate as abscissas. … Useful english dictionary
Plasma (physics) — [Image caption: plasma lamp illustrating some of the more complex phenomena of a plasma, including filamentation; the colors result from relaxation of electrons in excited states to lower energy states after they have recombined…] … Wikipedia
Additive smoothing — In the field of statistical language modeling and statistics, additive smoothing is a technique used to smooth a distribution p(x) representing, for example, occurrences of a word x in a text. The additively smoothed distribution is defined… … Wikipedia
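The snippet is cut off before the definition; the usual form of additive (Laplace/Lidstone) smoothing, given counts x_1, …, x_d summing to N and a pseudocount α > 0, is

    \hat{p}_i = \frac{x_i + \alpha}{N + \alpha d}, \qquad i = 1, \dots, d

which pulls the raw empirical frequencies x_i / N towards the uniform distribution 1/d and, in particular, avoids assigning zero probability to unseen events.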
SPH — Smoothed Particle Hydrodynamics (SPH; German: geglättete Teilchen-Hydrodynamik) is a numerical method for solving the hydrodynamic equations. It is used, among other fields, in astrophysics, ballistics, and tsunami calculations … Deutsch Wikipedia
Multivariate kernel density estimation — Kernel density estimation is a nonparametric technique for density estimation, i.e., estimation of probability density functions, which is one of the fundamental questions in statistics. It can be viewed as a generalisation of histogram density… … Wikipedia
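A minimal sketch of how a multivariate kernel density estimate can be computed with a diagonal-bandwidth Gaussian product kernel; the function name, bandwidth handling and example data below are illustrative assumptions, not taken from the entry:

    import numpy as np

    def gaussian_kde_diag(data, query, bandwidth):
        """Multivariate KDE with a diagonal Gaussian kernel.

        data:      (n, d) array of samples
        query:     (m, d) array of points at which to evaluate the density
        bandwidth: length-d sequence of per-dimension bandwidths h
        """
        n, d = data.shape
        h = np.asarray(bandwidth, dtype=float)
        # Pairwise standardised differences, shape (m, n, d)
        u = (query[:, None, :] - data[None, :, :]) / h
        # Product Gaussian kernel, normalised in d dimensions
        norm = (2 * np.pi) ** (d / 2) * np.prod(h)
        kernel = np.exp(-0.5 * np.sum(u ** 2, axis=2)) / norm
        # Average the kernel contributions of all n samples
        return kernel.mean(axis=1)

    # Example: estimated density of 2-D standard-normal samples at the origin
    pts = np.random.default_rng(0).normal(size=(500, 2))
    print(gaussian_kde_diag(pts, np.zeros((1, 2)), bandwidth=[0.4, 0.4]))

For standard-normal data the printed value should land near the true density at the origin, 1/(2π) ≈ 0.159; bandwidth selection is the hard part in practice and is not addressed here.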
Bootstrapping (statistics) — In statistics, bootstrapping is a modern, computer intensive, general purpose approach to statistical inference, falling within a broader class of resampling methods.Bootstrapping is the practice of estimating properties of an estimator (such as… … Wikipedia
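A minimal sketch of the resampling idea behind bootstrapping, here a percentile bootstrap confidence interval for a sample mean; the data, seed and number of resamples are arbitrary choices for illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    sample = rng.exponential(scale=2.0, size=200)   # the observed data

    # Resample with replacement and recompute the statistic many times
    boot_means = np.array([
        rng.choice(sample, size=sample.size, replace=True).mean()
        for _ in range(5000)
    ])

    # 95% percentile bootstrap confidence interval for the mean
    lo, hi = np.percentile(boot_means, [2.5, 97.5])
    print(f"mean = {sample.mean():.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")

The same pattern works for any statistic (median, correlation, regression coefficient): replace .mean() with the statistic of interest.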
Central limit theorem — This figure demonstrates the central limit theorem. The sample means are generated using a random number generator, which draws numbers between 1 and 100 from a uniform probability distribution. It illustrates that increasing sample sizes result… … Wikipedia
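The demonstration described in that snippet (sample means of uniform draws between 1 and 100) can be reproduced in a few lines; the figure's exact sample sizes are not given, so the ones below are illustrative:

    import numpy as np

    rng = np.random.default_rng(1)
    for n in (1, 5, 30, 200):
        # 10,000 sample means, each computed from n uniform draws on [1, 100]
        means = rng.uniform(1, 100, size=(10_000, n)).mean(axis=1)
        print(f"n={n:4d}  mean≈{means.mean():6.2f}  std≈{means.std():6.2f}")

    # The standard deviation of the sample mean shrinks roughly as 1/sqrt(n),
    # and a histogram of `means` approaches a normal bell curve as n grows.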
Exponential smoothing — is a technique that can be applied to time series data, either to produce smoothed data for presentation, or to make forecasts. The time series data themselves are a sequence of observations. The observed phenomenon may be an essentially random… … Wikipedia
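A minimal sketch of simple (single) exponential smoothing with smoothing factor alpha, using the common initialisation s_0 = x_0 (an assumption, since the entry is truncated before any formula):

    def exponential_smoothing(xs, alpha):
        """Simple exponential smoothing: s_t = alpha*x_t + (1 - alpha)*s_{t-1}."""
        s = [xs[0]]                      # initialise with the first observation
        for x in xs[1:]:
            s.append(alpha * x + (1 - alpha) * s[-1])
        return s

    smoothed = exponential_smoothing([3, 5, 4, 6, 8, 7], alpha=0.5)

Larger alpha tracks the raw series more closely; smaller alpha smooths more aggressively, and the last smoothed value doubles as a one-step-ahead forecast.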
Kalman filter — [Image caption: roles of the variables in the Kalman filter.] In statistics, the Kalman filter is a mathematical method named after Rudolf E. Kálmán. Its purpose is to use measurements observed over time, containing noise (random variations)… … Wikipedia
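As a hedged illustration of the predict/update cycle, here is a scalar Kalman filter for a random-walk state observed with noise; this is a special case of the general matrix form, and the noise variances and example data are made up for the sketch:

    import numpy as np

    def kalman_1d(zs, q=1e-3, r=0.5 ** 2, x0=0.0, p0=1.0):
        """Scalar Kalman filter for a random-walk state observed with noise.

        zs: measurements; q: process-noise variance; r: measurement-noise variance.
        """
        x, p = x0, p0
        estimates = []
        for z in zs:
            # Predict: the random-walk state stays put, uncertainty grows by q
            p = p + q
            # Update: blend prediction and measurement via the Kalman gain
            k = p / (p + r)
            x = x + k * (z - x)
            p = (1 - k) * p
            estimates.append(x)
        return estimates

    rng = np.random.default_rng(2)
    true_value = 1.0
    measurements = true_value + rng.normal(0, 0.5, size=50)
    print(kalman_1d(measurements)[-1])   # converges towards true_value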