- sufficient estimator
- мат. достаточная оценка (math.: sufficient estimator)
Большой англо-русский и русско-английский словарь (Large English-Russian and Russian-English Dictionary). 2001.
Sufficient statistic — In statistics, a sufficient statistic is a statistic which has the property of sufficiency with respect to a statistical model and its associated unknown parameter, meaning that no other statistic which can be calculated from the same sample… … Wikipedia
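For reference, the standard textbook characterization (not part of the dictionary entry; the symbols $T$, $X$, and $\theta$ are introduced here): a statistic $T(X)$ is sufficient for a parameter $\theta$ when the conditional distribution of the sample given $T$ does not depend on $\theta$, i.e. $\Pr(X = x \mid T(X) = t, \theta)$ is the same for every value of $\theta$.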
Minimum-variance unbiased estimator — In statistics a uniformly minimum variance unbiased estimator or minimum variance unbiased estimator (UMVUE or MVUE) is an unbiased estimator that has lower variance than any other unbiased estimator for all possible values of the parameter. The… … Wikipedia
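In symbols (standard formulation, added for reference; notation not from the entry): an unbiased estimator $\hat\theta^{*}$ is the UMVUE when $\operatorname{Var}_\theta(\hat\theta^{*}) \le \operatorname{Var}_\theta(\tilde\theta)$ for every other unbiased estimator $\tilde\theta$ and every value of $\theta$.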
Kaplan–Meier estimator — The Kaplan–Meier estimator,[1][2] also known as the product limit estimator, is an estimator of the survival function from lifetime data. In medical research, it is often used to measure the fraction of patients living for a certain… … Wikipedia
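The product-limit form is usually written (standard statement, given here for reference; the symbols are not defined in the entry itself) as $\hat S(t) = \prod_{i:\, t_i \le t} \bigl(1 - d_i/n_i\bigr)$, where $d_i$ is the number of events at time $t_i$ and $n_i$ the number of subjects still at risk just before $t_i$.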
Sufficiency (statistics) — In statistics, sufficiency is the property possessed by a statistic, with respect to a parameter, when no other statistic which can be calculated from the same sample provides any additional information as to the value of the parameter… … Wikipedia
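A commonly used equivalent criterion (the Fisher–Neyman factorization theorem, stated here for reference and not taken from the entry): $T$ is sufficient for $\theta$ if and only if the density of the sample factors as $f_\theta(x) = g_\theta\bigl(T(x)\bigr)\, h(x)$, where $h$ does not depend on $\theta$.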
Mid-range — For loudspeakers, see mid-range speaker. In statistics, the mid-range or mid-extreme of a set of statistical data values is the arithmetic mean of the maximum and minimum values in a data set,[1] or $(x_{\max} + x_{\min})/2$. As such, it is a measure of central tendency.… … Wikipedia
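As a small worked illustration (added here, not from the entry): for the sample {2, 3, 7, 10} the mid-range is $(10 + 2)/2 = 6$.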
Rao–Blackwell theorem — In statistics, the Rao–Blackwell theorem is a result which characterizes the transformation of an arbitrarily crude estimator into an estimator that is optimal by the mean squared error criterion or any of a variety of similar criteria. The Rao… … Wikipedia
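In symbols (standard formulation, added for reference; notation not from the entry): if $\hat\theta$ is any estimator of $\theta$ and $T$ is a sufficient statistic, then $\hat\theta' = \operatorname{E}[\hat\theta \mid T]$ satisfies $\operatorname{E}\bigl[(\hat\theta' - \theta)^2\bigr] \le \operatorname{E}\bigl[(\hat\theta - \theta)^2\bigr]$ for every $\theta$.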
Maximum likelihood — In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of a statistical model. When applied to a data set and given a statistical model, maximum likelihood estimation provides estimates for the model's… … Wikipedia
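In symbols (standard notation, not from the entry): $\hat\theta_{\mathrm{MLE}} = \arg\max_{\theta} L(\theta;\, x_1, \dots, x_n)$, where $L$ is the likelihood of the observed data under the model.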
Linear regression — Example of simple linear regression, which has one independent variable. In statistics, linear regression is an approach to modeling the relationship between a scalar variable y and one or more explanatory variables denoted X. The case of one… … Wikipedia
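The model is usually written (standard notation, added for reference) as $y = \beta_0 + \beta_1 x_1 + \dots + \beta_p x_p + \varepsilon$, or in matrix form $y = X\beta + \varepsilon$, where $\varepsilon$ is an error term.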
Orthogonality principle — In statistics and signal processing, the orthogonality principle is a necessary and sufficient condition for the optimality of a Bayesian estimator. Loosely stated, the orthogonality principle says that the error vector of the optimal estimator… … Wikipedia
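For the linear minimum mean-square-error setting the principle can be stated (standard form, loosely and with symbols introduced here, not from the entry): a linear estimate $\hat x$ of a zero-mean quantity $x$ from a zero-mean observation $y$ is optimal if and only if the error is orthogonal to the data, $\operatorname{E}\bigl[(\hat x - x)\, y^{\mathsf T}\bigr] = 0$.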
Sample maximum and minimum — Box plots of the Michelson–Morley experiment, showing sample maximums and minimums. In statistics, the sample maximum and sample minimum, also called the largest observation and smallest observation, are the values of the greatest and least elements of … Wikipedia
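In order-statistic notation (standard, not from the entry) these are written $x_{(n)} = \max_i x_i$ and $x_{(1)} = \min_i x_i$ for a sample $x_1, \dots, x_n$.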
Fisher information — In statistics and information theory, the Fisher information (denoted $\mathcal{I}(\theta)$) is the variance of the score. It is named in honor of its inventor, the statistician R.A. Fisher. Definition: The Fisher information is a way of measuring the… … Wikipedia
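The definition referred to above is commonly written (standard form, symbols added here for reference) as $\mathcal{I}(\theta) = \operatorname{Var}_\theta\!\left[\tfrac{\partial}{\partial\theta} \log f(X;\theta)\right]$, which under the usual regularity conditions equals $\operatorname{E}_\theta\!\left[\bigl(\tfrac{\partial}{\partial\theta} \log f(X;\theta)\bigr)^{2}\right]$, where $f(X;\theta)$ is the likelihood of the observation $X$.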