- recursive smoothing
- рекурсивное сглаживание
Большой англо-русский и русско-английский словарь. 2001.
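A minimal sketch of one common form of recursive smoothing, single exponential smoothing, in which each smoothed value is computed recursively from the previous smoothed value and the newest measurement. The smoothing factor `alpha` and the sample data are illustrative assumptions, not part of the dictionary entry.

```python
def recursive_smooth(samples, alpha=0.3):
    """Single exponential smoothing: s[t] = alpha*x[t] + (1 - alpha)*s[t-1].

    `alpha` in (0, 1] controls how strongly new measurements override the
    running estimate; smaller values give heavier smoothing.
    """
    smoothed = []
    s = None
    for x in samples:
        s = x if s is None else alpha * x + (1 - alpha) * s  # recursive update
        smoothed.append(s)
    return smoothed

# Illustrative noisy measurements around a constant level of 10.0
data = [10.2, 9.7, 10.5, 9.9, 10.8, 9.6, 10.1]
print(recursive_smooth(data))
```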
Recursive Bayesian estimation — is a general probabilistic approach for estimating an unknown probability density function recursively over time using incoming measurements and a mathematical process model. The true state x is assumed to be an unobserved Markov process,… … Wikipedia
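As a hedged illustration of that idea (not the cited article's own derivation), the sketch below runs a recursive Bayesian filter on a small discrete state grid: each step propagates the previous posterior through an assumed Markov transition model (predict) and reweights it by a Gaussian measurement likelihood (update). The transition matrix, noise level and measurement sequence are invented for the example.

```python
import numpy as np

states = np.array([0.0, 1.0, 2.0, 3.0, 4.0])      # assumed discrete state grid
belief = np.full(len(states), 1.0 / len(states))   # uniform prior

# Assumed Markov transition model: the state mostly stays put, sometimes drifts by one
T = np.array([[0.8, 0.2, 0.0, 0.0, 0.0],
              [0.1, 0.8, 0.1, 0.0, 0.0],
              [0.0, 0.1, 0.8, 0.1, 0.0],
              [0.0, 0.0, 0.1, 0.8, 0.1],
              [0.0, 0.0, 0.0, 0.2, 0.8]])

def likelihood(z, sigma=0.5):
    # Gaussian measurement model p(z | x) evaluated over the whole grid
    return np.exp(-0.5 * ((z - states) / sigma) ** 2)

for z in [1.1, 1.9, 2.2, 3.0]:      # assumed incoming measurements
    belief = T.T @ belief            # predict: propagate through the Markov model
    belief *= likelihood(z)          # update: weight by the measurement likelihood
    belief /= belief.sum()           # normalise back to a probability distribution

print(np.round(belief, 3))
```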
Smoothing — In statistics and image processing, to smooth a data set is to create an approximating function that attempts to capture important patterns in the data, while leaving out noise or other fine-scale structures/rapid phenomena. Many different… … Wikipedia
Scale space implementation — Related topics in the scale space series: scale space axioms, feature detection, edge detection, blob detection, corner detection. … Wikipedia
Kalman filter — In statistics, the Kalman filter is a mathematical method named after Rudolf E. Kálmán. Its purpose is to use measurements observed over time, containing noise (random variations)… … Wikipedia
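Purely as an illustration of that purpose, here is a one-dimensional Kalman filter tracking a (nearly) constant quantity from noisy measurements. The process-noise and measurement-noise variances and the data are assumed for the example and are not taken from the article.

```python
def kalman_1d(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a nearly constant state.

    q: assumed process-noise variance, r: assumed measurement-noise variance.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: the state model is the identity, so only uncertainty grows
        p = p + q
        # Update: blend prediction and measurement via the Kalman gain
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

print(kalman_1d([0.39, 0.50, 0.48, 0.29, 0.25, 0.32, 0.34]))
```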
Multi-scale approaches — The scale space representation of a signal obtained by Gaussian smoothing satisfies a number of special properties, scale space axioms, which make it into a special form of multi-scale representation. There are, however, also other types of multi … Wikipedia
List of statistics topics — A list of Wikipedia articles related to statistics. … Wikipedia
Scale space — Scale space theory is a framework for multi-scale signal representation developed by the computer vision, image processing and signal processing communities, with complementary motivations from physics and biological vision. It is a formal theory for handling … Wikipedia
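To make the multi-scale idea concrete, the sketch below builds a simple one-dimensional scale space by smoothing the same signal with Gaussian kernels of increasing standard deviation. The synthetic signal and the sigma values are assumptions for the example; SciPy's `gaussian_filter1d` stands in for the Gaussian smoothing described above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Assumed test signal: a step edge plus noise
rng = np.random.default_rng(0)
signal = np.concatenate([np.zeros(50), np.ones(50)]) + 0.1 * rng.standard_normal(100)

# A scale-space family: the same signal represented at coarser and coarser scales
scales = [1.0, 2.0, 4.0, 8.0]
scale_space = {sigma: gaussian_filter1d(signal, sigma) for sigma in scales}

for sigma, smoothed in scale_space.items():
    print(f"sigma={sigma}: step height across the edge ~ {smoothed[60] - smoothed[40]:.2f}")
```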
Canny edge detector — The Canny edge detection operator was developed by John F. Canny in 1986 and uses a multi-stage algorithm to detect a wide range of edges in images. Most importantly, Canny also produced a computational theory of edge detection explaining why the … Wikipedia
automata theory — Body of physical and logical principles underlying the operation of any electromechanical device (an automaton) that converts information input in one form into another, or into some action, according to an algorithm. Norbert Wiener and Alan M.… … Universalium
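As a hedged, self-contained illustration of an automaton in the sense described above (not taken from the Universalium article), the snippet below simulates a small deterministic finite automaton that accepts binary strings containing an even number of 1s; the alphabet, states and transition table are assumptions chosen for the example.

```python
def accepts_even_ones(word):
    """DFA with states 'even'/'odd'; accepts strings over {0,1} with an even count of 1s."""
    transitions = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    }
    state = "even"                            # start state
    for symbol in word:
        state = transitions[(state, symbol)]  # deterministic transition on each input symbol
    return state == "even"                    # 'even' is the sole accepting state

print(accepts_even_ones("1001"))   # True: two 1s
print(accepts_even_ones("111"))    # False: three 1s
```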
Edge detection — is a term in image processing and computer vision, particularly in the areas of feature detection and feature extraction, referring to algorithms which aim at identifying points in a digital image at which the image brightness changes… … Wikipedia
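A minimal, assumption-laden sketch of that basic mechanism (finite-difference brightness gradients plus a threshold), far simpler than production detectors such as Canny; the synthetic image and the threshold value are invented for the example.

```python
import numpy as np

def edge_map(image, threshold=0.5):
    """Mark pixels where the brightness gradient magnitude exceeds a threshold."""
    gy, gx = np.gradient(image.astype(float))  # finite-difference derivatives
    magnitude = np.hypot(gx, gy)               # gradient magnitude per pixel
    return magnitude > threshold

# Assumed synthetic image: a bright square on a dark background
img = np.zeros((8, 8))
img[2:6, 2:6] = 1.0
print(edge_map(img).astype(int))
```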
Parsing — In computer science and linguistics, parsing, or, more formally, syntactic analysis, is the process of analyzing a sequence of tokens to determine their grammatical structure with respect to a given (more or less) formal grammar. Parsing is also… … Wikipedia
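As a hedged illustration of syntactic analysis against a small formal grammar (an invented expression grammar, not anything from the article), the sketch below tokenizes, parses and evaluates arithmetic expressions with `+`, `*` and parentheses using recursive descent.

```python
import re

def tokenize(text):
    """Split the input into number and operator/parenthesis tokens."""
    return re.findall(r"\d+|[+*()]", text)

def parse(tokens):
    """Grammar: expr -> term ('+' term)*, term -> factor ('*' factor)*, factor -> NUMBER | '(' expr ')'."""
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat(expected=None):
        nonlocal pos
        tok = tokens[pos]
        if expected is not None and tok != expected:
            raise SyntaxError(f"expected {expected!r}, got {tok!r}")
        pos += 1
        return tok

    def factor():
        if peek() == "(":
            eat("(")
            value = expr()
            eat(")")
            return value
        return int(eat())

    def term():
        value = factor()
        while peek() == "*":
            eat("*")
            value *= factor()
        return value

    def expr():
        value = term()
        while peek() == "+":
            eat("+")
            value += term()
        return value

    result = expr()
    if pos != len(tokens):
        raise SyntaxError(f"unexpected token {peek()!r}")
    return result

print(parse(tokenize("2+3*(4+1)")))   # 17
```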