method of descent
See what "method of descent" means in other dictionaries:
Hadamard's method of descent — In mathematics, the method of descent is the term coined by the French mathematician Jacques Hadamard as a method for solving a partial differential equation in several real or complex variables, by regarding it as the specialisation of an… … Wikipedia
Descent — Not to be confused with Dissent. Descent may refer to: In genealogy and inheritance: Common descent, concept in evolutionary biology Kinship and descent, one of the major concepts of cultural anthropology Pedigree chart or family tree Ancestry… … Wikipedia
Method of steepest descent — For the optimization algorithm, see Gradient descent. In mathematics, the method of steepest descent or stationary phase method or saddle point method is an extension of Laplace's method for approximating an integral, where one deforms a contour… … Wikipedia
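The leading-order saddle-point estimate that this entry refers to can be written as follows; this is a standard textbook form (not quoted from the entry itself), assuming a nondegenerate saddle point z₀ of f on the deformed contour:

```latex
% Leading-order steepest-descent (saddle-point) approximation:
% deform the contour C through a saddle z_0 with f'(z_0) = 0, f''(z_0) \neq 0.
\int_C e^{\lambda f(z)}\,dz
\;\sim\;
\sqrt{\frac{2\pi}{-\lambda f''(z_0)}}\; e^{\lambda f(z_0)},
\qquad \lambda \to \infty .
```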
descent — /di sent/, n. 1. the act, process, or fact of descending. 2. a downward inclination or slope. 3. a passage or stairway leading down. 4. derivation from an ancestor; lineage; extraction. 5. any passing from higher to lower in degree or state;… … Universalium
Descent direction — In optimization, a descent direction is a vector that, in the sense below, moves us closer towards a local minimum of our objective function. Suppose we are computing the minimum by an iterative method, such as line search. We define a descent direction at… … Wikipedia
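The defining condition the entry is leading up to is that p is a descent direction for f at x when the directional derivative ∇f(x)·p is negative. A minimal sketch of that check (the function name `is_descent_direction` and the example gradient are illustrative, not from the entry):

```python
import numpy as np

def is_descent_direction(grad, p):
    """p is a descent direction at x if grad(x) . p < 0,
    i.e. f decreases (to first order) along p."""
    return float(np.dot(grad, p)) < 0.0

# For f(x, y) = x^2 + y^2, the gradient at (1, 2) is (2, 4).
grad = np.array([2.0, 4.0])
print(is_descent_direction(grad, np.array([-2.0, -4.0])))  # negative gradient: True
print(is_descent_direction(grad, np.array([1.0, 0.0])))    # f increases this way: False
```

The negative gradient is always a descent direction (when the gradient is nonzero), which is exactly what gradient descent exploits.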
descent — 1. SYN: descensus. 2. In obstetrics, the passage of the presenting part of the fetus into and through the birth canal. [L. descensus] * * * de·scent di sent n 1) the act or process of des … Medical dictionary
Newton's method in optimization — A comparison of gradient descent (green) and Newton's method (red) for minimizing a function (with small step sizes). Newton's method uses curvature information to take a more direct route. In mathematics, Newton's method is an iterative method… … Wikipedia
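In one dimension, the curvature-aware step the entry describes is xₖ₊₁ = xₖ − f′(xₖ)/f″(xₖ). A minimal sketch (the function name `newton_minimize` and the example objective are illustrative, not from the entry):

```python
def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=50):
    """Newton's method for optimization: step by -f'(x)/f''(x),
    i.e. jump to the minimum of the local quadratic model."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Minimize f(x) = x - ln(x): f'(x) = 1 - 1/x, f''(x) = 1/x^2, minimum at x = 1.
x_min = newton_minimize(lambda x: 1 - 1 / x, lambda x: x ** -2, x0=0.5)
print(round(x_min, 6))  # 1.0
```

Dividing the gradient by the second derivative is what gives Newton's method its "more direct route" compared with plain gradient descent.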
Conjugate gradient method — A comparison of the convergence of gradient descent with optimal step size (in green) and conjugate vector (in red) for minimizing a quadratic function associated with a given linear system. Conjugate gradient, assuming exact arithmetic,… … Wikipedia
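The quadratic function in question is f(x) = ½xᵀAx − bᵀx for a symmetric positive-definite A, whose minimizer solves Ax = b. A minimal sketch of the linear conjugate gradient iteration (the function name `conjugate_gradient` and the 2×2 example system are illustrative, not from the entry):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Solve Ax = b for symmetric positive-definite A.
    With exact arithmetic, converges in at most len(b) steps."""
    x = np.zeros_like(b)
    r = b - A @ x               # residual = negative gradient of the quadratic
    p = r.copy()                # first search direction: steepest descent
    rs = r @ r
    for _ in range(len(b)):
        Ap = A @ p
        alpha = rs / (p @ Ap)   # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # make new direction A-conjugate to the old
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))  # close to the exact solution [1/11, 7/11]
```

The conjugacy step is what distinguishes this from gradient descent with optimal step size: successive directions never undo earlier progress.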
Gradient descent — For the analytical method called steepest descent see Method of steepest descent. Gradient descent is an optimization algorithm. To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the… … Wikipedia
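The update rule sketched in this entry, taking steps proportional to the negative gradient, can be written in a few lines (the function name `gradient_descent`, the learning rate, and the example objective are illustrative, not from the entry):

```python
def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Take steps proportional to the negative of the gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3); the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 6))  # 3.0
```

The step-size constant `lr` matters: too large and the iteration diverges, too small and convergence is slow, which is what motivates the line-search and Newton variants in the neighboring entries.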
Nonlinear conjugate gradient method — In numerical optimization, the nonlinear conjugate gradient method generalizes the conjugate gradient method to nonlinear optimization. For a quadratic function f(x) = ‖Ax − b‖², the minimum of f is obtained when the gradient is 0: ∇ₓf = 2Aᵀ(Ax − b) = 0. Whereas linear conjugate… … Wikipedia
Infinite descent — In mathematics, a proof by infinite descent is a particular kind of proof by contradiction which relies on the fact that the natural numbers are well ordered. One typical application is to show that a given equation has no solutions. Assuming a… … Wikipedia
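A classic instance of the technique described above, sketched in LaTeX (the choice of example, the equation a² = 2b², is a standard illustration and not quoted from the entry):

```latex
% Infinite descent: a^2 = 2b^2 has no solution in positive integers.
% If (a, b) were a solution, a^2 would be even, hence a = 2c for some integer c;
% substituting gives b^2 = 2c^2, a strictly smaller solution (b, c) with b < a.
% Iterating would produce an infinite strictly decreasing sequence of positive
% integers, contradicting the well-ordering of the natural numbers.
a^2 = 2b^2
\;\Longrightarrow\;
a = 2c
\;\Longrightarrow\;
b^2 = 2c^2,
\qquad 0 < b < a .
```

This is equivalent to the irrationality of √2: a solution in positive integers would give √2 = a/b.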