Gradient

In vector calculus, the gradient of a scalar-valued differentiable function $f$ of several variables is the vector field (or vector-valued function) $\nabla f$ whose value at a point $p$ is the vector whose components are the partial derivatives of $f$ at $p$. That is, for $f\colon \mathbb{R}^{n}\to \mathbb{R}$, its gradient $\nabla f\colon \mathbb{R}^{n}\to \mathbb{R}^{n}$ is defined at the point $p=(x_{1},\ldots ,x_{n})$ in $n$-dimensional space as the vector:

$$\nabla f(p)={\begin{bmatrix}{\dfrac{\partial f}{\partial x_{1}}}(p)\\\vdots \\{\dfrac{\partial f}{\partial x_{n}}}(p)\end{bmatrix}}.$$
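As an illustration (not part of the original article), here is a minimal sketch in Python that approximates this component-wise definition numerically, using central finite differences; the helper name grad and the sample function f are hypothetical choices for this example:

    # Sketch: approximate the gradient of a scalar function f: R^n -> R
    # by central finite differences, one partial derivative per component.
    def grad(f, p, h=1e-6):
        g = []
        for i in range(len(p)):
            forward = list(p); forward[i] += h
            backward = list(p); backward[i] -= h
            # i-th partial: (f(p + h*e_i) - f(p - h*e_i)) / (2h)
            g.append((f(forward) - f(backward)) / (2 * h))
        return g

    # Example: f(x, y) = x**2 + y**2, whose exact gradient is (2x, 2y).
    f = lambda p: p[0] ** 2 + p[1] ** 2
    print(grad(f, [1.0, 2.0]))  # approximately [2.0, 4.0]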
The nabla symbol $\nabla$, written as an upside-down triangle and pronounced "del", denotes the vector differential operator.
The gradient is dual to the total derivative $df$: the value of the gradient at a point is a tangent vector – a vector at each point – while the value of the derivative at a point is a cotangent vector – a linear function on vectors. They are related in that the dot product of the gradient of $f$ at a point $p$ with another tangent vector $\mathbf{v}$ equals the directional derivative of $f$ at $p$ along $\mathbf{v}$; that is,

$$\nabla f(p)\cdot \mathbf{v} = \frac{\partial f}{\partial \mathbf{v}}(p) = df_{\mathbf{v}}(p).$$
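To make this relation concrete (an illustrative sketch, not from the article): for the hypothetical example $f(x, y) = x^2 + y^2$, the dot product of the exact gradient with a direction vector can be checked against a finite-difference estimate of the directional derivative:

    # Sketch: verify grad f(p) . v  ==  d/dt f(p + t*v) at t = 0
    # for the sample function f(x, y) = x**2 + y**2.
    def f(p):
        return p[0] ** 2 + p[1] ** 2

    def grad_f(p):                    # exact gradient: (2x, 2y)
        return [2 * p[0], 2 * p[1]]

    p, v, h = [1.0, 2.0], [0.6, 0.8], 1e-6
    dot = sum(g * vi for g, vi in zip(grad_f(p), v))
    directional = (f([p[0] + h * v[0], p[1] + h * v[1]])
                   - f([p[0] - h * v[0], p[1] - h * v[1]])) / (2 * h)
    print(dot, directional)  # both approximately 4.4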
The gradient vector can be interpreted as the "direction and rate of fastest increase". If the gradient of a function is non-zero at a point p, the direction of the gradient is the direction in which the function increases most quickly from p, and the magnitude of the gradient is the rate of increase in that direction, the greatest absolute directional derivative. Further, the gradient is the zero vector at a point if and only if that point is a stationary point (where the derivative vanishes). The gradient thus plays a fundamental role in optimization theory, where it is used to maximize a function by gradient ascent, as in the sketch below.
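A minimal gradient-ascent sketch (the function, starting point, step size, and iteration count here are all hypothetical choices for illustration): repeatedly stepping in the direction of the gradient drives the iterate toward a maximum.

    # Sketch of gradient ascent: step in the gradient direction to
    # increase f. Here f(x, y) = -(x - 1)**2 - (y + 2)**2 has its
    # unique maximum at (1, -2), where the gradient vanishes.
    def grad_f(p):
        return [-2 * (p[0] - 1), -2 * (p[1] + 2)]

    p, lr = [0.0, 0.0], 0.1          # starting point and step size
    for _ in range(100):
        g = grad_f(p)
        p = [p[0] + lr * g[0], p[1] + lr * g[1]]
    print(p)  # converges toward [1.0, -2.0]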
The gradient admits multiple generalizations to more general functions on manifolds; see § Generalizations.
