What is the derivative of a vector?

In summary: the derivative (gradient) of a scalar function of a vector variable is a dual vector, so if the variable is written as a column vector of components, the gradient is written as a row vector of partial derivatives; parametric derivatives (with respect to a scalar such as t) of row or column vectors keep their row or column form.
  • #1
Lucid Dreamer
Hello,

In lecture today, my professor told us that the derivative of a row vector is a column vector. I worked with vector calculus before and never came across this. I suspect it is a notational issue but would greatly appreciate it if someone could elaborate on this.
 
  • #2
I've never encountered this convention, and I can't see what it adds to the presentation.
This seems like a perfectly good thing to ask your prof.
 
  • #3
Strictly speaking, a vector does not have a derivative. However, if you have a vector-valued function (for example a function representing position as a function of time), then you can certainly consider the derivative of that function; it will simply be another function (the velocity function).

Also, some people seem not to be bothered by switching a row vector to a column vector, so I suppose in a sense your prof can be right, but I don't like that approach. Suppose you have a vector function v(t); then its derivative with respect to t is simply the componentwise derivative of that function, which yields a row or column vector, depending on how you had it to start with.
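For instance (a simple made-up example), [itex]v(t) = (t^2, \; t^3)[/itex] has derivative [itex]v'(t) = (2t, \; 3t^2)[/itex], taken component by component, and it stays a row (or column) vector.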

Does this help?
 
  • #4
Lucid Dreamer said:
Hello,

In lecture today, my professor told us that the derivative of a row vector is a column vector. I worked with vector calculus before and never came across this. I suspect it is a notational issue but would greatly appreciate it if someone could elaborate on this.

The derivative (gradient) of a scalar with respect to a vector, i.e. of a scalar function of a vector variable, should be expressed as a dual vector. If you represent the vector variable as a column vector of variables, then the derivative (gradient) should be written as a row vector of partial derivatives.

Beyond that, (parametric) derivatives of row vectors yield row vectors, and likewise with column vectors (parametric = derivatives with respect to a parameter, the vectors being functions of a single variable, e.g. [itex]\vec{x}(t)[/itex]).
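As a concrete illustration of both statements (a simple made-up example):

[tex] R(\mathbf{x}) = x^2 + yz,\quad \mathbf{x}=\left[\begin{array}{c}x\\ y\\ z\end{array}\right] \quad\Longrightarrow\quad \frac{dR}{d\mathbf{x}} = \left(\begin{array}{ccc} \frac{\partial R}{\partial x} & \frac{\partial R}{\partial y} & \frac{\partial R}{\partial z}\end{array}\right) = \left(\begin{array}{ccc} 2x & z & y\end{array}\right) [/tex]

while a parametric derivative keeps the shape of the vector:

[tex] \frac{d}{dt}\left[\begin{array}{c} x(t)\\ y(t)\\ z(t)\end{array}\right] = \left[\begin{array}{c} x'(t)\\ y'(t)\\ z'(t)\end{array}\right] [/tex]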

Row vectors and column vectors live in different (though isomorphic) spaces. However when we work with general vectors (in yet another space) with a given basis we can write the basis expansion using both row and column vectors like this:

[tex] x\mathbf{i}+y\mathbf{j} + z\mathbf{k} = \left(\begin{array}{ccc} \mathbf{i} & \mathbf{j} & \mathbf{k} \end{array}\right)\left[\begin{array}{c}x \\ y \\ z \end{array}\right] = \left(\begin{array}{ccc} x & y & z \end{array}\right)\left[\begin{array}{c} \mathbf{i} \\ \mathbf{j} \\ \mathbf{k} \end{array}\right] [/tex]

This way we can represent general vectors (via a given basis) as either row vectors or column vectors of components and do so interchangeably. I find it convenient to (mostly) use column vectors to represent coordinate vectors and then row vectors to represent the dual gradients. But it is all a matter of convention and convenience.
 
  • #5
Perhaps I should have been more clear in my question. I am looking at dR(w)/dw where w is a vector and R(w) is a scalar function of a vector variable. In this case, jambaugh's post seems to make sense: if w is a row vector, then dR(w)/dw is a column vector. Thanks for all your help!
 
  • #6
Lucid Dreamer said:
Perhaps I should have been more clear in my question. I am looking at dR(w)/dw where w is a vector and R(w) is a scalar function of a vector variable. In this case, jambaugh's post seems to make sense: if w is a row vector, then dR(w)/dw is a column vector. Thanks for all your help!

Let me further elaborate as to why you get a dual vector. To generalize the idea of a derivative to vector calculus, use differentials as local coordinates in the local linear approximation to a function.

Given [itex] y = f(x)[/itex] then the local linear approximation is:
[tex] dy = f'(x)dx \quad\quad\text{ that is to say } y+dy = f(x)+f'(x)dx \approx f(x+dx)[/tex]
Allow either x or y or both to be vectors here. You then have the derivative as a linear-operator-valued function of x; this linear operator maps [itex]dx[/itex]-type objects to [itex]dy[/itex]-type objects. We can define it as a limit of a difference quotient if we are careful to avoid what looks like division by a vector:
[tex] \mathbf{dy} = f'(\mathbf{x})\mathbf{dx} \equiv \lim_{h\to 0} \frac{ f(\mathbf{x}+h\mathbf{dx}) - f(\mathbf{x})}{h}[/tex]
Here [itex] h[/itex] is a real number and the difference quotient is well defined provided the range and domain of f are vector spaces (so we can add elements and multiply by scalars h and 1/h). That includes of course the case of 1-dimensional vectors we call scalars.
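As a quick worked example of this definition with a scalar output and a vector input (my own example), take [itex]f(x,y) = x^2 y[/itex]:

[tex] \lim_{h\to 0}\frac{f(x + h\,dx,\; y + h\,dy) - f(x,y)}{h} = 2xy\,dx + x^2\,dy = \left(\begin{array}{cc} 2xy & x^2 \end{array}\right)\left[\begin{array}{c} dx\\ dy\end{array}\right] [/tex]

The result is linear in [itex]\mathbf{dx}[/itex], so [itex]f'(x,y)[/itex] acts on the differentials as a dual (row) vector.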

The nature of [itex]f'(x)[/itex], then, is that of a linear operator, and we have the following cases:
  • dy,dx both scalars: the linear operator [itex]f'[/itex] is just multiplication by a number.
  • dy vector and dx scalar: the linear operator [itex]f'[/itex] maps scalars to vectors and so is multiplication by a vector.
  • dy scalar and dx vector: the linear operator [itex]f'[/itex] maps vectors to scalars and so is a dual vector (linear functional).
  • dy vector and dx vector: [itex]f'[/itex] is a full-blown linear operator representable by a matrix.
Now as I mentioned, I prefer to use column vectors as coordinate vectors and row vectors for dual vectors so that in matrix format the action of [itex]f' [/itex] is left multiplication by a matrix.

e.g.
[tex]u = f(x,y);\quad du = f'(x,y)\left[\begin{array}{c}dx \\ dy \end{array}\right] = ( {\scriptsize{\frac{\partial u}{\partial x}\quad \frac{\partial u}{\partial y}}} ) \left[\begin{array}{c}dx \\ dy \end{array}\right][/tex]
[tex]\left[\begin{array}{c}u \\ v\end{array}\right] = \mathbf{F}(x,y,z) ; \quad \left[\begin{array}{c}du \\ dv \end{array}\right]
=\mathbf{F}' (x,y,z)\left[\begin{array}{c}dx \\ dy \\ dz \end{array}\right]= \left[\begin{array}{c c c} \scriptsize{ \frac{\partial u}{\partial x}} &\scriptsize{ \frac{\partial u}{\partial y}} &\scriptsize{ \frac{\partial u}{\partial z}}\\
\scriptsize{ \frac{\partial v}{\partial x}} &\scriptsize{ \frac{\partial v}{\partial y}} &\scriptsize{ \frac{\partial v}{\partial z}}\end{array}\right] \left[\begin{array}{c}dx \\ dy \\ dz \end{array}\right] [/tex]
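Here is a minimal numerical sketch of the 2×3 case above (the particular [itex]\mathbf{F}[/itex] is a made-up example, and I'm using Python with numpy just to check the arithmetic):
[code]
import numpy as np

# Made-up example F: R^3 -> R^2, (u, v) = (x*y + z, x + y*z)
def F(p):
    x, y, z = p
    return np.array([x*y + z, x + y*z])

# Its Jacobian matrix of partial derivatives, written out by hand
def J(p):
    x, y, z = p
    return np.array([[y,   x, 1.0],
                     [1.0, z, y  ]])

p  = np.array([1.0, 2.0, 3.0])        # the point (x, y, z)
dp = np.array([1e-6, -2e-6, 0.5e-6])  # a small displacement (dx, dy, dz)

print(F(p + dp) - F(p))  # actual change in (u, v)
print(J(p) @ dp)         # linear prediction [du, dv] = J [dx, dy, dz]
[/code]
The two printed vectors agree to first order in the displacement, which is exactly the statement [itex]\mathbf{dy} = \mathbf{F}'(\mathbf{x})\mathbf{dx}[/itex].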
Note that the derivative of a scalar-valued function of a vector variable is just the gradient, and in resolving a change of variables one gets the correct form of the gradient by preserving the differential relationship: [itex] du = \nabla u \cdot \mathbf{dr}[/itex].

2nd Note: Here I'm using primed notation just to match up with single-variable calculus notation. More traditionally one uses the [itex]\nabla[/itex] operator, [itex]F' \to \nabla F[/itex], or a Leibniz-type notation.

3rd Note: Reversing my use of rows vs columns would allow one to better express directional derivatives and the differential operator, e.g.:
[tex] \mathbf{d} =\left( \begin{array}{ccc}dx & dy & dz \end{array}\right) \left[ \begin{array}{c} \partial_x \\ \partial_y \\ \partial_z\end{array}\right][/tex]

A final note. Here I am treating the differentials simply as local coordinates, not as differential forms per se and not as infinitesimals. In the full-blown differential geometry of manifolds we can't add points, and the differentials become cotangent vectors while the partial derivatives become tangent vectors. Which objects we call "vectors" and which we call "dual vectors" is relative; the distinction between tangent vectors and cotangent vectors is not.
 
  • #7
I'm not sure if this is right, but here's what I think. Suppose
[tex] \frac{df}{d\vec{x}}: \mathbb{R}^m \rightarrow \mathbb{R}^n [/tex]
Then [itex] \frac{df}{d\vec{x}} [/itex] is an [itex] n \times m [/itex] matrix.

Let [itex] \vec{x} \in \mathbb{R}^m [/itex] so that [itex] \vec{x} [/itex] is an [itex] m \times 1 [/itex] column vector. In the special case where n = 1, [itex] \frac{df}{d\vec{x}} [/itex] is a [itex] 1 \times m [/itex] row vector.

Any thoughts?
 
  • #8
Lucid Dreamer said:
I'm not sure if this is right, but here's what I think. Suppose
[tex] \frac{df}{d\vec{x}}: \mathbb{R}^m \rightarrow \mathbb{R}^n [/tex]
Then [itex] \frac{df}{d\vec{x}} [/itex] is an [itex] n \times m [/itex] matrix.

Let [itex] \vec{x} \in \mathbb{R}^m [/itex] so that [itex] \vec{x} [/itex] is an [itex] m \times 1 [/itex] column vector. In the special case where n = 1, [itex] \frac{df}{d\vec{x}} [/itex] is a [itex] 1 \times m [/itex] row vector.

Any thoughts?
Did you mean to say: [itex] f:\mathbb{R}^m\to \mathbb{R}^n[/itex]?

To be ultra-precise, [itex]\frac{df}{d\vec{x}}[/itex] is an [itex]n\times m[/itex] matrix valued function and so [itex]\frac{df}{d\vec{x}}(\vec{x})[/itex] is an [itex]n\times m[/itex] matrix.

But beyond pedantic trivialities, yes you've got the gist of it.
 
  • #9
[itex] f: \mathbb{R}^m \rightarrow \mathbb{R}^n [/itex] is represented by an [itex] n \times m [/itex] matrix. I don't see how [itex] \frac{df}{d\vec{x}} [/itex] is also represented by an [itex] n \times m [/itex] matrix.

Would you be able to provide a reference text for vector calculus that also does a fair treatment of matrices?
 
  • #10
Lucid Dreamer said:
[itex] f: \mathbb{R}^m \rightarrow \mathbb{R}^n [/itex] is represented by an [itex] n \times m [/itex] matrix.
Only if [itex]f[/itex] itself is a linear mapping!
I don't see how [itex] \frac{df}{d\vec{x}} [/itex] is also represented by an [itex] n \times m [/itex] matrix.
In the linear case, [itex] f(x) = Mx,\quad \frac{df(x)}{dx} = M,\quad \frac{df(x)}{dx} \cdot dx = M\cdot dx [/itex] where [itex]M[/itex] is an [itex]n\times m[/itex] matrix.
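To see why, expand the difference directly:
[tex] f(\mathbf{x}+\mathbf{dx}) - f(\mathbf{x}) = M(\mathbf{x}+\mathbf{dx}) - M\mathbf{x} = M\,\mathbf{dx} \quad\Longrightarrow\quad \frac{df(\mathbf{x})}{d\mathbf{x}} = M \text{ for every } \mathbf{x}. [/tex]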

Would you be able to provide a reference text for vector calculus that also does a fair treatment of matrices?
I don't know of one offhand. You'll probably get more use from separate texts, one on linear algebra, the other a good calculus text.
 
  • #11
Lucid Dreamer said:
Would you be able to provide a reference text for vector calculus that also does a fair treatment of matrices?
The table of contents of Advanced Calculus of Several Variables looks pretty good. I haven't seen the book itself.
 
  • #12
Vector Calculus by Colley does an excellent job of explaining multivariable calculus in matrix form, and it also explains the linear algebra you need to manipulate these matrices.
 

FAQ: What is the derivative of a vector?

What is a derivative of a vector?

A derivative of a vector is a mathematical concept that represents the rate of change of a vector quantity with respect to a given variable. Geometrically, it can be thought of as the tangent vector to the curve traced out by a vector function at a specific point.

What are the different types of vector derivatives?

There are two main types of vector derivatives: the derivative of a vector-valued function with respect to a scalar variable (for example, differentiating a position vector with respect to time), and the derivative with respect to a vector variable, which yields a gradient for a scalar-valued function or a Jacobian matrix for a vector-valued function.

How is the derivative of a vector calculated?

The derivative of a vector function with respect to a scalar variable is calculated by taking the derivative of each component of the vector function with respect to that variable. This results in a new vector with the same number of components as the original vector.
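For example, for a helix traced out as a function of time (a standard textbook example):

[tex] \mathbf{r}(t) = \left[\begin{array}{c}\cos t\\ \sin t\\ t\end{array}\right] \quad\Longrightarrow\quad \frac{d\mathbf{r}}{dt} = \left[\begin{array}{c} -\sin t\\ \cos t\\ 1\end{array}\right] [/tex]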

What is the purpose of calculating the derivative of a vector?

The derivative of a vector is useful in many fields of science and engineering, such as physics, mechanics, and computer graphics. It allows us to analyze the behavior of vector quantities and make predictions about their future values.

What are some real-world applications of the derivative of a vector?

The derivative of a vector has many practical applications, such as predicting the trajectory of a moving object, determining the velocity and acceleration of an object, and optimizing the performance of complex systems in engineering and physics.
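As a minimal sketch of the velocity-and-acceleration use case (the sampled trajectory below is made up for illustration, using Python with numpy):
[code]
import numpy as np

# Made-up sampled 2D projectile trajectory r(t) = (x(t), y(t))
t = np.linspace(0.0, 2.0, 201)                           # sample times (s)
r = np.stack([5.0 * t, 10.0 * t - 4.9 * t**2], axis=1)   # positions (m)

dt = t[1] - t[0]
v = np.gradient(r, dt, axis=0)   # velocity: derivative of each component
a = np.gradient(v, dt, axis=0)   # acceleration: derivative of velocity

print(v[100])   # approximately (5.0, 10.0 - 9.8*t[100]) = (5.0, 0.2)
print(a[100])   # approximately (0.0, -9.8)
[/code]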
