Existence of Directional Derivative in Normed Linear Space

In summary, the conversation asks whether, in a finite-dimensional normed linear space, every point admits a direction along which the norm cannot decrease. It also touches on the distribution of regular and singular matrices, on how continuity and density arguments prove the relevant statements, and on whether a supporting hyperplane supplies the desired directions.
  • #1
Gear300
Given a finite-dimensional normed linear space ##(L,\lVert \cdot \rVert)##, is there anything that suggests that at every point ##x_0 \in L##, there exists a direction ##\delta \in L## such that ##\lVert x_0 + t\delta \rVert \geqslant \lVert x_0 \rVert## for all ##t \in \mathbb{R}##?
 
  • #2
Gear300 said:
Given a finite-dimensional normed linear space ##(L,\lVert \cdot \rVert)##, is there anything that suggests that at every point ##x_0 \in L##, there exists a direction ##\delta \in L## such that ##\lVert x_0 + t\delta \rVert \geqslant \lVert x_0 \rVert## for all ##t \in \mathbb{R}##?
No. Take ##L = \mathbb{R}##, ##x_0 = 2##, ##\delta = 1##, and ##t = -1##: then ##\lVert x_0 + t\delta \rVert = |2 + (-1)\cdot 1| = 1 < 2 = \lVert x_0 \rVert##.
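In fact, in dimension one no direction works for ##x_0 \neq 0## at all: for any ##\delta \neq 0##, the choice ##t = -x_0/\delta## gives
$$\lVert x_0 + t\delta \rVert = \lVert 0 \rVert = 0 < \lVert x_0 \rVert.$$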
 
  • #3
fresh_42 said:
No. Take ##L = \mathbb{R}##, ##x_0 = 2##, ##\delta = 1##, and ##t = -1##: then ##\lVert x_0 + t\delta \rVert = |2 + (-1)\cdot 1| = 1 < 2 = \lVert x_0 \rVert##.
Point taken. But what if the dimension of the space is ##n \geqslant 2##? I figured there should be ##n - 1## such directions. The reason I ask is the attached question: finding a minimizing matrix ##B## has been simple enough for p-like norms, but I haven't found one for arbitrary norms in general, and proving the posted statement seemed like it would suffice to settle the question. If the norm were differentiable, I could take directions orthogonal to the gradient and inspect the Hessian, but I suspect differentiability is not guaranteed at every ##x_0 \in L##, such as with the supremum norm ##p = \infty##.
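For instance, the supremum norm on ##\mathbb{R}^2## already fails to be differentiable at ##x_0 = (1,1)##: in the direction ##\delta = (1,-1)##,
$$\lVert (1+t,\, 1-t) \rVert_\infty = 1 + |t|,$$
which has no derivative at ##t = 0##.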
 

Attachments

  • pic1.png (the referenced exercise)
  • #4
Gear300 said:
Point taken. But what if the dimension of the space is ##n \geqslant 2##? I figured there should be ##n - 1## such directions.
Same example: ##||(2,2) - (1,1)||= \sqrt{2} < 2\sqrt{2} = ||(2,2)||\,.##
The reason I ask is the attached question: finding a minimizing matrix ##B## has been simple enough for p-like norms, but I haven't found one for arbitrary norms in general, and proving the posted statement seemed like it would suffice to settle the question. If the norm were differentiable, I could take directions orthogonal to the gradient and inspect the Hessian, but I suspect differentiability is not guaranteed at every ##x_0 \in L##, such as with the supremum norm ##p = \infty##.
The point is that regular matrices form a dense subset, so the distance to another regular matrix is arbitrarily small, but finding the nearest singular one is not as trivial. The singular matrices form a closed subset, and the condition number measures how close a given matrix is to them. Since we measure a distance, this depends a lot on the norm.
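For the spectral norm this is made precise by the Eckart–Young theorem: the distance from a regular matrix ##A## to the set of singular matrices equals its smallest singular value. A minimal numerical sketch, assuming NumPy:

Python:
import numpy as np

# Eckart-Young: in the spectral norm, the distance from A to the
# nearest singular matrix equals the smallest singular value of A.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
sigma = np.linalg.svd(A, compute_uv=False)   # singular values, descending
print("smallest singular value:", sigma[-1])

# The nearest singular matrix B is obtained by zeroing out the
# smallest singular value in the SVD of A.
U, S, Vt = np.linalg.svd(A)
S[-1] = 0.0
B = U @ np.diag(S) @ Vt
print("rank of B:", np.linalg.matrix_rank(B))     # n - 1 = 1
print("||A - B||_2:", np.linalg.norm(A - B, 2))   # equals sigma[-1]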
 
  • #5
fresh_42 said:
The point is that regular matrices form a dense subset, so the distance to another regular matrix is arbitrarily small, but finding the nearest singular one is not as trivial. The singular matrices form a closed subset, and the condition number measures how close a given matrix is to them. Since we measure a distance, this depends a lot on the norm.
Might this be provable by inspecting the continuity of the determinant as a polynomial function? The zeros would then correspond to a closed subset of singular matrices. I'm just wondering how this is normally proved.
 
  • #6
Gear300 said:
Might this be provable by inspecting the continuity of the determinant as a polynomial function? The zeros would then correspond to a closed subset of singular matrices. I'm just wondering how this is normally proved.
Yes. The singular matrices are the preimage of the closed set ##\{\,0\,\}## under the determinant.
 
  • #7
fresh_42 said:
Yes. The singular matrices are the preimage of the closed set ##\{\,0\,\}## under the determinant.
Alright then. Would you mind recommending a book on this? :biggrin: The book I'm using here is Rainer Kress's Numerical Analysis, and I've gotten up to exactly that question with what I know about Banach spaces, so I wouldn't mind additional references.
 
  • #8
Gear300 said:
So that's how it is normally proved? (Apologies for the persistence, but just being sure.)
Well, we need the determinant to be continuous, and that ##\{\,0\,\} \subseteq \mathbb{R}## is closed. Continuity is equivalent to "preimages of closed sets are closed", so ##\{\,M\in \mathbb{M}_n(\mathbb{R})\,|\,\det M = 0\,\} \subseteq \mathbb{M}_n(\mathbb{R})## is closed. This means at the same time that ##GL_n(\mathbb{R}) = \{\,M\in \mathbb{M}_n(\mathbb{R})\,|\,\det M \neq 0\,\}## is open. Density might depend on the norm used, but usually, if we change a few matrix entries by arbitrarily small amounts, we will almost always get a regular matrix. Maybe it won't work with a discrete metric.
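A quick numerical illustration of the density, as a sketch assuming NumPy (the perturbation sizes are arbitrary choices):

Python:
import numpy as np

rng = np.random.default_rng(0)

# Start from a singular matrix and perturb its entries by tiny amounts;
# almost every perturbation yields a regular (invertible) matrix.
M = np.array([[1.0, 2.0],
              [2.0, 4.0]])        # det M = 0
print("det M:", np.linalg.det(M))

for eps in (1e-3, 1e-6, 1e-9):
    M_eps = M + eps * rng.standard_normal(M.shape)
    print(eps, np.linalg.det(M_eps))   # nonzero with probability 1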
 
  • #9
fresh_42 said:
Well, we need the determinant to be continuous, and that ##\{\,0\,\} \subseteq \mathbb{R}## is closed. Continuity is equivalent to "preimages of closed sets are closed", so ##\{\,M\in \mathbb{M}_n(\mathbb{R})\,|\,\det M = 0\,\} \subseteq \mathbb{M}_n(\mathbb{R})## is closed. This means at the same time that ##GL_n(\mathbb{R}) = \{\,M\in \mathbb{M}_n(\mathbb{R})\,|\,\det M \neq 0\,\}## is open. Density might depend on the norm used, but usually, if we change a few matrix entries by arbitrarily small amounts, we will almost always get a regular matrix. Maybe it won't work with a discrete metric.
An ##n \times n## matrix lives in a linear space of dimension ##n^2##, and since all norms on a finite-dimensional space are equivalent, all matrix norms are in particular equivalent to the Frobenius norm.
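For example, the spectral and Frobenius norms satisfy
$$\lVert A \rVert_2 \;\leqslant\; \lVert A \rVert_F \;\leqslant\; \sqrt{n}\, \lVert A \rVert_2,$$
with equality on the left for rank-one matrices and on the right for the identity.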

That aside, like you intimated, I think my assertion is true. Funny it took me so long, but for a norm function ##p(x)##, we can consider the surface of constant norm ##p(x_0)##. Since the ball it bounds is convex, we should always be able to come up with a supporting (tangent) hyperplane of dimension ##n-1## that satisfies the desired properties, correct?
 
  • #10
I still doubt that it works for all ##t\in \mathbb{R}##, but I'm not sure. You claim that each point is a norm minimum along at least one direction, but with arbitrary ##t## these are two opposite directions. I cannot really imagine such a situation except for ##x_0=0##.

There is a separation theorem for hyperplanes in Hilbert spaces, IIRC, but we don't have a norm induced by an inner product here. I might have been thinking too much in terms of a metric, which is too weak.
 
  • #11
fresh_42 said:
I still doubt that it works for all ##t\in \mathbb{R}##, but I'm not sure. You claim that each point is a norm minimum along at least one direction, but with arbitrary ##t## these are two opposite directions. I cannot really imagine such a situation except for ##x_0=0##.

There is a separation theorem for hyperplanes in Hilbert spaces, IIRC, but we don't have a norm induced by an inner product here. I might have been thinking too much in terms of a metric, which is too weak.
So I went back through Kolmogorov's Real Analysis text and found Problem 5c on p. 141. The interior of the unit ball in any normed space is an open convex set, so for every point on its surface a tangent hyperplane should exist, in the sense that it never touches the interior.
 

Attachments

  • 128-129.png, 130-131.png, 132-133.png, 134-135.png, 136-137.png, 138-139.png, 140-141.png (scanned pages from Kolmogorov's text)
  • #12
fresh_42 said:
No. ##1=|1|=|2+ (-1)\cdot 1|=\lVert x_0 + t\delta \rVert < \lVert x_0 \rVert = |2|=2##
I don't understand: doesn't the OP ask for _a_ direction and not for _every_ direction? Then you can consider ##\lVert 2 + 1\cdot 1 \rVert = \lVert 3 \rVert = 3 > 2##?
 
  • #13
Gear300 said:
An ##n \times n## matrix lives in a linear space of dimension ##n^2##, and since all norms on a finite-dimensional space are equivalent, all matrix norms are in particular equivalent to the Frobenius norm.

That aside, like you intimated, I think my assertion is true. Funny it took me so long, but for a norm function ##p(x)##, we can consider the surface of constant norm ##p(x_0)##. Since the ball it bounds is convex, we should always be able to come up with a supporting (tangent) hyperplane of dimension ##n-1## that satisfies the desired properties, correct?
I think this is one of the versions of Hahn-Banach.
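Concretely: Hahn–Banach provides a supporting functional ##f## at ##x_0##, i.e. a linear functional with ##\lVert f \rVert = 1## and ##f(x_0) = \lVert x_0 \rVert##. For every ##\delta## in the ##(n-1)##-dimensional hyperplane ##\ker f## and every ##t \in \mathbb{R}##,
$$\lVert x_0 + t\delta \rVert \;\geqslant\; |f(x_0 + t\delta)| \;=\; |f(x_0)| \;=\; \lVert x_0 \rVert,$$
which is exactly the property asked for in the original post.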
 

FAQ: Existence of Directional Derivative in Normed Linear Space

1. What is a directional derivative in a normed linear space?

A directional derivative in a normed linear space measures how a function changes in a particular direction at a specific point. It generalizes the derivative of single-variable calculus to higher dimensions: it is the rate of change of the function along the given direction.

2. How is the directional derivative defined in a normed linear space?

The directional derivative of a function ##f## at a point ##x## in a normed linear space, in the direction ##v##, is defined as

$$D_v f(x) = \lim_{h \to 0} \frac{f(x + hv) - f(x)}{h},$$

where ##v## is the direction vector and ##h## is a real scalar tending to zero.
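As a numerical illustration, here is a sketch in Python (the function, point, and direction are arbitrary choices):

Python:
import numpy as np

def directional_derivative(f, x, v, h=1e-7):
    """Central-difference approximation of D_v f(x)."""
    return (f(x + h * v) - f(x - h * v)) / (2 * h)

# Example: f(x) = ||x||_2 on R^2, away from the origin.
f = lambda x: np.linalg.norm(x)
x = np.array([3.0, 4.0])
v = np.array([1.0, 0.0])

# Analytically, D_v f(x) = <x, v> / ||x|| = 3/5 here.
print(directional_derivative(f, x, v))   # ~0.6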

3. What is the significance of the existence of the directional derivative in a normed linear space?

The existence of the directional derivative at a point, in the direction of a given vector, means that the one-dimensional restriction ##t \mapsto f(x + tv)## is differentiable at ##t = 0##. The directional derivative is then the slope of the tangent line to this restriction, i.e. the rate of change of the function at that point in that direction.

4. How is the existence of the directional derivative related to the continuity of a function in a normed linear space?

In a normed linear space, the existence of the directional derivative at a point in a given direction implies continuity of the function along that direction at that point: the restriction ##t \mapsto f(x + tv)## is differentiable, hence continuous, at ##t = 0##. It does not, however, imply continuity of the function itself at that point.
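The standard counterexample is
$$f(x,y) = \frac{x^2 y}{x^4 + y^2}, \qquad f(0,0) = 0,$$
which has every directional derivative at the origin yet satisfies ##f(x, x^2) = \tfrac{1}{2}## for ##x \neq 0##, so it is not continuous at ##(0,0)##.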

5. Are there any conditions that must be met for the existence of the directional derivative in a normed linear space?

Yes. The function must be defined near the point in question (at least along the line through the point in the given direction), and the direction vector must be well-defined and nonzero. The directional derivative in that direction then exists precisely when the two-sided limit of the difference quotient above exists; it can exist in some directions and fail to exist in others.
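For example, for ##f(x) = |x|## on ##\mathbb{R}## with the definition above, the quotient ##\frac{f(0 + h) - f(0)}{h} = \frac{|h|}{h}## has no two-sided limit as ##h \to 0##, so the directional derivative at ##0## in the direction ##v = 1## does not exist (only the one-sided limits do).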
