Vector space of functions defined by a condition

In summary: This too is not continuous at... ##f_2(x) = x##, ##f_3(x) = x^2##, ##f_4(x) = x^3##. I'm sorry, but no. Those functions are not in the space.
  • #36
Hall said:
I can fix it by changing ##(0,0,0,0,1)## to ##(1,0,0,0,1)## thus converting ##f_5## to
$$
f_5 =
\begin{cases}
x^2 & x \in [0,1]\\
x^3 & x \in [1,2]\\
\end{cases}$$
Let's see now. That gives you: ##\displaystyle f_5=\begin{cases} x^2 & x \in [0,1]\\ x^3 & x \in [1,2]\\ \end{cases}## and ##\displaystyle f_1=\begin{cases} x & x \in [0,1]\\ x^3 & x \in [1,2]\\ \end{cases}##
So that ##\displaystyle f_5-f_1=\begin{cases} x^2-x & x \in [0,1]\\ 0 & x \in [1,2]\\ \end{cases}##
 
  • #37
SammyS said:
Let's see now. That gives you: ##\displaystyle f_5=\begin{cases} x^2 & x \in [0,1]\\ x^3 & x \in [1,2]\\ \end{cases}## and ##\displaystyle f_1=\begin{cases} x & x \in [0,1]\\ x^3 & x \in [1,2]\\ \end{cases}##
So that ##\displaystyle f_5-f_1=\begin{cases} x^2-x & x \in [0,1]\\ 0 & x \in [1,2]\\ \end{cases}##
##f_5 - f_1## is continuous at ##x = 1##, and is really the special case of
$$
f(x) =
\begin{cases}
ax^2 +bx & x \in [0,1]\\
Ax^3 + Bx^2 +Cx + D & x \in [1,2]\\
\end{cases}$$
when ##a=1, b= -1, A=B=C=D=0##.
 
  • #38
Hall said:
##f_5 - f_1## is continuous at ##x = 1##, and is really the special case of
$$
f(x) =
\begin{cases}
ax^2 +bx & x \in [0,1]\\
Ax^3 + Bx^2 +Cx + D & x \in [1,2]\\
\end{cases}$$
when ##a=1, b= -1, A=B=C=D=0##.
Yes, that is the correct version of what you first tried when you wanted to write ##(0,0,0,0,1)##.
 
  • #39
@Hall
What you are doing is proposing candidate solutions seemingly at random. You know you are looking for a basis of a five-dimensional space. Once you have your five candidates in ##V##, check that they are linearly independent. That is sufficient to solve the problem.
 
  • #40
nuuskur said:
You know you are looking for a basis in a five dimensional space.
That is the issue here. I really don't know how ##V## is five-dimensional.

Up to now in my self-studies, I have determined the dimension of a vector space only by first constructing a basis and then counting its elements. But in this case constructing a basis seems tougher than determining the dimension; and that is why everyone seems so obtuse to me.
 
  • #41
Hall said:
That is the issue here. I really don't know how ##V## is five-dimensional.

Up to now in my self-studies, I have determined the dimension of a vector space only by first constructing a basis and then counting its elements. But in this case constructing a basis seems tougher than determining the dimension; and that is why everyone seems so obtuse to me.
If you remove the condition of continuity, then the space is clearly six-dimensional (you pick values for six independent constants). Imposing continuity introduces one linear constraint, which removes one dimension. Hence, five dimensions.
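This counting argument can be sketched numerically. A minimal check, assuming NumPy; the six coefficients ##(a, b, A, B, C, D)## follow the general form quoted earlier in the thread, where continuity at ##x = 1## reads ##a + b = A + B + C + D##:

```python
import numpy as np

# Without continuity there are 6 free coefficients: f = ax^2 + bx on
# [0,1] and Ax^3 + Bx^2 + Cx + D on [1,2].  Matching values at x = 1
# imposes the single linear constraint  a + b - A - B - C - D = 0.
constraint = np.array([[1, 1, -1, -1, -1, -1]])  # 1 x 6 constraint matrix

# dim V = number of coefficients minus the rank of the constraint
dim_V = 6 - np.linalg.matrix_rank(constraint)
print(dim_V)  # 5
```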
 
  • #42
Hall said:
I have determined the dimension of a vector space only by first constructing a basis and then counting its elements.
That works. You show that the subset is spanning and linearly independent. It can be quite tedious.
Hall said:
But in this case constructing a basis seems tougher than determining the dimension; and that is why everyone seems so obtuse to me.
Indeed, which is why I proposed a much faster solution in my first post. Finite-dimensional vector spaces are determined up to isomorphism by their dimension and underlying field. So, to understand any ##n##-dimensional space over ##\mathbb R##, it suffices to understand ##\mathbb R^n##.

On the topic of self-study, I recommend G. Strang's course, complete with video lectures and problem sets & solutions. Happy hunting.
 
  • #43
Hall said:
That is the issue here. I really don't know how ##V## is five-dimensional.
At first I thought it was 7-dimensional: 2 dimensions on ##[0,1]## times 4 dimensions on ##[1,2]##, minus 1 for continuity. I had probably mistaken the construction for a tensor product. It's not that strange to be slower to see this.

It's a Cartesian product of functions on ##[0,1]## and functions on ##[1,2]##, and for Cartesian products dimensions add. Next: every set of ##n## linearly independent requirements removes ##n## dimensions. We have one requirement, so subtract 1.
 
  • #44
@Hall

An important step is to verify that ##V## is a vector space. It's obvious, but you should still make it explicit. You can do that by checking all the axioms of vector spaces (tedious). Alternatively, can you name a vector space that contains ##V## as a subset?
 
  • #45
nuuskur said:
Alternatively, can you name a vector space that contains V as a subset?
Space of all continuous functions on interval ##[0,2]##.
Space of all continuous functions.
Space of all polynomial functions.
 
  • #46
Hall said:
Space of all continuous functions on interval ##[0,2]##.
Space of all continuous functions.
Yes. The space of all bounded functions on ##[0,2]## also suffices.
Hall said:
Space of all polynomial functions.
Piecewise polynomials.
 
  • #47
nuuskur said:
Alternatively, can you name a vector space that contains V as a subset?
Being a subset of a vector space is not sufficient. You need to confirm that it is a subspace.
 
  • #48
Orodruin said:
Being a subset of a vector space is not sufficient. You need to confirm that it is a subspace.
I'm aware of that. I didn't claim it was sufficient, either.
 
  • #49
nuuskur said:
I'm aware of that. I didn't claim it was sufficient, either.
Not explicitly, but reading your post it appears as an alternative to checking the vector space axioms.
 
  • #50
@Orodruin
Students usually have no trouble verifying closure with respect to the zero vector, addition, and scalar multiplication, and then declaring that it's a vector space. It gets dicey when I say, "you have checked it's a subspace, but of which vector space is it a subspace?"

Orodruin said:
Not explicitly, but reading your post it appears as an alternative to checking the vector space axioms.
That's exactly the point. Part of my teaching philosophy is to Not give all the details. Consequently, the student also learns to Not assume what I have Not stated.
 
  • #51
nuuskur said:
That's exactly the point. Part of my teaching philosophy is to Not give all the details. Consequently, the student also learns to Not assume what I have Not stated.
The thing is, you presented it in a way that made it appear as an alternative to checking the vector space axioms. That, to me, is not "omitting a few details" but "omitting details in such a way that students are misled on purpose".

Things like commutativity of function addition are as easy to show directly in the subspace as in the general function space.
 
  • #52
Orodruin said:
The thing is, you presented it in a way that made it appear as an alternative to checking the vector space axioms. That, to me, is not "omitting a few details" but "omitting details in such a way that students are misled on purpose".
The goal is to incite questions/protest. That goal is evidently achieved.
 
  • #53
nuuskur said:
The goal is to incite questions/protest. That goal is evidently achieved.
Protest from me, not the OP… that can hardly be interpreted as a sign of the OP realising the subtext.
 
  • #54
Orodruin said:
Protest from me, not the OP… that can hardly be interpreted as a sign of the OP realising the subtext.
Indeed. Perhaps next time the OP will recall this exchange.

You needn't worry about the what-if in which you hadn't said anything: I would have.
 
  • #55
Recall that if there is a surjective linear transformation T from V to W, then the dimension of V equals the dimension of W plus the dimension of the kernel of T. In your case there is a nice surjective linear transformation from your space V to the 2-dimensional space of functions spanned by x and x^2 on [0,1], namely restriction to [0,1]. So what is the kernel? It should be pretty clear that the kernel is 3-dimensional, and hence V is 5-dimensional.

Then to get a basis of V, take a basis of the kernel, and add in any pair of functions in V that restrict respectively to x and x^2 on [0,1]. (There is an obvious choice.)

Apologies if I have missed this hint in earlier discussions. This is of course inspired by what is said earlier.
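This rank–nullity argument can be checked in coordinates with a small sketch (assuming NumPy; ##V## is modelled by coefficient vectors ##(a, b, A, B, C, D)## satisfying ##a + b = A + B + C + D##, and restriction to ##[0,1]## becomes projection onto ##(a, b)##):

```python
import numpy as np

# Coefficient vectors (a, b, A, B, C, D) satisfying the continuity
# constraint a + b = A + B + C + D; these five vectors span V in R^6.
V_basis = np.array([
    [-1, 1, 0, 0, 0, 0],
    [ 1, 0, 1, 0, 0, 0],
    [ 1, 0, 0, 1, 0, 0],
    [ 1, 0, 0, 0, 1, 0],
    [ 1, 0, 0, 0, 0, 1],
])
assert np.linalg.matrix_rank(V_basis) == 5  # independent, so dim V = 5

# The restriction map T: f -> f|_[0,1] keeps only (a, b).
image = V_basis[:, :2]                    # T applied to each basis vector
dim_image = np.linalg.matrix_rank(image)  # dim T(V), spanned by x and x^2
dim_kernel = 5 - dim_image                # by rank-nullity on V
print(dim_image, dim_kernel)  # 2 3
```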
 
  • #56
mathwonk said:
Recall that if there is a surjective linear transformation T from V to W, then the dimension of V equals the dimension of W plus the dimension of the kernel of T. In your case there is a nice surjective linear transformation from your space V to the 2-dimensional space of functions spanned by x and x^2 on [0,1], namely restriction to [0,1]. So what is the kernel? It should be pretty clear that the kernel is 3-dimensional, and hence V is 5-dimensional.

Then to get a basis of V, take a basis of the kernel, and add in any pair of functions in V that restrict respectively to x and x^2 on [0,1]. (There is an obvious choice.)

Apologies if I have missed this hint in earlier discussions. This is of course inspired by what is said earlier.
As we have to maintain continuity at ##x = 1##, we can write our ##f## as
$$
f(x)=
\begin{cases}
ax^2 + bx & x \in [0,1] \\
A(x^3 -1) + B(x^2 -1) + C(x-1) + a+b & x \in[1,2] \\
\end{cases}$$
(credit: @Steve4Physics)

Let ##T## be the operator that restricts a function to ##[0,1]##, so ##T : V \to W##, where ##V## is the space given in the question and ##W## is the space of all polynomials ##p## of degree at most 2 with ##p(0) = 0##. Then
##T(f) = ax^2 + bx##

The functions that ##T## sends to ##0## are exactly those of the form
$$
f(x) =
\begin{cases}
0 & x \in [0,1] \\
A(x^3-1) + B(x^2 -1) +C(x-1) & x \in[1,2]\\
\end{cases}
$$
(continuity at ##x = 1## is maintained).

So the null space is the space of all such functions. A basis for it is
##
f_1 =
\begin{cases}
0 & x \in[0,1]\\
x^3-1 & x \in [1,2] \\
\end{cases}
##

##
f_2 =
\begin{cases}
0 & x \in[0,1] \\
x^2 -1 & x \in [1,2] \\
\end{cases}
##

##
f_3 =
\begin{cases}
0 & x \in[0,1]\\
x-1 & x \in [1,2]\\
\end{cases}##

A basis for the range of ##T## is ##\{x^2, x\}##.

##\dim V = 3 + 2 = 5##

A basis for ##V## is ##\{f_1, f_2, f_3, f_4, f_5\}##, where
$$
f_4 =
\begin{cases}
x^2 & x \in [0,1] \\
1 & x \in [1,2]\\
\end{cases}
$$
$$
f_5 =
\begin{cases}
x & x \in[0,1] \\
1 & x \in[1,2]\\
\end{cases}
$$

Your hint was really unique.
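As a sanity check (a sketch assuming NumPy, not part of the original argument), the five functions above can be verified to be continuous at ##x = 1## and linearly independent; full rank of a sample matrix is sufficient for independence:

```python
import numpy as np

# The five basis functions from the post, as (piece on [0,1],
# piece on [1,2]) pairs.
pairs = [
    (lambda x: 0 * x,  lambda x: x**3 - 1),    # f1
    (lambda x: 0 * x,  lambda x: x**2 - 1),    # f2
    (lambda x: 0 * x,  lambda x: x - 1),       # f3
    (lambda x: x**2,   lambda x: 1 + 0 * x),   # f4
    (lambda x: x,      lambda x: 1 + 0 * x),   # f5
]

# Each function must be continuous at x = 1: the two pieces agree there.
for left, right in pairs:
    assert left(1.0) == right(1.0)

# Independence: sample at six points of [0,2]; if the 5x6 sample matrix
# has rank 5, the functions are linearly independent (full rank on
# samples is sufficient, though not necessary in general).
xs = [0.2, 0.5, 0.8, 1.2, 1.5, 1.8]
M = np.array([[l(x) if x <= 1 else r(x) for x in xs] for l, r in pairs])
print(np.linalg.matrix_rank(M))  # 5
```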
 
  • #57
This solution is for @Steve4Physics ("attributed to" wouldn't be right, because I don't know if he really wanted me to do it like this).

$$f(x) = \begin{cases} ax^2 + bx & x \in [0,1] \\ Ax^3 + Bx^2 + Cx + (a+b -A-B-C) & x \in [1,2] \end{cases}$$
This can be written as ##(ax^2 + bx,\ A(x^3 - 1) + B(x^2 - 1) + C(x - 1) + a + b)##, where the first coordinate gives the function for ##x \in [0,1]## and the second for ##x \in [1,2]##. Now it's quite easy to find the basis:
$$
a(x^2,1) + b(x, 1) + A(0, x^3 -1) +B(0, x^2 -1) + C(0, x-1)$$
 
  • #58
I would have transformed this problem into ##\mathbb R^6## by mapping the piecewise polynomial to ##(a, b, A, B, C, D)## with the condition that ##a + b = A + B + C + D##.

Then, for a basis you just take each of ##a,b,A,B,C## equal to ##1## in turn and set ##D## accordingly.
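That recipe can be sketched directly (assuming NumPy for the rank check; ##D## is set to ##a + b - A - B - C## so that each vector satisfies the continuity condition):

```python
import numpy as np

# Take each of a, b, A, B, C equal to 1 in turn, and set
# D = a + b - A - B - C so the continuity condition holds.
basis = []
for i in range(5):
    a, b, A, B, C = (1 if j == i else 0 for j in range(5))
    D = a + b - A - B - C
    basis.append([a, b, A, B, C, D])

M = np.array(basis)
print(np.linalg.matrix_rank(M))  # 5 -> a basis of the 5-dimensional V
```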
 
  • #59
Hall said:
This solution is for ("attributed to" wouldn't be right because I don't know if he really wanted me to do like this) @Steve4Physics
For information, @Hall is referring to a suggested approach (not a solution) to a similar, simpler problem, in a PM.
 