I don't understand what my professor told me. He said that if the Fourier series of a function f is not continuous, then it doesn't converge uniformly to f.
But the given theorem only states that "if each function in a sequence of functions is continuous and the sequence converges uniformly to f, then f is continuous."
I'm given that f is not continuous, and that the Fourier series of f is not continuous. How does this imply that the Fourier series doesn't converge uniformly to f?
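To make sure I'm reading it right, here is that theorem written out symbolically (the $s_n$ notation for the sequence is my own, not from the statement):
$$ s_n \text{ continuous for every } n \quad \text{and} \quad s_n \to f \text{ uniformly} \quad \Longrightarrow \quad f \text{ continuous}. $$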
This is just an application of logic here:
A & B => C
not C => (not A) or (not B)
Knowing (not A) doesn't really give me (not B).
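Spelled out, this is how I'm matching the letters to the theorem (my own labelling, so maybe this is where I'm going wrong):
$$ A:\ \text{each function in the sequence is continuous}, \qquad B:\ \text{the sequence converges uniformly to } f, \qquad C:\ f \text{ is continuous}, $$
$$ A \land B \Rightarrow C, \qquad \text{equivalently} \qquad \lnot C \Rightarrow \lnot A \lor \lnot B. $$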