Showing that f(x,y) = √|xy| is not differentiable at (0,0)

In summary: keeping the notation ##\vec{\lambda}## for the candidate linear map and approaching ##(0,0)## along the line ##y = x## with parameter ##t##, the difference quotient reduces (with ##\vec{\lambda} = 0##) to ##\frac{|t|}{\sqrt{2}\,t}##, which tends to ##1/\sqrt{2}## as ##t \rightarrow 0## through positive values (the first quadrant) and to ##-1/\sqrt{2}## through negative values (the third quadrant). The limit therefore does not exist, and ##f## is not differentiable at ##(0,0)##.
  • #1
Eclair_de_XII

Homework Statement


"Let ##f:ℝ^2\rightarrow ℝ## be defined by ##f(x,y)=\sqrt{|xy|}##. Show that ##f## is not differentiable at ##(0,0)##."

Homework Equations


Differentiability: If ##f:ℝ^n\rightarrow ℝ^m## is differentiable at ##a\in ℝ^n##, then there exists a unique linear transformation ##\lambda:ℝ^n\rightarrow ℝ^m## such that ##\lim_{h\rightarrow 0} \frac{f(a+h)-f(a)-\lambda (h)}{|h|} = 0##.

The Attempt at a Solution


So far, with ##h=(b,k)##, I have the equality:

##\lim_{h\rightarrow 0} \frac{f(0+h)-f(0)-\lambda (h)}{|h|} = \lim_{h\rightarrow 0} \frac{f(h)-\lambda (h)}{|h|}=\lim_{h\rightarrow 0} \frac{\sqrt{|bk|}-\lambda (h)}{\sqrt{b^2+k^2}}=\lim_{h\rightarrow 0} \frac{\sqrt{|\frac{b}{k}|}-\frac{\lambda (h)}{|k|}}{\sqrt{(\frac{b}{k})^2+1}}##

I don't know how to prove that this limit is not zero, though. All I can think of is to argue that the only way the limit can be zero is if ##\lambda (h) = \sqrt{|bk|}##, which cannot be linear, since it respects neither vector addition nor scalar multiplication by negative numbers.
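To make that last point concrete: scaling ##h = (b,k)## by a scalar ##c## gives ##\sqrt{|(cb)(ck)|} = |c|\sqrt{|bk|}##, which equals ##c\sqrt{|bk|}## only when ##c \ge 0##, so homogeneity already fails for negative ##c##.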
 
  • #2
Hint: Differentiability is equivalent to the existence and continuity of the partial derivatives, so you might want to investigate them.
 
  • #3
Eclair_de_XII said:

Homework Statement


"Let ##f:ℝ^2\rightarrow ℝ## be defined by ##f(x,y)=\sqrt{|xy|}##. Show that ##f## is not differentiable at ##(0,0)##."

Homework Equations


Differentiability: If ##f:ℝ^n\rightarrow ℝ^m## is differentiable at ##a\in ℝ^n##, then there exists a unique linear transformation ##\lambda:ℝ^n\rightarrow ℝ^m## such that ##\lim_{h\rightarrow 0} \frac{f(a+h)-f(a)-\lambda (h)}{|h|} = 0##.
Your relevant equation isn't relevant. The function you're working with has two arguments, not one.
Eclair_de_XII said:

The Attempt at a Solution


So far, with ##h=(b,k)##, I have the equality:

##\lim_{h\rightarrow 0} \frac{f(0+h)-f(0)-\lambda (h)}{|h|} = \lim_{h\rightarrow 0} \frac{f(h)-\lambda (h)}{|h|}=\lim_{h\rightarrow 0} \frac{\sqrt{|bk|}-\lambda (h)}{\sqrt{b^2+k^2}}=\lim_{h\rightarrow 0} \frac{\sqrt{|\frac{b}{k}|}-\frac{\lambda (h)}{|k|}}{\sqrt{(\frac{b}{k})^2+1}}##
As noted by another person responding, you need to be working with partial derivatives, not the ordinary derivative.
Eclair_de_XII said:
I don't know how to prove that this limit is not zero, though. All I can think of is to argue that the only way the limit can be zero is if ##\lambda (h) = \sqrt{|bk|}##, which cannot be linear, since it respects neither vector addition nor scalar multiplication by negative numbers.
 
  • #4
Eclair_de_XII said:

Homework Statement


"Let ##f:ℝ^2\rightarrow ℝ## be defined by ##f(x,y)=\sqrt{|xy|}##. Show that ##f## is not differentiable at ##(0,0)##."

Homework Equations


Differentiability: If ##f:ℝ^n\rightarrow ℝ^m## is differentiable at ##a\in ℝ^n##, then there exists a unique linear transformation ##\lambda:ℝ^n\rightarrow ℝ^m## such that ##\lim_{h\rightarrow 0} \frac{f(a+h)-f(a)-\lambda (h)}{|h|} = 0##.

The Attempt at a Solution


So far, with ##h=(b,k)##, I have the equality:

##\lim_{h\rightarrow 0} \frac{f(0+h)-f(0)-\lambda (h)}{|h|} = \lim_{h\rightarrow 0} \frac{f(h)-\lambda (h)}{|h|}=\lim_{h\rightarrow 0} \frac{\sqrt{|bk|}-\lambda (h)}{\sqrt{b^2+k^2}}=\lim_{h\rightarrow 0} \frac{\sqrt{|\frac{b}{k}|}-\frac{\lambda (h)}{|k|}}{\sqrt{(\frac{b}{k})^2+1}}##

I don't know how to prove that this limit is not zero, though. All I can think of is to argue that the only way the limit can be zero is if ##\lambda (h) = \sqrt{|bk|}##, which cannot be linear, since it respects neither vector addition nor scalar multiplication by negative numbers.

You are on the right track: you have shown that ##f({\bf x}) =f(x_1,x_2) = \sqrt{|x_1 x_2|}## has a directional derivative ##D_f({\bf p}) = \sqrt{|p_1 p_2|}## at ##{\bf x} = (0,0)## in the direction ##{\bf p} = (p_1, p_2).## That is, you have shown that ##f## is Gâteaux differentiable at ##(0,0)## and you have found its Gâteaux derivative. However, the question is asking about differentiability in the Fréchet sense, and for that to happen the Gâteaux differential ##D_f({\bf p})## would need to be a bounded linear function of ##{\bf p}.##
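Concretely, a quick check with that formula shows the failure of linearity: ##D_f((1,0)) + D_f((0,1)) = 0 + 0 = 0##, while ##D_f((1,0)+(0,1)) = D_f((1,1)) = 1##, so ##D_f## is not additive in ##{\bf p}##.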

For more on this, see https://en.wikipedia.org/wiki/Fréchet_derivative . Never mind the Banach space stuff in the link; it covers the finite-dimensional case as well. This source gives a simple example which is Fréchet differentiable everywhere and yet does NOT have continuous partial derivatives at one point! (Of course, it has partial derivatives at every point, just not everywhere continuous ones.)
 
  • #5
Eclair_de_XII said:

Homework Equations


Differentiability: If ##f:ℝ^n\rightarrow ℝ^m## is differentiable at ##a\in ℝ^n##, then there exists a unique linear transformation ##\lambda:ℝ^n\rightarrow ℝ^m## such that ##\lim_{h\rightarrow 0} \frac{f(a+h)-f(a)-\lambda (h)}{|h|} = 0##.

You are correct that this could be a relevant definition if you intend ##f## and ##\lambda## to be real valued functions that have two arguments and if you intend ##h## to denote a vector with two components. In that case, it would be clearer to write the limit as:
##\lim_{\vec{h} \rightarrow \vec{0}} \frac{ f(\vec{a} + \vec{h}) - f(\vec{a}) - \lambda(\vec{h})}{||\vec{h}||}##

If you have studied partial derivatives and their relation to differentiability, I recommend you consider @Math_QED 's hint. If you have studied directional derivatives, the basic idea is to find some directional derivative that doesn't exist.

Proving the nondifferentiability of ##\sqrt{|xy|}## directly from the definition of the derivative is a strenuous exercise - it's probably not how your text materials intend you to work the problem. However, we can consider what it would involve.

I don't know how to prove that this limit is not zero,

One way for a limit not to be zero is for it not to be anything at all - for the limit not to exist. If ##\vec{h} = (h_x, h_y)## and ##\lim_{\vec{h} \rightarrow \vec{0}} Q(\vec{h})## exists, then various other special limits exist and share the same value, namely the limits obtained as ##(h_x,h_y)## approaches ##\vec{0}## along a particular continuous line or curve.

The contrapositive of that fact is that if you can find special limits of that type that don't exist, or that exist but have different values, you can conclude ##\lim_{\vec{h} \rightarrow \vec{0}} Q(\vec{h})## does not exist. (Don't confuse this contrapositive with the converse of the above fact, which is false as a generality.)

A graph of ##\sqrt{|xy|}## http://mbr-team.net/?attachment_id=61 suggests that if we move along the line with the parametric form ##(t,t)## (i.e. the line ##y = x## with ##h_x = t, h_y = t##), then the curve traced out on the surface does not have a unique tangent at ##(0,0)##. A tangent defined by approaching ##(0,0)## from the first quadrant appears to have a positive slope, and a tangent defined by approaching ##(0,0)## from the third quadrant appears to have a negative slope. This is analogous to considering the differentiability of the single-variable function ##f(x) = \sqrt{|x^2|}## at ##x = 0##.
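Indeed ##\sqrt{|x^2|} = |x|##, whose difference quotient at ##0## is ##\frac{|x| - 0}{x - 0} = \frac{|x|}{x}##, equal to ##1## for ##x > 0## and ##-1## for ##x < 0##, so that single-variable function has no derivative at ##0##.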

That intuition suggests we try to show the non-existence of the limit:
##L = \lim_{t \rightarrow 0} \frac { f(0+t,0+t) - f(0,0) - \vec{\lambda}(t,t)} {||(t,t)||} ##
##= \lim_{t \rightarrow 0} \frac{\sqrt{(|t^2|)} - 0 - \vec{\lambda}((t,t)) }{||(t,t)||} ##
##= \lim_{t \rightarrow 0} \frac { |t| - 0 - \vec{\lambda}((t,t))} { \sqrt{( t^2 + t^2)} } ##

For the linear map ##\vec{\lambda}##, we are dealing with the case ##f: R^2 \rightarrow R##, so ##\vec{\lambda}## maps into ##R## and has the form ##\vec{\lambda}((x,y)) = ax + by## for some constants ##a,b##; in particular ##\vec{\lambda}((t,t)) = at + bt##.

##L = \lim_{t \rightarrow 0} \frac{ |t| - at - bt}{ \sqrt{2}|t|} ##

That makes it clear we are considering a limit involving a single variable.

For ##t \ne 0## the function ##\frac{ |t| - at - bt}{ \sqrt{2}|t|}## is equal to ##\frac{1}{\sqrt{2}} - \frac{ (a+b)}{\sqrt{2}} \frac{ t}{|t|} ##.

##L = \lim_{ t \rightarrow 0} \frac{1}{\sqrt{2}} - \frac{ (a+b) t}{\sqrt{2}|t|} ##.

In the case ##a + b = 0## we get ##L = \frac{1}{\sqrt{2}}##, which is obviously not zero.

In the case ##a + b \ne 0## we can do more work to show that ##L## does not exist. The basic idea is that
the function ##\frac{t}{|t|}## is equal to the "step" function ##g(t)## defined by: ##g(t) = -1 ## if ##t < 0## and ## g(t) = 1 ## if ##t > 0##. The behavior of this function implies that we are taking the limit of the function ##s(t)## defined by

##s(t) = \frac{1}{\sqrt{2}} + \frac{ (a+b) }{\sqrt{2}}## if ## t < 0##
##s(t) = \frac{1}{\sqrt{2}} - \frac{ (a+b) }{\sqrt{2}}## if ## t > 0##

We might get away with saying that it is "obvious" that ##\lim_{t \rightarrow 0} s(t)## doesn't exist. If an instructor expected more detail, we'd have more work to do!
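If it helps to see this numerically, here is a small sketch of mine (assuming NumPy is available, with an arbitrary sample choice ##a = 1##, ##b = 2## for the candidate linear map) that tabulates the quotient along the path ##(t,t)## from both sides of ##0##:

```python
import numpy as np

# Arbitrary sample coefficients for the candidate linear map lambda((x,y)) = a*x + b*y.
a, b = 1.0, 2.0

def q(t):
    # Single-variable quotient obtained along the path (t, t):
    # (|t| - a*t - b*t) / (sqrt(2) * |t|)
    return (abs(t) - a * t - b * t) / (np.sqrt(2.0) * abs(t))

for t in [1e-2, 1e-4, 1e-6]:
    print(f"t = +{t:.0e}: {q(t):+.4f}    t = -{t:.0e}: {q(-t):+.4f}")

# The right-hand values settle near 1/sqrt(2) - (a+b)/sqrt(2) and the left-hand
# values near 1/sqrt(2) + (a+b)/sqrt(2); they agree only when a + b = 0, and even
# then the common value is 1/sqrt(2), not 0.
```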
 
  • #6
Math_QED said:
Hint: Differentiability is equivalent to the existence and continuity of the partial derivatives, so you might want to investigate them.

To add to that:

Looking at the graph of ##\sqrt{|xy|}##, if we approach the origin along the ##x##- or ##y##-axis, we are on curves whose slope at ##(0,0)## is unambiguously ##0##. In fact, the partial derivatives appear to be continuous at ##(0,0)##.

However, if we consider any open set containing ##(0,0)##, a partial derivative at, say, ##(x,0)## for some non-zero ##x## may not exist. So the question of the existence and continuity of the partial derivatives in an open set containing ##(0,0)## should be emphasized. The existence and continuity of the partial derivatives of ##f## in an open set containing ##(0,0)## implies ##f## is differentiable at ##(0,0)##, but what are the technicalities of the converse of that statement? Suppose the partial derivatives exist and are continuous at ##(0,0)##, but each open set containing ##(0,0)## contains at least one other point ##(x,y)## where one partial derivative doesn't exist, or exists and isn't continuous. What theorem says ##f## is not differentiable? (I myself don't know.)
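For instance, checking that claim directly: for fixed ##x \ne 0##, ##\frac{\partial f}{\partial y}(x,0) = \lim_{k \to 0} \frac{\sqrt{|x||k|} - 0}{k} = \sqrt{|x|}\,\lim_{k \to 0} \frac{\sqrt{|k|}}{k}##, and ##\frac{\sqrt{|k|}}{k} = \pm\frac{1}{\sqrt{|k|}}## is unbounded as ##k \to 0##, so this partial derivative does not exist at any point ##(x,0)## with ##x \ne 0##.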
 
  • #7
Math_QED said:
Hint: Differentiability is equivalent to the existence and continuity of the partial derivatives, so you might want to investigate them.

Not quite: the link I gave in #4 gives a simple two-variable example that is differentiable everywhere, but has partial derivatives that are not continuous at one point.
 
  • #8
Ray Vickson said:
Not quite: the link I gave in #4 gives a simple two-variable example that is differentiable everywhere, but has partial derivatives that are not continuous at one point.

You are right. It is equivalent to being continuously differentiable.
 
  • #9
A classmate showed me something I should have remembered from Calculus III. I really should remember this, but I took the course over two years ago and don't have a good memory in general. Writing the increment as ##(h,k)##:

So basically, approaching ##(0,0)## along the ##x##-direction, holding ##y = 0## fixed:
##\lim_{(h,0)\rightarrow (0,0)} \frac{f((0,0)+(h,0))-f(0,0)}{|(h,0)|} = \lim_{(h,0)\rightarrow (0,0)} \frac{f(h,0)-f(0,0)}{|(h,0)|} = \lim_{(h,0)\rightarrow (0,0)} \frac{\sqrt{|0h|}-0}{|(h,0)|} = 0##

And approaching ##(0,0)## along the ##y##-direction, holding ##x = 0## fixed:
##\lim_{(0,k)\rightarrow (0,0)} \frac{f((0,0)+(0,k))-f(0,0)}{|(0,k)|} = \lim_{(0,k)\rightarrow (0,0)} \frac{f(0,k)-f(0,0)}{|(0,k)|} = \lim_{(0,k)\rightarrow (0,0)} \frac{\sqrt{|k0|}-0}{|(0,k)|} = 0##

But approaching along the line ##k = h## (letting ##k\rightarrow h## and then ##h\rightarrow 0##):
##\lim_{h\rightarrow 0}[\lim_{k\rightarrow h} \frac{f(h,k) - f(0,0)}{|(h,k)|}]=\lim_{h\rightarrow 0}[\lim_{k\rightarrow h} \frac{\sqrt{|hk|}}{|(h,k)|}]##
##\lim_{h\rightarrow 0}[\lim_{k\rightarrow h} \frac{\sqrt{|hk|}}{\sqrt{h^2+k^2}}]=\lim_{h\rightarrow 0}\frac{|h|}{\sqrt{h^2+h^2}}=\lim_{h\rightarrow 0}\frac{|h|}{|h|\sqrt{2}}=\frac{1}{\sqrt{2}}##

So the limit does not exist, because the values of the difference quotient as ##(h,k)## approaches ##(0,0)## from different directions aren't the same:
##\lim_{h\rightarrow 0}[\lim_{k\rightarrow h} \frac{\sqrt{|hk|}}{\sqrt{h^2+k^2}}]=\frac{1}{\sqrt{2}} \neq 0 = \lim_{(h,0)\rightarrow (0,0)} \frac{f((0,0)+(h,0))-f(0,0)}{|(h,0)|} =\lim_{(0,k)\rightarrow (0,0)} \frac{f((0,0)+(0,k))-f(0,0)}{|(0,k)|}##

In short, it's kind of what Stephen Tashi said, I think.
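As a quick numerical cross-check (my own sketch, assuming NumPy is available), the same three approaches can be tabulated directly:

```python
import numpy as np

def f(x, y):
    return np.sqrt(abs(x * y))

def quotient(h, k):
    # Difference quotient from the definition, using f(0,0) = 0
    # (the candidate linear map is taken to be zero, since both partials vanish).
    return f(h, k) / np.hypot(h, k)

for t in [1e-2, 1e-4, 1e-6]:
    along_x = quotient(t, 0.0)   # approach along the x-axis
    along_y = quotient(0.0, t)   # approach along the y-axis
    along_diag = quotient(t, t)  # approach along the line k = h
    print(t, along_x, along_y, along_diag)

# along_x and along_y stay at 0, while along_diag stays at 1/sqrt(2) ~ 0.7071,
# so the quotient has no single limit as (h, k) -> (0, 0).
```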
 
  • #10
Alternatively, I also remembered that I can convert the expression to polar coordinates to show that the limit cannot exist:

##\lim_{r\rightarrow 0} \frac{f(r\cos\theta, r\sin\theta) - f(0,0)}{|r|}=\lim_{r\rightarrow 0} \frac{\sqrt{r^2|\sin(\theta)\cos(\theta)|}}{|r|}=\lim_{r\rightarrow 0} \frac{|r|\sqrt{\frac{1}{2}|\sin(2\theta)|}}{|r|}=\frac{1}{\sqrt{2}} \sqrt{|\sin(2\theta)|}##

I think this is essentially the same argument, but I cannot be too sure. In any case, the limit shouldn't exist, since the value along each ray depends on ##\theta##.
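For example, along the ray ##\theta = \pi/4## the quotient tends to ##\frac{1}{\sqrt{2}}\sqrt{|\sin(\pi/2)|} = \frac{1}{\sqrt{2}}##, while along ##\theta = 0## it tends to ##0##, so the value depends on the direction of approach and the two-dimensional limit cannot exist.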
 

FAQ: Showing that f(x,y) = √|xy| is not differentiable at (0,0)

1. What is the definition of differentiability?

A function is differentiable at a point if it can be approximated near that point by a linear map; geometrically, its graph has a well-defined tangent line (or, for a function of two variables, a tangent plane) there.

2. How do you prove that a function is not differentiable at a certain point?

To prove that a function is not differentiable at a certain point, you must show that the function does not have a well-defined tangent line at that point. This can be done by showing that the limit of the difference quotient, which is used to find the slope of the tangent line, does not exist at that point.
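For example, for ##f(x) = |x|## at ##x = 0## the difference quotient is ##\frac{|h| - 0}{h} = \frac{|h|}{h}##, which equals ##1## for ##h > 0## and ##-1## for ##h < 0##; no limit exists, so ##|x|## is not differentiable at ##0##.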

3. What is the difference between being continuous and being differentiable?

Being continuous means that a function has no breaks or gaps in its graph, while being differentiable means that the function has a well-defined tangent line at every point. A function can be continuous but not differentiable, but it cannot be differentiable if it is not continuous.

4. Can a function be differentiable at some points but not others?

Yes, a function can be differentiable at some points but not others. This is because a function may have a well-defined tangent line at some points, but not at others. This is often the case with functions that have sharp turns or corners in their graphs.

5. What is the significance of proving that a function is not differentiable at a certain point?

Proving that a function is not differentiable at a certain point can help us better understand the behavior of the function at that point. It can also help us identify critical points - points where the derivative is zero or does not exist - which are important when locating the function's maximum and minimum values.
