I'm trying to get a handle on how to tell whether a vector field (in the x-y plane, for simplicity) has zero divergence at a point or is divergence-free altogether. I'm having some trouble with this.
The way I am thinking about it is to mentally draw a little box around my point and then compare the arrows coming into the box with the arrows going out of it. If the arrows coming out are smaller than the ones coming in, the divergence is negative because there is a net inflow. If the arrows coming out are bigger than the ones coming in, the divergence is positive because of a net outflow. And if the arrows coming in and going out are equal, the divergence is zero because there is no net in/outflow.
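To pin down what I mean, here is a little numeric sketch of the box picture (this assumes Python with numpy; the function div_from_box and the test field are just my own illustration, not from my text). It adds up the net outward flux through a tiny square centered on a point and divides by the square's area:

```python
import numpy as np

def div_from_box(F, x0, y0, h=1e-4, n=400):
    """Estimate the divergence at (x0, y0) as the net outward flux
    through a small square of side 2h around the point, divided by
    the square's area."""
    dt = 2 * h / n
    t = -h + (np.arange(n) + 0.5) * dt          # midpoints along each side
    # net flux through the right and left sides (outward normals +x and -x)
    flux_x = np.sum(F(x0 + h, y0 + t)[0] - F(x0 - h, y0 + t)[0]) * dt
    # net flux through the top and bottom sides (outward normals +y and -y)
    flux_y = np.sum(F(x0 + t, y0 + h)[1] - F(x0 + t, y0 - h)[1]) * dt
    return (flux_x + flux_y) / (2 * h) ** 2     # box area is (2h)^2

# test field F = (x, y), whose divergence is 2 everywhere
F = lambda x, y: (x + 0 * y, y + 0 * x)         # the "+ 0*..." just forces arrays
print(div_from_box(F, 1.0, 2.0))                # prints ~2.0
```

For F = (x, y) this prints about 2, matching div F = 2 from the formula, so the box picture at least agrees with the calculus there.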
Is this the correct way to look at it? My text makes me think it is, but then when I get to divergence-free fields I learn that r/||r||^3 (where r is the position vector) has a divergence of zero everywhere except at the origin, where it is undefined. Yet if I drew a box (or circle) around any point in this field, the arrows flowing in would be bigger than the arrows flowing out (the magnitude varies inversely with the square of the distance), so from looking at the vector field I would guess that it has negative divergence at that point (and at every point, in fact). But it's divergence-free!
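To rule out a computational mistake on my part, I double-checked the book's claim symbolically (a quick sketch assuming sympy is available; I'm treating r as the three-dimensional position vector, since that's where the inverse-square falloff comes from):

```python
import sympy as sp

x, y, z = sp.symbols('x y z', real=True)
r = sp.Matrix([x, y, z])                 # position vector
norm = sp.sqrt(x**2 + y**2 + z**2)       # ||r||
F = r / norm**3                          # the field r/||r||^3

# divergence: sum of dF_i/dx_i over the three components
div = sum(sp.diff(F[i], v) for i, v in enumerate((x, y, z)))
print(sp.simplify(div))                  # prints 0
```

So the calculus really does give zero divergence everywhere away from the origin, even though the picture in my head says net inflow.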
In what way am I not seeing this correctly?
As an example of how I am thinking about this, here is a problem:
http://img384.imageshack.us/img384/3148/3fields3pm.jpg
My answers:
(a) is positive because there is a net outflow; (b) is 0 because there is no variation in direction of flow; and (c) is negative because there is a net inflow.
Am I correct? And if so, how does this square with my question about r/||r||^3?
Thanks so much!