Twoacross
Homework Statement
1. Consider the graph of a function defined parametrically by x = g(t) and y = h(t). The slope of the curve at the point (g(t), h(t)) is given by h′(t)/g′(t). Use this result and the standard parametrization of a polar curve r = f(θ) given above to show that the slope of a polar graph at the point (f(θ), θ) is given by:

[f′(θ)sin(θ) + f(θ)cos(θ)] / [f′(θ)cos(θ) − f(θ)sin(θ)].
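
Here is my rough sketch of the route question 1 seems to want, assuming the "standard parametrization" referred to above is x = f(θ)cos(θ), y = f(θ)sin(θ) (that assumption is mine, since the handout's parametrization isn't quoted in this post):

% Sketch only, assuming x = f(theta)cos(theta), y = f(theta)sin(theta).
\begin{align*}
x'(\theta) &= f'(\theta)\cos\theta - f(\theta)\sin\theta && \text{(product rule)} \\
y'(\theta) &= f'(\theta)\sin\theta + f(\theta)\cos\theta && \text{(product rule)} \\
\frac{dy}{dx} &= \frac{y'(\theta)}{x'(\theta)}
  = \frac{f'(\theta)\sin\theta + f(\theta)\cos\theta}{f'(\theta)\cos\theta - f(\theta)\sin\theta}
\end{align*}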
In the lab, a polar curve will be given as the graph of the function r = f(θ) in polar coordinates. You will need to be able to match points on the curve with their corresponding values of θ in the interval [a, b]. For example, points in the first quadrant will correspond to values of θ in the interval [0, π/2] when f(θ) is positive on this interval.
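
One quick way I found to check this matching numerically (my own sketch in Python; the cardioid f(θ) = 1 + cos θ is just a made-up example, since the actual lab curve isn't given here):

import math

def f(theta):
    # Made-up example curve; the lab will supply the actual r = f(theta).
    return 1 + math.cos(theta)

def quadrant(theta):
    # Convert the polar point (f(theta), theta) to Cartesian and read off the quadrant.
    x = f(theta) * math.cos(theta)
    y = f(theta) * math.sin(theta)
    if x > 0 and y > 0:
        return "I"
    if x < 0 and y > 0:
        return "II"
    if x < 0 and y < 0:
        return "III"
    return "IV"

# With f positive on [0, pi/2], angles in that interval should land in quadrant I:
print(quadrant(math.pi / 4))  # -> I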
2. What condition on r will force points in the first quadrant to correspond to values of θ between π and 3π/2? Explain.
3. Given a polar graph of a function r = f(θ), how can you determine which values of θ correspond to points (f(θ), θ) where the curve crosses the x-axis?
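
For question 3, my understanding is that the curve crosses the x-axis where the y-coordinate f(θ)·sin(θ) equals 0, i.e. where sin(θ) = 0 or f(θ) = 0. A small SymPy check of that idea, again with a made-up example curve:

import sympy as sp

theta = sp.symbols('theta')
f = 1 + 2 * sp.cos(theta)   # made-up example curve, not the lab's
y = f * sp.sin(theta)       # y-coordinate of the polar point (f(theta), theta)

# x-axis crossings are the solutions of y = 0 over one period:
crossings = sp.solveset(sp.Eq(y, 0), theta, sp.Interval.Ropen(0, 2 * sp.pi))
print(crossings)  # 0 and pi from sin(theta) = 0; 2*pi/3 and 4*pi/3 from f(theta) = 0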
Homework Equations
The Attempt at a Solution
I had done a question like this before, but it did not involve sin, so I tried using identities and got stumped from there. I've attached my assignment to this post; it's the first three questions (the rest use Maple). All help is appreciated. I am still trying to make sense of it.
Thank you!