Homework Statement
The linear mass density of a string is given by μ = μ0[1 + cos(x/R)], where R is a constant. If one averages this density over a large length L, it becomes uniform: <μ> = μ0, where <…> denotes averaging. What is the minimum length L (in terms of R) such that the density can be considered uniform with an error of less than 1%?
Homework Equations
The Attempt at a Solution
So I integrate with respect to x over the range 0 to L and then divide by L, since I'm averaging, and what I get is <μ> = μ0 + (μ0 R/L) sin(L/R). Here is my problem, though: the initial mass density makes no sense. When x = πR the density is zero. How can the density be zero on a freaking string? That makes no sense. Besides that, I don't know what is meant by "error". Should I set μ0 + (μ0 R/L) sin(L/R) equal to 0.99 μ0 and solve?
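For reference, here is the averaging integral from the attempt written out, together with one possible reading of the 1% condition (treating the error as the relative deviation of the windowed average from μ0; this interpretation is an assumption, since the problem does not define "error"):

\[
\langle \mu \rangle_L = \frac{1}{L}\int_0^L \mu_0\left[1 + \cos(x/R)\right]dx
= \mu_0\left[1 + \frac{R}{L}\sin(L/R)\right],
\]
\[
\left|\frac{\langle \mu \rangle_L - \mu_0}{\mu_0}\right|
= \frac{R}{L}\left|\sin(L/R)\right| \le \frac{R}{L}.
\]

Under that reading, requiring the worst-case bound R/L ≤ 0.01 would give L ≥ 100R; whether the problem intends this envelope or the exact equation (R/L)|sin(L/R)| = 0.01 is also an assumption.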