- #1
a3sportback54
I'm having a bit of trouble getting the right answer for a problem in my electrical engineering textbook. The question is:
At t = 0, the current flowing in a 0.6 H inductance is 8 A. What constant voltage must be applied to reduce the current to 0 A at t = 0.4 s?
I'm using the equation:
[tex]i(t)=\frac{1}{L}\int_{t_{0}}^{t} v(\tau)\,d\tau+i(t_{0})[/tex] with limits t = 0.4 s and t_0 = 0 s
The answer I'm getting is 12 V, but the book says 10 V. Which is right?
I'm using L = 0.6 H, i(t0) = 8 A, and i(t) = 0 A.
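For reference, here's the arithmetic I'm doing when I plug those values in (treating the applied voltage V as constant over the interval, as the problem states):
[tex]0 = \frac{1}{0.6}\int_{0}^{0.4} V\,dt + 8 = \frac{0.4\,V}{0.6} + 8 \;\;\Rightarrow\;\; V = -\frac{8 \times 0.6}{0.4} = -12\ \text{V}[/tex]
So I get a magnitude of 12 V.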
Please help!