bengaltiger14
I am to design a simple voltage regulator (resistors only) that delivers 7.00 V to a load from a 10 V supply. The circuit is one resistor (R1) in series with the parallel combination of a second resistor (R2) and the load Rload, which can vary from 1000 to 1500 ohms.
Through experiment, I found that I could use a 100 ohm resistor for R2, which would make R1 a 40 ohm resistor. My problem is determining the value of R1 by calculation.
R2 in parallel with Rload at 1000 ohms is 91 ohms. So I use the formula: (91 / (91 + R1)) * 10 = 7.0. Is this the right way to determine R1? If you plug in 40 for R1, it works.
I just need the calculations proving this; I think I am having algebra problems solving for R1.
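
For what it's worth, here is one way the algebra could go, assuming the 91 ohm figure for R2 in parallel with Rload (100 * 1000 / (100 + 1000) = 90.9, roughly 91 ohms):

(91 / (91 + R1)) * 10 = 7
910 = 7 * (91 + R1)
91 + R1 = 910 / 7 = 130
R1 = 130 - 91 = 39 ohms

which is close to the 40 ohm value found by experiment.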