Understanding Transformers: The Relationship Between Voltage and Amperage

  • Thread starter yoga face
  • Start date
  • Tags
    Transformer
In summary, when a transformer increases voltage, it decreases the available amperage so that the power throughput stays the same. Since power is voltage times current, stepping the voltage up steps the available current down by the same factor. This is not free energy, as the power output remains the same as the power input. The transformer decreases the potential amperage, not necessarily the actual amperage, which is set by the load. This may seem counterintuitive, but it follows from the transformer's function.
  • #1
yoga face
here is something I do not understand when a transformer increases voltage it lowers amperage

that way there is no free energy

as power is Volt * amps (watts)

but amperage is not preset as it depends on ohms (the resistance to the current or amperage) as amperage = volts / ohms

so what amperage are they talking about here when they say when a transformer increases voltage it lowers amperage?

for example, if there is no usage of electricity after the transformer transforms then there is no ohms (the resistance is infinite meaning no current or amperage) so how can a transformer decrease that which does not exist?

to readdress my question from another angle
if a line has 120 v and 20 ohms resistance then amperage is 120/20 or 5 amps

if a transformer is introduced that raises voltage to 240 then 240/20 is 12 amps

so increasing the volts increases the amperage it does not decrease it !

the only answer I can think of is that when increasing voltage you increase the resistance to the current ( even though it is the same thing still resisting IE same lightbulb)
 
Last edited:
  • #2
Increasing the voltage at the output winding of a transformer does not decrease the current; it decreases the POTENTIAL current.

That is, if a transformer input can handle 120 volts at 5 amps, then if the output is 240 volts, the potential current is 2.5 amps. Any attempt to draw more current than that will overload the transformer and lower the output voltage, or melt things or blow fuses, or something like that.
 
  • #3
phinds said:
Increasing the voltage at the output winding of a transformer does not decrease the current; it decreases the POTENTIAL current.

That is, if a transformer input can handle 120 volts at 5 amps, then if the output is 240 volts, the potential current is 2.5 amps. Any attempt to draw more current than that will overload the transformer and lower the output voltage, or melt things or blow fuses, or something like that.

ok

I just toured the Niagara Falls power plant at Queenston
there they have the most dramatic depth of gorge, creating greater potential energy from falling water, which is taken from the river before the falls by a manmade grid


I asked the guide what a transformer does, then I said, is that not free energy? He said no, because it decreases the amperage (not the potential amperage)

this made no sense, so I thought he was making stuff up and did not query further


cheers
 
  • #4
yoga face said:
WHAT AM I MISSING HERE CONCEPTUALLY??

A lot.
Remember that the main purpose of the transformer is to step up or step down the voltage. The fact that the current also gets stepped down or stepped up (respectively) is a consequence of this.
First, some terminology - the side of the transformer which is connected to the supply mains is called the primary, and the side connected to the resistor or device is called the secondary.
Now, coming to your example, you have correctly used a transformer to step up the voltage so that you can supply more power to the resistor. What you are doing is comparing the secondary current before you connected the transformer to the secondary current after you connected the transformer. What you should be doing is comparing the primary and secondary current after you connected the transformer. So if you measure the primary current, you'll find that it equals 24A, or twice what it is on the secondary side, so the transformer does step down the current.
Remember that the equation ##V_1I_1 = V_2I_2## is an approximate relation. In reality, the correct equation would be ##V_1I_1 = V_2I_2## + power lost in the iron core of the transformer due to production of eddy currents there and magnetic hysteresis.
Under normal operating conditions, the power lost is a small fraction of the power consumed so the first relation mentioned above would have a good validity. However, if you leave the secondary open circuited there is no power consumed so that approximate relationship will no longer be valid, and the entire power supplied to the primary would equal the power lost in the iron core.
On a side note, 120/20 = 6, not 5.
Edit: A typo, primary current = 24A(not 6A) if secondary is 12A.
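The bookkeeping above can be checked with a short sketch (a hypothetical helper, using the ideal relation ##V_1I_1 = V_2I_2##):

```python
# Sketch of the ideal-transformer relation discussed above
# (hypothetical helper name, values from the 120 V / 240 V example).
def primary_current(v_primary, v_secondary, i_secondary):
    """Ideal transformer: V1 * I1 = V2 * I2, so I1 = V2 * I2 / V1."""
    return v_secondary * i_secondary / v_primary

v1, v2 = 120.0, 240.0        # step-up, 1:2 voltage ratio
i2 = v2 / 20.0               # 20-ohm load on the secondary -> 12 A
i1 = primary_current(v1, v2, i2)
print(i1)                    # 24.0 -- twice the secondary current
print(v1 * i1 == v2 * i2)    # True: same power on both sides
```

This ignores core losses, so it corresponds to the approximate relation, not the exact one with eddy-current and hysteresis losses.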
 
Last edited:
  • #5
yoga face said:
here is something I do not understand


when a transformer increases voltage it lowers amperage

that way there is no free energy

as power is Volt * amps (watts)


but amperage is not preset as it depends on ohms (the resistance to the current or amperage) as amperage = volts / ohms

so what amperage are they talking about here when they say when a transformer increases voltage it lowers amperage?

for example, if there is no usage of electricity after the transformer transforms then there is no ohms (the resistance is infinite meaning no current or amperage ) so how can a transformer decrease that which does not exist ?


to readdress my question from another angle



if a line has 120 v and 20 ohms resistance then amperage is 120/20 or 5 amps

if a transformer is introduced that raises voltage to 240 then 240/20 is 12 amps

so increasing the volts increases the amperage it does not decrease it !

the only answer I can think of is that when increasing voltage you increase the resistance to the current ( even though it is the same thing still resisting IE same lightbulb)

You start with 20 ohms that draws 6 amps at 120v.

When you introduce the step-up transformer the 20 ohm resistor sees 240v and draws 12 amps. However this means that the line must now supply 24 amps (rather than the six amps it supplied when directly wired to the resistor). You have 120v and 24 amps on the line side and 240v and 12 amps on the load side, and you end up with the line supplying 2.88 kW and the resistor dissipating 2.88 kW.

The voltage steps up from line to load, and the current steps up from load to line.

You can look at as the transformer stepping up voltage and stepping down amperage from line to load, but in reality, it is the load that determines the current.

Another way to look at it is that the transformer "steps down" the resistance of the load that the voltage source "sees". In the example, the voltage source supplies as much current as if it were directly supplying a 5 ohm resistor.
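As a rough sketch of that last point (hypothetical helper name): through an ideal transformer, the resistance the source sees is the load resistance divided by the square of the voltage ratio:

```python
# Reflected ("seen") resistance through an ideal transformer:
# Z_reflected = Z_load / n**2, where n = V_secondary / V_primary.
def reflected_resistance(z_load, v_primary, v_secondary):
    n = v_secondary / v_primary
    return z_load / n**2

z = reflected_resistance(20.0, 120.0, 240.0)
print(z)            # 5.0 ohms
print(120.0 / z)    # 24.0 A drawn from the line, as in the example
```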

 
  • #6
physwizard said:
A lot.
Remember that the main purpose of the transformer is to step up or step down the voltage. The fact that the current also gets stepped down or stepped up (respectively) is a consequence of this.
First, some terminology - the side of the transformer which is connected to the supply mains is called the primary, and the side connected to the resistor or device is called the secondary.
Now, coming to your example, you have correctly used a transformer to step up the voltage so that you can supply more power to the resistor. What you are doing is comparing the secondary current before you connected the transformer to the secondary current after you connected the transformer. What you should be doing is comparing the primary and secondary current after you connected the transformer. So if you measure the primary current, you'll find that it equals 24A, or twice what it is on the secondary side, so the transformer does step down the current.
Remember that the equation ##V_1I_1 = V_2I_2## is an approximate relation. In reality, the correct equation would be ##V_1I_1 = V_2I_2## + power lost in the iron core of the transformer due to production of eddy currents there and magnetic hysteresis.
Under normal operating conditions, the power lost is a small fraction of the power consumed so the first relation mentioned above would have a good validity. However, if you leave the secondary open circuited there is no power consumed so that approximate relationship will no longer be valid, and the entire power supplied to the primary would equal the power lost in the iron core.
On a side note, 120/20 = 6, not 5.
Edit: A typo, primary current = 24A(not 6A) if secondary is 12A.

ah

the lightbulb inside my head just went on


cheers
 
  • #7
So, this is watt happens ...

if you leave the secondary circuit open there is no power consumed, therefore no current on either side of the transformer

however the current on the primary side will always be double the current on the secondary side if the transformer has doubled the voltage and if there is a current on the secondary side

does this sound correct? or do you find my answer shocking
 
  • #8
yoga face said:
So, this is watt happens ...

if you leave the secondary circuit open there is no power consumed, therefore no current on either side of the transformer

however the current on the primary side will always be double the current on the secondary side if the transformer has doubled the voltage and if there is a current on the secondary side

does this sound correct? or do you find my answer shocking

No, it does not sound correct. You continue to misunderstand what transformers do. IF you were to maintain the same load and IF the transformer is capable of doubling the current and IF the output voltage is doubled, THEN, yes, the current will double.
 
  • #9
phinds said:
No, it does not sound correct. You continue to misunderstand what transformers do. IF you were to maintain the same load and IF the transformer is capable of doubling the current and IF the output voltage is doubled, THEN, yes, the current will double.

but step up transformers do not increase current on the secondary side

they increase voltage and decrease current thereby maintaining same wattage (no free lunch )

step down transformers increase current but decrease voltage


transformers never increase voltage and current both at the same time or they would violate the conservation of energy law
 
  • #10
yoga face said:
transformers never increase voltage and current both at the same time or they would violate the conservation of energy law

You have the right idea, BUT if the load is light enough that the transformer is not overloaded, THEN you CAN get twice as much current (and thus four times as much power) through the load on the output of a step-up transformer as you would if you did not have the transformer.

EXAMPLE:

A step-up transformer doubles voltage from 120 to 240 volts and is capable of transforming 10 amps on the input to 5 amps on the output.

If the load on the transformer is 120 ohms, then without the transformer the current through the load would be 1 amp and WITH the transformer the current through the load will be two amps.

This is not a free lunch because the input and output power of the transformer is the same.
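The example can be checked line by line (a minimal sketch with the values above):

```python
# phinds' example: a 120-ohm load with and without a 1:2 step-up
# transformer (120 V line, 240 V secondary).
r_load = 120.0

i_without = 120.0 / r_load          # directly on the 120 V line -> 1 A
i_with = 240.0 / r_load             # on the 240 V secondary -> 2 A

p_load = 240.0 * i_with             # 480 W dissipated in the load
i_line = p_load / 120.0             # 4 A drawn from the 120 V line
print(i_without, i_with, i_line)    # 1.0 2.0 4.0 -- no free lunch:
                                    # 120 V * 4 A == 240 V * 2 A
```

The "no free lunch" check is that the line-side power (120 V * 4 A) equals the load-side power (240 V * 2 A): the extra load power is simply drawn from the line.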
 
  • #11
phinds said:
You have the right idea, BUT if the load is light enough that the transformer is not overloaded, THEN you CAN get twice as much current (and thus four times as much power) through the load on the output of a step-up transformer as you would if you did not have the transformer.

EXAMPLE:

A step-up transformer doubles voltage from 120 to 240 volts and is capable of transforming 10 amps on the input to 5 amps on the output.

If the load on the transformer is 120 ohms, then without the transformer the current through the load would be 1 amp and WITH the transformer the current through the load will be two amps.

This is not a free lunch because the input and output power of the transformer is the same.

ok

so

the current on the primary side of a step-up transformer is always more than the current on the secondary side, so that the wattage (V*I) is the same on both sides
 
Last edited:
  • #12
If you want to step up current, then the primary side will have less current than the secondary side. One rule with transformers: if you step up voltage you will step down current, and if you step down voltage you will step up current, as the net power remains the same on both sides.

Transformers transfer energy from one side (primary) to the other side (secondary); the net power remains the same on both sides. The only thing that differs is the voltage-to-current ratio.

Good Luck
 
  • #13
debjit625 said:
If you want to step up current, then the primary side will have less current than the secondary side

Good Luck

don't you mean the primary has more current? as it has increased voltage to the secondary, it must have increased amperage on the primary to keep wattage in / wattage out the same
 
  • #14
In a transformer, the input side is called the primary and the output side is called the secondary. The words primary and secondary are just terms to identify the two windings of a transformer; physically, a transformer has two windings, one for input and another for output.

For example, a 220 VAC - 12 VAC step-down transformer has two windings, one for 220 VAC (large winding) and another for 12 VAC (small winding). Since it is a step-down transformer, you can call the 220 VAC winding the primary and the 12 VAC winding the secondary. But if you reverse it, i.e. apply 12 VAC to the 12 VAC winding, then you will get 220 VAC on the other winding. You would then be using that transformer as a step-up transformer, and you would have to call the 12 VAC winding the primary and the 220 VAC winding the secondary.

Now, about the current, in short:
When you apply voltage to the input of the transformer, the primary winding must produce (as it is an inductor) an opposing and equal amount of voltage. To produce that opposing voltage, the primary winding (inductor) needs current, i.e. there will be a current in the primary winding.

A transformer only transfers power from one side to the other.
Power = Voltage * Current
This voltage and current in the primary make up the power (energy) which is transferred to the secondary. The secondary will have the same power, but as the dimensions of the secondary winding's coil differ from those of the primary winding in a 220 VAC - 12 VAC step-down transformer, the voltage-to-current ratio changes while the power stays equal.

For example, take a 220 VAC - 22 VAC step-down transformer: you input 220 VAC into the primary and get 22 VAC at the output on the secondary side, but in the primary there will be a current, as said earlier. Say there is 1 ampere of current in the primary; that means the power on the primary side is 220 VAC * 1 ampere = 220 watts. That also means there will be 220 watts in the secondary, but we know the secondary has 22 VAC, so the current in the secondary will be 220 watts / 22 VAC = 10 amperes.

Now use that same transformer in a step-up configuration (the primary and secondary reversed): you input 22 VAC into the primary, there will be 10 amperes of current in the primary winding, and you will get 220 VAC and 1 ampere at the output.
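The 220 VAC / 22 VAC example, run both ways as a quick check (hypothetical helper name):

```python
# The 220 VAC / 22 VAC example above, run in both directions.
def secondary_side(v_primary, i_primary, v_secondary):
    """Ideal transformer: equal power on both windings.
    Returns (power, secondary current)."""
    power = v_primary * i_primary
    return power, power / v_secondary

# Step-down: 220 V, 1 A in -> 22 V out
p, i2 = secondary_side(220.0, 1.0, 22.0)
print(p, i2)    # 220.0 10.0  (220 W, 10 A)

# Reversed as a step-up: 22 V, 10 A in -> 220 V out
p, i2 = secondary_side(22.0, 10.0, 220.0)
print(p, i2)    # 220.0 1.0
```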

Good Luck
 
  • #15
Let's see if I understand this correctly.

Say the primary circuit is capable of supplying up to 10 amps and the circuit runs at 100 volts. The transformer steps up the voltage in the secondary to 200 volts. This means that whatever is on the secondary side always has 200v applied to it and, depending on the resistance/inductance of the load, the secondary can supply any amount of current up to 5 amps and no more.

Is that correct?
 
  • #16
Yes, if the primary circuit (primary winding) is capable of supplying up to 10 amps at most.

One more thing: the primary winding will not always carry 10 amps; it depends entirely on the load on the secondary side. If there is less current on the secondary side, there will be less current on the primary side. Remember that at any instant the power of the primary and the secondary are equal.

There will also be a limit on the current on both sides because of the ampacity of the coil wires in the primary and secondary windings; in your case it can be 10 amperes.
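The scenario from post #15 can be sketched as follows (hypothetical values; the secondary ceiling comes from equal power on both sides):

```python
# Post #15's scenario: a 100 V primary limited to 10 A feeding a
# 1:2 step-up transformer with a 200 V secondary.
v1, i1_max = 100.0, 10.0
v2 = 200.0

p_max = v1 * i1_max              # 1000 W the primary can deliver
i2_max = p_max / v2              # 5.0 A -- the secondary ceiling
print(i2_max)                    # 5.0

# The actual secondary current is set by the load, up to that ceiling:
for r_load in (400.0, 100.0, 40.0):
    i2 = v2 / r_load
    print(r_load, i2, i2 <= i2_max)   # the 40-ohm load demands 5 A,
                                      # i.e. it sits right at the limit
```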

Good Luck
 

FAQ: Understanding Transformers: The Relationship Between Voltage and Amperage

1. What is the relationship between voltage and amperage in a transformer?

In an ideal transformer, voltage and amperage are inversely related: the product of voltage and current is (nearly) the same on both windings, so stepping the voltage up steps the available current down by the same factor, and vice versa. Ohm's Law (voltage = current × resistance) then determines how much current the load actually draws at the new voltage.

2. How does a transformer convert voltage and amperage?

A transformer converts voltage and amperage through the principle of electromagnetic induction. This involves two coils of wire, known as the primary and secondary coils, being placed close together but not touching. When an alternating current (AC) passes through the primary coil, it creates a varying magnetic field, which then induces an electrical current in the secondary coil, thus converting the voltage and amperage.

3. What is the difference between step-up and step-down transformers?

A step-up transformer increases the output voltage and decreases the output amperage, while a step-down transformer decreases the output voltage and increases the output amperage. This is achieved by adjusting the ratio of the number of turns in the primary and secondary coils.
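As a rough illustration of the turns-ratio rule (an idealized sketch; the function name is made up):

```python
# Step-up vs step-down in terms of the turns ratio N2/N1
# (ideal transformer, no losses).
def transform(v_in, i_in, n_primary, n_secondary):
    ratio = n_secondary / n_primary
    return v_in * ratio, i_in / ratio   # voltage up => current down

# Step-up, 1:2 turns
print(transform(120.0, 10.0, 100, 200))   # (240.0, 5.0)

# Step-down, 2:1 turns
print(transform(240.0, 5.0, 200, 100))    # (120.0, 10.0)
```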

4. What are some common applications of transformers?

Transformers are used in a variety of everyday devices, such as chargers for electronic devices, power adapters for laptops, and power supplies for appliances. They are also used in power distribution systems to step-up or step-down the voltage for efficient transmission.

5. Can a transformer be used to increase the power output?

No, a transformer cannot increase the power output. It only converts the voltage and amperage, but the overall power output remains the same. This is because of the principle of energy conservation, which states that energy cannot be created or destroyed, only converted from one form to another.
