The Electrician
tim9000 said:
Hah, ok, it's a bit of a mystery proof. That does sound like a useful book though.
So an under-used transformer will have most of its losses in the core, and a fully loaded transformer will have most of its losses in the copper. So are you saying the designer of a distribution transformer will try to balance where the losses in the TX occur, based on how heavily the TX is loaded? For instance, if it ran at rated power 100% of the time, THEN you'd want copper and iron losses to be equal; but if it ran at rated power 20% of the time and over rated power 80% of the time, you'd prefer to minimise copper losses? Conversely, if it ran at rated power 20% of the time and under rated power 80% of the time, you'd prefer to design it to minimise core losses?
Is that how I should be interpreting your statement?
Not quite. It wouldn't be a good idea to run a transformer over its rating 80% of the time.
If it's running at less than full rated power most of the time, which is typical for distribution transformers, you would use less copper, so the copper losses would be higher at full rated power.
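To put numbers on that idea, here's a minimal sketch in Python (all figures made up for illustration) of why the best split depends on the load profile: core loss is paid whenever the transformer is energized, while copper loss scales roughly with the square of the per-unit load, so a unit that sits lightly loaded most of the day wastes less total energy when the designer trades copper loss for core loss:

```python
# Minimal sketch: daily energy lost for two hypothetical loss allocations.
# Assumptions: core loss is constant while energized; copper loss scales
# with the square of the per-unit load. All numbers are illustrative only.

profile = [(18, 0.2), (5, 0.6), (1, 1.0)]  # (hours per day, per-unit load)

designs = {
    "equal copper and core loss": {"core_W": 800, "cu_full_W": 800},
    "less copper, better core":   {"core_W": 500, "cu_full_W": 1200},
}

for name, d in designs.items():
    wh = sum(hours * (d["core_W"] + d["cu_full_W"] * load ** 2)
             for hours, load in profile)
    print(f"{name}: {wh / 1000:.1f} kWh lost per day")
```

With this (assumed) lightly loaded profile, the design with the better core wastes about 16 kWh per day versus 22 kWh for the equal-split design, even though its copper loss at full rated power is 50% higher.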
There can be other considerations that determine the allocation of copper and core losses.
For example, consider a transformer from a 4000 watt sine wave inverter:
This transformer has 12 watts of core loss, and because core loss is essentially independent of load, it dissipates that same 12 watts at no load and at full load. The copper loss at full load is about 200 watts. The reason for this lopsided split is that customers seldom run their inverters at full load, and they want very low no-load and light-load losses, core and copper combined. The no-load losses determine how fast your battery runs down when only the night light is being powered.
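Plugging those quoted figures into the same square-law assumption (copper loss proportional to the load fraction squared) shows why this allocation suits an inverter that mostly idles:

```python
# The inverter transformer quoted above: 12 W core loss (roughly constant)
# and about 200 W copper loss at full load. Copper loss is assumed to
# scale with the square of the load fraction.
P_CORE_W = 12.0
P_CU_FULL_W = 200.0

for frac in (0.0, 0.25, 0.50, 1.00):
    cu_W = P_CU_FULL_W * frac ** 2
    print(f"{frac:4.0%} load: {P_CORE_W + cu_W:6.1f} W total "
          f"({P_CORE_W:.0f} W core + {cu_W:.1f} W copper)")
```

At a quarter load the whole transformer dissipates under 25 W, and at no load only the 12 W of core loss is left to drain the battery.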