How does frequency play a vital role in AC?

In summary: mains AC is generated at a fixed standard frequency, 50 Hz in India and Europe and 60 Hz in America. Why is it kept so, and could it be higher? The answer is that it's a compromise: higher frequencies would allow smaller transformers but suffer greater transmission losses, and 50-60 Hz suited the induction motors of the early 1900s. The split between 50 Hz and 60 Hz is historical rather than technical.
  • #1
Srini karthik
How does frequency play a vital role in AC?

AC mains power has a fixed standard frequency set for it: in India 50 Hz and in America 60 Hz. Why is it kept so? Is it possible to use a frequency higher than this?
 
  • #2


It's a compromise. Higher frequencies would allow smaller transformers, but they are worse for transmission: losses rise with frequency. 50-60 Hz are frequencies at which it was easy to make induction motors back in the 1900s. Europe uses 50 Hz, the USA uses 60 Hz. There is no good technical reason to choose one over the other; it is just historical. I think Japan uses both! Once a standard was established for a country it was effectively fixed, because all the machinery would be made to match the synchronous motor speed for that supply.
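To make the transformer-size point concrete, here is a back-of-envelope check (an editorial addition, not from the original post) using the standard transformer EMF equation:

$$E_{\mathrm{rms}} = 4.44\, f\, N\, A_c\, B_{\max}$$

For a fixed RMS voltage, turns count $N$ and peak flux density $B_{\max}$, the required core cross-section $A_c$ scales as $1/f$, so a 400 Hz aircraft transformer can use a core roughly one eighth the area of an equivalent 50 Hz design.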
 
  • #3


Srini karthik said:
Why is it kept so?

In the UK the frequency is set at a nominal 50 cycles per second.
If you actually monitor this you will find that the figure is not held perfectly. The actual value varies with the time of day and thus with the load demand.
The suppliers may reduce the frequency slightly when the load is high, so the consumer actually receives slightly less power; note that the equations for power in AC devices contain a frequency component. This has some advantages over a voltage reduction, for the same reason, although voltage reduction is also practised.

The electricity supply laws (legal, not physical) were written in the days when the mains was extensively used for electric clocks and other timing devices. So they say that over a 24-hour period the correct number of cycles must be supplied. Note I am using cycles, not Hz, as the laws were written in terms of cycles.
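To illustrate the "correct number of cycles per day" rule, here is a rough Python sketch (the 49.9 Hz figure and the durations are hypothetical, chosen only for illustration):

```python
# If the grid runs slightly slow during the evening peak, how fast must it
# run for the rest of the day to deliver the required cycle count?
NOMINAL_HZ = 50.0
PEAK_HOURS = 4       # assumed hours spent at reduced frequency
PEAK_HZ = 49.9       # assumed frequency during the peak

cycles_short = (NOMINAL_HZ - PEAK_HZ) * PEAK_HOURS * 3600   # 1440 cycles
catchup_hz = NOMINAL_HZ + cycles_short / ((24 - PEAK_HOURS) * 3600)
print(f"off-peak catch-up frequency: {catchup_hz:.4f} Hz")  # ~50.02 Hz
```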

Srini karthik said:
Is it possible to use a frequency higher than this?

Of course it is possible.
The electricity supply in aircraft is typically 400 Hz, and the supply from a car alternator varies with engine speed.
On this note, generators have a narrow speed range at which they are most efficient, so they are designed to operate within it; a 50-cycle generator becomes increasingly inefficient the further it strays from this figure.

go well
 
  • #4


Studiot said:
The suppliers may reduce the frequency slightly when the load is high, so the consumer actually receives slightly less power; note that the equations for power in AC devices contain a frequency component. This has some advantages over a voltage reduction, for the same reason, although voltage reduction is also practised.

This is simply not true. The generators at power stations operate with speed governors which hold the output frequency as close to the nominal frequency as possible.

Reducing the frequency would have no effect on power drawn by resistive loads and would increase the current drawn by inductive loads.
 
  • #5


vk6kro said:
Reducing the frequency would have no effect on power drawn by resistive loads and would increase the current drawn by inductive loads.

I thought there might be a difference; could you elaborate?

When people ask the difference between volts, amps, watts, etc., I usually give a response something like this:

Beyond the mathematics and semantics, the important part to me is to think in terms of what unit everyday items are rated in. Each type of component is rated in a different unit because that unit of measure best explains what you are trying to do with the item.

For example:
Voltage: This unit is used for current conductors (wires). Wires are rated by voltage because high voltages require expensive insulation. Voltage drop also depends on wire size: a higher voltage can use a smaller wire, but will need more expensive insulation.

Current: Most load devices are rated in amps, because most devices are designed to draw a certain number of amps. Increasing the voltage will increase the horsepower (or watts). Many motors are controlled by VFDs (variable frequency drives), which adjust the 60 Hz cycle or the voltage; this is also true of light bulbs. So in most loads the work (watts) can vary, and so can the voltage, but typically the amperage (current) is constant.

Power: Watts is the preferred unit of measure for generators. This is important because watts measure the amount of work that can be performed. A generator can use a transformer to change the voltage or the amperage to virtually anything; wattage is the only thing that is constant, and consequently the most important unit, because it measures the amount of work that can be done (e.g. the number of light bulbs, houses, air conditioners, etc. that can be run).

I've always thought I was somewhat incorrect about amps in this response, or was missing something. In particular, I am not sure if you are familiar with VFDs, but I work in construction and have some experience with them. I have never really understood why, to regulate fan motors, the frequency is modulated instead of the voltage. I think a VFD does both, but the name is "variable frequency drive", not "variable voltage drive".
 
  • #6


A VFD generally needs to maintain a constant voltage/frequency (V/Hz) ratio below the motor's nameplate speed. Above that speed the voltage is generally NOT increased, only the frequency.
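A minimal sketch of that rule, assuming a 460 V / 60 Hz nameplate (the numbers are illustrative, not from any particular drive):

```python
BASE_FREQ_HZ = 60.0   # motor nameplate frequency (assumed)
BASE_VOLTS = 460.0    # motor nameplate voltage (assumed)

def vfd_voltage_command(f_cmd_hz: float) -> float:
    """Voltage a constant-V/Hz drive would command at a given frequency."""
    if f_cmd_hz <= BASE_FREQ_HZ:
        return BASE_VOLTS * f_cmd_hz / BASE_FREQ_HZ  # constant V/Hz region
    return BASE_VOLTS  # above base speed: voltage capped, only frequency rises

for f in (30.0, 60.0, 90.0):
    print(f"{f:.0f} Hz -> {vfd_voltage_command(f):.0f} V")  # 230, 460, 460
```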
 
  • #7


Averagesupernova said:
A VFD generally needs to maintain a constant voltage/frequency (V/Hz) ratio below the motor's nameplate speed. Above that speed the voltage is generally NOT increased, only the frequency.

Does this have something to do with the difference between watts and volt-amps? That is, changing the power factor to increase fan speed?
Sorry if that didn't make sense; power factor is a part of electricity I don't really understand, and Wikipedia isn't much help.
 
  • #8


vk6kro said:
Reducing the frequency would have no effect on power drawn by resistive loads and would increase the current drawn by inductive loads.

I thought there might be a difference; could you elaborate?

The power used by a resistive load just depends on the voltage and the resistance of the load.

For example, a 120 ohm resistor with 120 volts RMS across it will draw 1 amp.

The current taken by an inductor will depend on the reactance of the inductor, given by XL = 2 * π * f * L.
Assuming it has no resistance, the current taken by an inductor of 318 mH will be
120 volts / ( 2 * π * 60 * 318 mH) or 1 amp.

If the frequency was reduced to 50 Hz, the same 318 mH inductor would draw
120 volts / ( 2 * π * 50 * 318 mH) or 1.2 amps.

So, you can see that reducing the mains frequency would not work to reduce the power consumed.

If the load was an induction motor, the speed of the motor would decrease and the motor would generate less back EMF, so it would draw more current.
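The numbers in this post are easy to reproduce; here is a small Python check using the same 318 mH and 120 V values as above:

```python
import math

def inductor_current(v_rms: float, f_hz: float, l_henries: float) -> float:
    """RMS current through an ideal (zero-resistance) inductor."""
    x_l = 2 * math.pi * f_hz * l_henries  # inductive reactance, ohms
    return v_rms / x_l

L = 0.318  # henries
print(f"{inductor_current(120, 60, L):.2f} A")  # ~1.00 A at 60 Hz
print(f"{inductor_current(120, 50, L):.2f} A")  # ~1.20 A at 50 Hz
```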
 
  • #9


If the frequency of domestic AC were increased to 150 or 200 Hz, what would happen to home appliances and equipment?
 
  • #10


Srini karthik said:
If the frequency of domestic AC were increased to 150 or 200 Hz, what would happen to home appliances and equipment?

Surprisingly little.

Most things would still work just fine.

Lamps, TVs, electric stoves, toasters, ovens and computers would all work.

Problems may arise where motors are used in equipment. Electric drills and vacuum cleaners would be OK, as they use universal motors.
Refrigerators, washing machines and electric fans would try to run at three or four times normal speed and would probably fail. These run on induction or squirrel-cage motors, which synchronize with the mains frequency (minus slip).

Mains-synchronized electric clocks would run fast, but most clocks now are quartz-crystal controlled.
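The speed change for the induction motors mentioned above follows from the synchronous speed formula n = 120 f / p (rpm for a machine with p poles); a quick check:

```python
def sync_speed_rpm(f_hz: float, poles: int) -> float:
    """Synchronous speed of an AC machine: n = 120 * f / poles (rpm)."""
    return 120 * f_hz / poles

# A 4-pole induction motor (slip ignored):
for f in (50, 60, 200):
    print(f"{f} Hz -> {sync_speed_rpm(f, 4):.0f} rpm")  # 1500, 1800, 6000
```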
 
  • #11


I actually have an answer for this one. I didn't read all the answers, so forgive me if somebody has already mentioned it...

50 Hz is used because below that frequency you can see the flickering of a light bulb. Why is 60 Hz used in America? That I don't know, but it is above 50 Hz.

And it's reasonable to make a turbine spin that fast: 3000 rev/min. At 150 Hz, well, you do the math; the rotor would be under tremendous stress.
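As a rough justification (an editorial addition): a two-pole generator turns at the electrical frequency, and centrifugal stress in the rotor scales with the square of the angular speed,

$$\sigma \propto \rho\,\omega^2 r^2, \qquad \omega = 2\pi f \ \text{(two-pole machine)},$$

so going from 50 Hz (3000 rev/min) to 150 Hz (9000 rev/min) would roughly multiply the rotor stresses by nine.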
 
  • #12


Srini karthik said:
Why is it kept so?

I am thoroughly disappointed by the response of the Science Advisor to my post on this, especially as he did not address the original question, shown above.

Whilst I agree that the current decreases with increasing frequency for an inductive circuit, he is looking at the question from the wrong point of view.

It is the supplier not the consumer that sets the frequency. So we need to look at this from the supplier's point of view.

To the supplier, power factor is everything, and the power factor falls dramatically with increasing frequency.

Since the supplier always wishes to raise the power factor, lowering the frequency in times of heavy demand will raise the power factor.

The example given in post #8 is actually not real-world, as no real power is drawn since there is no resistance. I have appended a spreadsheet showing the variation of current and, more importantly, power factor for a typical 240 volt fan heater with series inductance and resistance. The formulae are included so you can experiment with changing the parameters.
A frequency range of zero to 500 Hz is included.
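For readers without a spreadsheet viewer, here is a Python sketch of the same kind of calculation; the R and L below are illustrative values for a series R-L fan-heater model, not figures taken from the attached file:

```python
import math

R = 24.0   # ohms, assumed heater element (~10 A at 240 V)
L = 0.1    # henries, assumed series inductance

def power_factor(f_hz: float) -> float:
    """cos(phi) of a series R-L load at the given frequency."""
    x_l = 2 * math.pi * f_hz * L
    return R / math.hypot(R, x_l)  # R / |Z|

for f in (50, 150, 500):
    print(f"{f} Hz -> pf = {power_factor(f):.3f}")  # pf falls as f rises
```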

I will leave it to the Science Advisor to explain what happens (from the supplier's point of view) when 1 million consumers connect their fan heaters in parallel.
 

Attachments

  • powerfactor3.xls (16.5 KB)
  • #13


In the US, utilities do not deviate intentionally from 60 Hz.

If they are short of power they either disconnect blocks of load (rolling blackouts) or lower the voltage.

If the frequency is not 60.0 Hz, something is very wrong and the condition won't last long.

Big turbines are exceedingly sensitive to off-frequency operation. The blades and shafts have mechanical resonances that MUST be avoided.

Our big turbine had underfrequency protection: it would trip after two minutes at 58 Hz, to disconnect itself from a grid that was obviously in immense trouble.

At 62 Hz the governor would have completely closed the steam inlet valves, trying to slow the turbine down.

At 66 Hz the turbine would trip immediately to protect itself from excessive centrifugal force.
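The setpoints described above can be summarised as a toy decision function (purely illustrative; real relay and governor logic is far more involved):

```python
def turbine_protection(freq_hz: float, seconds_at_freq: float) -> str:
    """Toy model of the off-frequency responses described in this post."""
    if freq_hz >= 66.0:
        return "TRIP: overspeed protection"
    if freq_hz <= 58.0 and seconds_at_freq >= 120:
        return "TRIP: underfrequency (two minutes at or below 58 Hz)"
    if freq_hz >= 62.0:
        return "governor: steam inlet valves fully closed"
    return "normal operation"

print(turbine_protection(58.0, 130))  # TRIP: underfrequency ...
```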
 
  • #14


I have been monitoring my mains frequency since this morning.

0900  49.9 Hz
1200  49.9 Hz
1600  49.7 Hz
2100  49.7 Hz
2120  49.8 Hz
2125  49.9 Hz

(all times Zulu/UTC)

Note it is possible (likely?) that my counter reads slightly off the absolute value, but it has been connected all day, so it will show up relative changes.

Thank you, Jim, for a proper discussion of the subject. I'm sure that, as with other electrical practices, the UK and US differ, so we can swap notes.

go well
 
  • #15


I thought I was pretty gentle with you.

I watched a TV show called "Britain from Above" which showed what was happening in a power station control room as the TV programme "EastEnders" was finishing.
Because everyone then migrated to the kitchen to turn on their kettles for a cuppa, the demand went through the roof.

The way they could tell the load was becoming excessive was to monitor the frequency.
As the frequency dropped to 49-point-something, they switched in power from hydroelectric stations in England and Scotland, and then even from France.

There was absolutely no way they were going to let that frequency drop below 49 Hz.

Suggesting that they would deliberately drop the frequency to save power was absurd.
 
  • #16


Studiot said:
The example given in post #8 is actually not real-world, as no real power is drawn since there is no resistance. I have appended a spreadsheet showing the variation of current and, more importantly, power factor for a typical 240 volt fan heater with series inductance and resistance.

Where did you get your figure for the series inductance of your fan heater? Did you calculate it for a coiled heating element? 100 mH? If it represents the motor, wouldn't you expect it to appear in parallel?
 

FAQ: How does frequency play a vital role in AC?

How does frequency affect the efficiency of an AC system?

Frequency affects efficiency indirectly. Higher frequencies allow smaller, lighter transformers and machines, but they also increase core losses, skin effect and other transmission losses, so mains frequencies are a compromise between the two.

What is the ideal frequency for an AC system?

There is no single ideal frequency; the choice is largely historical. Most grids operate at either 60 Hz (for example the United States) or 50 Hz (for example Europe and India).

How does frequency relate to voltage in an AC system?

For a transformer or motor running at a fixed magnetic flux, the induced voltage is proportional to frequency. This is why variable frequency drives hold the voltage/frequency ratio roughly constant below a motor's rated speed.

Can changing the frequency affect the performance of an AC system?

Yes. Induction and synchronous motors run at speeds tied to the supply frequency, so raising it would overspeed them, while lowering it increases the current drawn by inductive loads. Purely resistive loads such as heaters and incandescent lamps are largely unaffected.

How does frequency impact the cost of an AC system?

Mostly through equipment design: higher frequencies permit smaller magnetic components but require designs that control the extra losses. In practice, the installed base of motors and machinery makes any change away from an established standard prohibitively expensive.
