Wouldn't using AC to power a lamp result in flickering?

In summary, powering a lamp with AC does cause it to flicker, because the instantaneous power delivered to the lamp rises and falls at twice the supply frequency. The flicker is usually too fast for the human eye to perceive. The type of bulb matters: incandescent filaments respond slowly and smooth out the variation, while LED bulbs have a much sharper on/off characteristic. The choice of supply frequency and voltage also affects how visible the flicker is, and other factors, such as the persistence of phosphors, shape the flicker of devices like CRT monitors and TVs.
  • #1
Yealtas
Hello,

I was thinking that using AC to supply power to a lamp would cause it to flicker, considering that the potential difference keeps changing signs.

This would mean that the current supplies voltage to the lamp, only to take it back after a really small amount of time. Voltage says something about the amount of electrons per time unit that a component 'receives'. So it's really just a constant process of giving and taking electrons from the lamp.

Upping the frequency would mean you simply repeat this process quicker. Would that not mean that lamps are actually flickering at all times, unless you use DC? Obviously the light would flicker really fast so you wouldn't be able to notice it.

My English is ok, but not amazing, so I hope the jargon used is actually correct English as well.

-Y
 
  • #2
You're right, it does. However, the blink rate (2x the power frequency) is too fast for most people to see in most circumstances.

When I was younger and my eyes were much better, I could sometimes see flicker in fluorescent lights. I recall noting that I could see flicker in Europe (with 50Hz power) but not in America (with 60Hz power). That was the limit of my visual perception. By the time I reached middle age, my eyes were no longer good enough to see flicker.

It also affects the lifetime of incandescent bulbs, because the filaments heat/cool and expand/shrink with each cycle. In the Thomas Edison museum in Menlo Park, NJ, some of Edison's DC light bulbs have been burning continuously for 120 years. That's much easier to do with DC.
 
  • #3
Yealtas said:
I was thinking that using AC to supply power to a lamp would cause it to flicker...

The current heats the filament in both directions. The AC is normally 50 or 60 cycles per second so the lamp is heated 100 or 120 times per second. The time between heating pulses is very short so the filament doesn't cool down enough to see the flicker.
 
  • #4
You need to be specific about the type of bulb you are talking about.
An ordinary incandescent bulb filament heats up the same regardless of the direction of current flow. A smoothly alternating current goes through zero current as it changes from one current direction to the other, so the filament would briefly be heated less at that time. But it probably cools off such a small amount that it would be better to say that the brightness is "wavering" rather than "flickering".

Fluorescent bulbs usually flicker. I don't know if LEDs do or not. It probably depends on the electronics that convert standard wall-socket power to the power supplied to the LED.
 
  • #5
Yes, LEDs can also flicker. The ones that look like filaments seem to be the worst. Some LEDs respond so fast that it's possible to make them deliberately flicker at high frequencies and use that to carry a network connection. I think it's called Li-Fi?
 
  • #7
CWatters said:
The AC is normally 50 or 60 cycles per second
Those two frequencies fall around a very critical value for the sensitivity of our eyes to flicker. The US chose 60 Hz, which produces 120 brightness peaks per second; the UK chose 50 Hz, which produces just 100 peaks per second. Flicker at 120 peaks per second is a lot less visible. There is another factor which used to make flicker in the US less visible: the US uses about half the mains voltage that the UK does. That means the lamp filaments tend to be more massive, so they heat up and cool down more slowly, which reduces the temperature swing (and hence the brightness swing). Double whammy: filament lighting in the US was much more satisfactory. Likewise, the (analogue) TV frame repetition rate was higher in the US and the pictures flickered less.

LEDs have a much sharper on/off characteristic (also fluorescent tubes) and the flicker is more noticeable. I notice it particularly with drips falling from a water tap. They appear as a set of bright beads rather than a single stream.

Yealtas said:
Voltage says something about the amount of electrons per time unit that a component 'receives'. So it's really just a constant process of giving and taking electrons from the lamp.
I don't think that's an accurate enough description of how electrons are involved in 'electricity'. It is more likely to mislead than help anyone. At any time, there are the same number of electrons in a lamp.
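The thermal-mass argument above can be sketched with a toy model (my own illustration, not from the thread): treat the filament as a first-order thermal system heated by a sin² power waveform, and compare the relative temperature ripple for a "thin" (fast) and a "heavy" (slow) filament at 50 and 60 Hz. The time constants here are guessed purely for illustration.

```python
import math

def ripple(f_mains, tau, cycles=200, steps_per_cycle=200):
    """Relative peak-to-peak temperature ripple of a filament modelled as a
    first-order thermal system heated by P(t) ~ sin^2(2*pi*f*t)."""
    dt = 1.0 / (f_mains * steps_per_cycle)
    T = 0.0
    hist = []
    for i in range(cycles * steps_per_cycle):
        t = i * dt
        p = math.sin(2 * math.pi * f_mains * t) ** 2  # heating pulses at 2f
        T += dt * (p - T / tau)                       # simple Euler step
        if i >= (cycles - 2) * steps_per_cycle:       # keep the last two cycles
            hist.append(T)
    mean = sum(hist) / len(hist)
    return (max(hist) - min(hist)) / mean

# Thin filament (tau = 10 ms, a guess) vs heavy filament (tau = 40 ms, a guess):
for f in (50, 60):
    print(f, round(ripple(f, 0.010), 3), round(ripple(f, 0.040), 3))
```

The heavier filament shows a several-times-smaller ripple, and 60 Hz always ripples less than 50 Hz, which is the "double whammy" described above (brightness actually varies faster than temperature, roughly as T⁴, so this understates the effect).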
 
  • #9
As a matter of interest, a flashlamp bulb can respond to audio frequencies, and can be used for optical communication.
 
  • #10
tech99 said:
As a matter of interest, a flashlamp bulb can respond to audio frequencies, and can be used for optical communication.
Yep. Long before LEDs and 'optical communications'. Not a very linear system, but it definitely works.
 
  • #11
As posted above, in the case of incandescent bulbs, the filaments have a relatively slow response time, sort of a visual equivalent of reverb, also described as persistence, where the filament continues to glow for a while even after the current is cut. The end result is that the intensity doesn't vary much and isn't that noticeable. The worst-case flicker I've seen is from half-wave LED Christmas lights, which just use a diode so that only half of each AC cycle powers the LEDs. Full-wave LED lights use a rectifier circuit to power the LEDs on both halves of the AC cycle. Some LED circuits may include capacitors to level out the voltage and current and reduce flicker.
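The half-wave versus full-wave difference can be put in numbers with a sketch (the 170 V peak and 60 V total forward drop are made-up illustrative values, and the string is assumed to simply conduct whenever the rectified voltage exceeds its threshold):

```python
import math

def lit_fraction(v_peak, v_threshold, full_wave, samples=100_000):
    """Fraction of one mains cycle during which a series LED string conducts,
    assuming it lights whenever the (rectified) voltage exceeds its total
    forward-voltage threshold."""
    lit = 0
    for i in range(samples):
        v = v_peak * math.sin(2 * math.pi * i / samples)
        if full_wave:
            v = abs(v)           # bridge rectifier: both half-cycles used
        if v > v_threshold:      # string conducts above its threshold
            lit += 1
    return lit / samples

# Illustrative numbers: 170 V peak (120 V RMS), 60 V total forward drop
print(round(lit_fraction(170, 60, full_wave=False), 3))  # half-wave: lit ~39% of the time
print(round(lit_fraction(170, 60, full_wave=True), 3))   # full-wave: lit ~77% of the time
```

So the half-wave string is dark for well over half of each cycle, consistent with the "bit over half of the time spent at zero volts" observation later in this thread, while the full-wave string only goes dark briefly near the zero crossings.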

CRT monitor flicker is affected by the refresh rate and the persistence of the phosphors. There's a trade-off between flicker and smearing of moving images with the phosphor persistence used in a CRT monitor. CRT TVs have a slower effective refresh rate than CRT computer monitors, so CRT TVs use somewhat longer-persistence phosphors. Most CRT computer monitors will have some flicker at 60 Hz, and 75 or 85 Hz is needed to virtually eliminate flicker, since that is what the phosphor persistence is set for. The main exception is the old IBM 3270 series monochrome (green) CRT monitors. The 3270 monitors were "block oriented" terminals, typically displaying a fixed text screen with fields to be filled in by the operator. The persistence of the phosphors on these monitors was about 1.5 seconds: moving the cursor at 10 characters per second left a trail of about 15 or so cursor images of diminishing brightness, and if the screen display was changed to a new set of fields, it would take about 1.5 seconds for the prior image to fade away. In addition to eliminating flicker, the 3270 monochrome monitors used a very thin and sharp font.
 
  • #12
There is a measurable 120 Hz flicker in US incandescent lighting, a stronger flicker at the same rate in fluorescent illumination, and a maximal on-off flicker in AC-driven LED light. However, the cyclic variation of brightness is too fast for most people to perceive without a device to count the on/off cycles.

Back in the days of music pressed into vinyl disks, stroboscopic calibration disks (with a central hole like a record) were often used to check a turntable's rate of rotation. Under AC illumination, a series of radial lines on the turning disk would appear to stand still when the phonograph turntable was accurately rotating 33 1/3 times per minute. The standing-still effect was visible under incandescent lighting, but could be seen more clearly under fluorescent light.
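The line count on such a disc follows directly from the flash rate, since the lamp brightens twice per AC cycle and one line must pass the viewing point per flash. A quick back-of-envelope check (my own arithmetic, not from the post):

```python
# Lines needed on a stroboscopic disc so that the pattern appears to stand
# still under mains lighting at the target turntable speed.
def strobe_lines(mains_hz, rpm):
    flashes_per_minute = 2 * mains_hz * 60   # lamp brightens twice per cycle
    return flashes_per_minute / rpm          # one line passes per flash

print(round(strobe_lines(60, 100 / 3)))  # 60 Hz mains, 33 1/3 rpm -> 216 lines
print(round(strobe_lines(50, 100 / 3)))  # 50 Hz mains, 33 1/3 rpm -> 180 lines
```

These counts (216 lines for 60 Hz countries, 180 for 50 Hz) match the rings printed on typical commercial strobe discs.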
 
  • #13
Ralph Dratman said:
Back in the days of music pressed into vinyl disks, stroboscopic calibration disks (with a central hole like a record) were often used to check a turntable's rate of rotation.
My (pretty high class) Garrard (401?) record playing deck had a pattern round the outside of the turntable and a magnetic brake to vary the speed with. No crystal controlled drives in those days in the home.
 
  • #14
Ralph Dratman said:
Back in the days of music pressed into vinyl disks, stroboscopic calibration disks (with a central hole like a record) were often used to check a turntable's rate of rotation. Under AC illumination, a series of radial lines on the turning disk would appear to stand still when the phonograph turntable was accurately rotating 33 1/3 times per minute. The standing-still effect was visible under incandescent lighting, but could be seen more clearly under fluorescent light.

Thanks! That is an excellent illustration of the topic.
 
  • #16
tech99 said:
As a matter of interest, a flashlamp bulb can respond to audio frequencies, and can be used for optical communication.
Indeed! Our physics class (1969/70 ?) did a 'scrap-heap challenge' demo of the system. The receiver's sensor was an OC71 transistor (with the black paint scraped off). We never got it to work over any significant distance in daylight but after dark it worked fine provided you had good optical alignment and tweaked the OC71's bias to cope with what residual background light remained. The best we managed was about 150 metres.
 
  • #17
rcgldr said:
...
The worst case flicker I've seen are half-wave LED Christmas lights, which just use a diode to only allow half of each AC cycle to power the LED's.
...

Currently doing a test, as I think I have one of these sets of lights:

2017.12.02.pf.science.png


@dlgoff , did I do this right?
 

  • #18
OmCheeto said:
@dlgoff , did I do this right?
Looks good to me.
 
  • #19
OmCheeto said:
Currently doing a test, as I think I have one of these sets of lights:

did I do this right?
Is the display showing zero volts at the top (with, I assume, negative voltage "spikes")? I would expect a bit over half of the time spent at zero volts with half-wave LED lights. Each LED has a small circuit, and there may be some type of capacitor to help a bit. I have both half-wave and full-wave LED lights, and the difference is very visible.
 
  • #20
I believe 60 Hz was set by Nikola Tesla to reduce the cost of electrical transformers for high-tension transmission lines: 50 Hz transformers would be larger and more expensive, while going higher than 60 Hz reduces transformer efficiency. I think Europe chose 50 Hz to avoid Tesla's patents. As far as flicker is concerned, 60 Hz is better than 50. Flicker is smoothed by our optical rods and cones, which have a response-time curve covering about 50 milliseconds; however, since this is a ramp-and-decay curve, you can still detect flicker at 20 times a second or greater. Some people are quite sensitive to fluorescent lights that flicker 120 times a second. They don't actually see the flicker, but their eyes tire quickly because the iris is attempting to respond and getting mixed signals. When you are young, your iris responds quite quickly as a measure to protect you from bright lights, but with age this response slows.
 
  • #22
rcgldr said:
Is the display showing zero volts at the top (with, I assume, negative voltage "spikes")? I would expect a bit over half of the time spent at zero volts with half-wave LED lights. Each LED has a small circuit, and there may be some type of capacitor to help a bit. I have both half-wave and full-wave LED lights, and the difference is very visible.
Actually, it's upside-down. I did tests on several lamps, and I've completely forgotten how to use an o-scope. (It's been 30 years!) I ended up reversing the leads in an attempt to make sense of what was going on.

Anyways, it was set to "AC", and I'm measuring the light output with a solar panel, so it "floated" down.

With the o-scope set to "DC", and the probe connections corrected, it looks like this:

2017.12.02.pf.science.plus.to.plus.DC.png


None of the other lamps I tested bottomed out at zero volts. Though, they had interesting "noise" patterns, which I can't visually perceive.

ps. And I just discovered that my $5 garage sale o-scope's second channel DOES work. Yippie!
 

  • #23
This effect caused us some head-scratching when a family member developed photo-sensitive epilepsy. Although UK's 50 Hz was not a photic-driver frequency, moving gaze through such flicker could trigger a fit given prior kindling.

Trad incandescent filament bulbs were not a problem, but the more powerful variety have been phased out.
CFLs, the replacement 'compact fluorescents', have a nasty habit of flickering at start-up, especially in the cold or towards 'end of life'.

One three-lamp cluster over a stair-well was a real hazard; it was cooler than the main house, the light was off until required, and it was in 'line of sight'. It was also hard to access, so failing lamps were left longer than we'd have preferred...

After some thought, I fitted two CFLs plus a 'rough service' incandescent. The latter, trading longer life for reduced efficiency, lit within one cycle and masked the CFLs' start-up. Since LED lamps improved enough to be trustworthy, I've been replacing CFLs and 'rough service' incandescents alike...

Along the same lines, I've replaced several 5-foot (~1500 mm) strip-lights with LED equivalents. Remember to replace the 'blinky' starter with the 'by-pass' one supplied! Now they're 'instant on', with no darkening at the ends and scant handling risk...
 
  • #24
You can get some vivid illustrations of the behaviour of lights if you look at city lights through binoculars or similar instruments at night at such a distance that the scene is dark, with the lights appearing as spots against a black background.
When you move the instrument, the lights describe lines determined by your movements. If you look at lights that glow continuously, such as flames and DC incandescents, they describe lines of relatively consistent width. Lights that plainly flicker, such as some discharge lights and LEDs, appear as dotted or dashed lines.
"Ordinary" AC incandescents produce continuous lines, but with distinct constrictions at intervals determined by how fast you move the instrument. Fluorescents vary depending on the details of the lamps, because the fluorescence tends to smear the discharge pulses out into lines.
Stars don't blink unless you use a telescope that can show you pulsar behaviour (good luck with that one! :wink: )

Sometimes one can get good effects with a camera. Using a digital camera with fixed focus, I once managed to capture some shots of a swarm of sunlit midges flying against a contrasting shadowed background. Their wingbeats and motion described pulsed lines in which I could count the beats and calculate their frequency from the number of beats and the shutter speed (roughly 1 kHz for that species!)

As an exercise, playing with such a toy can be quite instructive.
 
  • #25
I lived and worked in Europe and did notice the lower-frequency flicker, and found it annoying. I was in my 50s and my eyes picked it up. Even LEDs fed by automotive DC circuits can flicker, depending on how they're being driven. You notice it when your eyes scan past LED taillights on some cars.
 
  • #26
The "flicker " in AC bulbs is extremely difficult to detect visually since the difference between maximum and minimum is only a few percent.

As for fluorescents, if they use a magnetic ballast then you have flicker (a 0-100% swing) at twice the mains frequency (100 Hz in Europe, 120 Hz in the US). If it's a newer fluorescent, it should have an electronic ballast, and the flicker frequency is going to be 20-40 kHz. Good luck trying to see that.

As for LEDs, the frequency can vary. Most decent LED lamps use a switching power supply, which pushes the major artifacts into the few-hundred to few-thousand-hertz range. The filament-style bulbs are the worst and can be as bad as magnetically ballasted fluorescents. Even a small capacitor, though, can improve the perception. Go for a cheap LED bulb or fixture and you will get exactly what you pay for.
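One way to put numbers on "a few percent" versus a full on/off swing is the common "percent flicker" figure of merit (my addition; the term isn't used above, and the brightness values below are illustrative guesses):

```python
def percent_flicker(l_max, l_min):
    """Percent flicker: 100 * (Lmax - Lmin) / (Lmax + Lmin)."""
    return 100 * (l_max - l_min) / (l_max + l_min)

# An incandescent whose brightness dips only slightly each half-cycle,
# versus a magnetically ballasted fluorescent that goes almost dark:
print(round(percent_flicker(1.00, 0.90), 1))  # 5.3
print(round(percent_flicker(1.00, 0.05), 1))  # 90.5
```

A few percent is essentially invisible; 90% at 100-120 Hz is what some people find fatiguing even when they can't consciously see it.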
 
  • #27
anorlunda said:
You're right, it does. However, the blink rate (2x the power frequency) is too fast for most people to see in most circumstances.

When I was younger and my eyes were much better, I could sometimes see flicker in fluorescent lights. I recall noting that I could see flicker in Europe (with 50Hz power) but not in America (with 60Hz power). That was the limit of my visual perception. By the time I reached middle age, my eyes were no longer good enough to see flicker.

It also affects the lifetime of incandescent bulbs, because the filaments heat/cool and expand/shrink with each cycle. In the Thomas Edison museum in Menlo Park, NJ, some of Edison's DC light bulbs have been burning continuously for 120 years. That's much easier to do with DC.

I am told that this was particularly a problem in the early days (even before my time), when some systems used 25 Hz.
 
  • #28
Dr.D said:
I am told that this was particularly a problem in the early days (even before my time), when some systems used 25 Hz.

I'm sure you're correct. This is interesting.

https://en.wikipedia.org/wiki/Utility_frequency#History said:
Very early isolated AC generating schemes used arbitrary frequencies based on convenience for steam engine, water turbine and electrical generator design. Frequencies between 16⅔ Hz and 133⅓ Hz were used on different systems.
...
After observing flicker of lamps operated by the 40 Hz power transmitted by the Lauffen-Frankfurt link in 1891, AEG raised their standard frequency to 50 Hz in 1891.

I recall that 25 Hz at Niagara Falls continued until the 1990s. Not for lighting, but rather to power motors in the local mills. The motors were so rugged that they had needed no maintenance since 1895 other than a few drops of oil. There was no motivation to replace them with modern motors.
 
  • #29
anorlunda said:
I recall that 25 Hz at Niagara Falls continued until the 1990s. Not for lighting, but rather to power motors in the local mills. The motors were so rugged that they had needed no maintenance since 1895 other than a few drops of oil. There was no motivation to replace them with modern motors.

Now that is some impressive design! To last almost a century without much maintenance is really amazing. If the original 25 Hz were no longer available, they would have had to scrap these great motors, or at the very least, do a major rebuild. Do you know what kind of mills they were powering (textiles, steel, feed, ?)?
 
  • #30
Dr.D said:
Now that is some impressive design! To last almost a century without much maintenance is really amazing. If the original 25 Hz were no longer available, they would have had to scrap these great motors, or at the very least, do a major rebuild. Do you know what kind of mills they were powering (textiles, steel, feed, ?)?

I think it was textiles and flour mills. I used to have pictures of them but I can't find them. However, a 3 HP motor in 1895 was about 10 feet tall.
3a.jpg


http://www.wnyhistory.org/portfolios/businessindustry/george_urban_flour/george_urban_flour.html
 

  • #31

2017.12.03.lamp.flutter.pf.friendly.png
  • #32
OmCheeto said:
The only light source I can get a (humanly) perceptible flicker from is my old LED x-mas lights.

I wonder if you could see the flicker if you were 18. Got any teens in the house?
 
  • #33
anorlunda said:
I wonder if you could see the flicker if you were 18. Got any teens in the house?
Nope. Just me.

ps. Haven't had a teen in my house since my housewarming party, 30 years ago, when I was about 30. Damn 25 year olds invited their younger friends, who invited their younger friends, who invited their younger friends...

<Bette Davis voice>What. a. mess... </Bette Davis voice>

I found Cheez-whiz dripping down my walls the next morning.

Never again...
 
  • #34
SF cookie said:
Indeed! Our physics class (1969/70 ?) did a 'scrap-heap challenge' demo of the system. The receiver's sensor was an OC71 transistor (with the black paint scraped off). We never got it to work over any significant distance in daylight but after dark it worked fine provided you had good optical alignment and tweaked the OC71's bias to cope with what residual background light remained. The best we managed was about 150 metres.

Yes, I too affectionately remember doing an experiment just like that in the late sixties. The output light was an old 12 V car side-light bulb. It needed to be fed with a constant DC bias which made it glow dim yellow, and the audio input came from the wires that would have powered the 3 ohm speaker of an old valve radio. The receiver was a 6 inch diameter magnifying glass which threw a real image onto a phototransistor (OC71 type), with some amplifying circuitry feeding headphones. What staggered me was how undistorted the lower frequencies of speech were. When it came to music the higher frequencies got fuzzy, but to this day I am still amazed that that car-bulb filament could cool down sufficiently in (I estimated) less than two thousandths of a second that I could hear notes with frequencies of over 2000 Hz. And I was also able to pick the signal up at night over 100 metres away, though at that distance keeping that tiny spot image focussed on the phototransistor became quite difficult, I remember.
Actually, on thinking about it more now, I must have fed the filament with the audio signal through a half-wave rectifier to stop the frequency-doubling effect of two heat-ups per cycle.
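The filament-as-low-pass behaviour described here can be sketched with a first-order thermal model (my own sketch; the 2 ms time constant is just a guess for a small bulb of this kind):

```python
import math

def brightness_modulation(f_audio, tau):
    """Relative amplitude with which a filament of thermal time constant tau
    follows sinusoidal power modulation at f_audio (first-order low-pass)."""
    return 1 / math.sqrt(1 + (2 * math.pi * f_audio * tau) ** 2)

# Guessed 2 ms thermal time constant for a small side-light bulb:
for f in (100, 500, 2000, 5000):
    print(f, round(brightness_modulation(f, 0.002), 3))
```

With these numbers, 2 kHz comes through at only a few percent of the low-frequency amplitude, attenuated but not gone, which fits the observation that speech was clear while higher musical frequencies got fuzzy.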
 
Last edited:
  • #35
Gary Feierbach said:
I believe 60 Hz was set by Nikola Tesla to reduce the cost of electrical transformers for high-tension transmission lines: 50 Hz transformers would be larger and more expensive, while going higher than 60 Hz reduces transformer efficiency. I think Europe chose 50 Hz to avoid Tesla's patents. As far as flicker is concerned, 60 Hz is better than 50. Flicker is smoothed by our optical rods and cones, which have a response-time curve covering about 50 milliseconds; however, since this is a ramp-and-decay curve, you can still detect flicker at 20 times a second or greater. Some people are quite sensitive to fluorescent lights that flicker 120 times a second. They don't actually see the flicker, but their eyes tire quickly because the iris is attempting to respond and getting mixed signals. When you are young, your iris responds quite quickly as a measure to protect you from bright lights, but with age this response slows.
I think the silicon-iron alloys used then could have worked well at twice that frequency, which would have been both good news and bad news. The good news would have been smaller, lighter transformers; the bad news, a mains hum becoming an unpopular bumble-bee buzz everywhere.
 