Is there a way to improve CRT raster scan efficiency by scanning both ways?

In summary, the conversation discusses a proposed modification to the original electron-gun raster scan pattern used in television displays. The modification would have the scan move from left to right and then back from right to left on alternating rows, eliminating the need for the electron beam to retrace and saving time. However, the idea faced objections due to potential issues with non-parallel lines and incompatibility with existing technology. The conversation also touches on the historical development of the scan pattern and its role in reducing flicker.
  • #71
All this stuff about color is beyond the original question, as none of that was known when the standards were set.
artis said:
Were there any ideas to have the raster scan go from left to right, then in the next row back from right to left, then again left to right, etc.? This way, instead of moving the beam back and starting a new line, each new line would simply be drawn from the side where the beam finished the previous one.
The quick answer: what you are describing is technically possible, and might be a more efficient use of bandwidth. However, vacuum tubes in the 1940s were not up to the task. (TVs needed to be affordable by ordinary people, so the number of vacuum tubes had to be kept to a minimum.) In all the analog standards, the horizontal and vertical positioning are achieved by simple sawtooth waves: one at 50/60 Hz, the other at about 15,000 Hz. The back-and-forth pattern you describe would be easy to generate: just replace the 15,000 Hz sawtooth with a triangle wave at the same frequency. But the consequences for vertical positioning would be surprisingly complex, as other posters have mentioned. It would require subtle voltage control that those tubes could not achieve.
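For the curious, here is a rough Python sketch of the difference (illustrative only: the constants are round numbers, not period-exact NTSC/PAL timing, and scanline_pixels is a made-up helper). It contrasts the sawtooth drive, which needs a flyback after every line, with the triangle drive, under which alternate lines would have to be written time-reversed:

```python
# Illustrative only: round numbers, not period-exact NTSC/PAL timing.
import numpy as np

F_LINE = 15_000          # approximate horizontal line frequency, Hz
SAMPLES_PER_LINE = 100   # arbitrary sampling resolution for this sketch

# Time axis covering four line periods.
t = np.arange(4 * SAMPLES_PER_LINE) / (F_LINE * SAMPLES_PER_LINE)
phase = (t * F_LINE) % 1.0                  # position within each line period

sawtooth = phase                            # ramps 0 -> 1, then instant flyback
triangle = 1.0 - np.abs(2.0 * phase - 1.0)  # sweeps 0 -> 1 -> 0: out and back

def scanline_pixels(row, width=8):
    """Pixel order the video signal must deliver under back-and-forth
    (boustrophedon) scanning: even rows left-to-right, odd rows reversed.
    (Made-up helper, purely for illustration.)"""
    line = list(range(width))
    return line if row % 2 == 0 else line[::-1]

print("sawtooth range:", sawtooth.min(), "to", sawtooth.max())
print("triangle range:", triangle.min(), "to", triangle.max())
for row in range(4):
    print("row", row, "pixel order:", scanline_pixels(row))
```

Note the catch the sketch makes visible: the deflection waveform itself is trivial, but the video signal now has to be time-reversed on every other line, which is exactly the kind of processing 1940s tube circuits could not do cheaply.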

Once the standards were set, they could not be changed without some massive benefit to offset the cost. The transition to Digital TV is when that happened.

Edit: Oops, I didn't see the other page. Sorry if this is redundant.
 
  • #72
Algr said:
The transition to Digital TV is when that happened.
Absolutely. You only need to look at the results of analogue standards conversion between US and European TV signals (and telecine) to see the limitations of the methods available. To be fair, though, they got some very reasonable and watchable results, which gave us access to all that rich vein of US TV culture after we had been starved of it for years following the war.

Sufficiently powerful digital processing allows you to accept TV images of any standard, extract the best 'quality' information about the original moving scene, and replay it in any other standard. 'Legacy' becomes less and less of a problem.
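To make that concrete, here is a toy sketch of just the spatial half of the job (my own illustration, not any real converter's algorithm): resampling a 625-line frame to 525 lines by linear interpolation between source lines. Real converters also deal with interlace, colour decoding and motion, which this ignores:

```python
# Toy sketch of the spatial half of standards conversion: resampling a
# 625-line frame to 525 lines by linear interpolation between source lines.
# Real converters also handle interlace, colour decoding and motion.
import numpy as np

def resample_lines(frame: np.ndarray, out_lines: int) -> np.ndarray:
    in_lines = frame.shape[0]
    pos = np.linspace(0, in_lines - 1, out_lines)  # fractional source line
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, in_lines - 1)
    frac = (pos - lo)[:, None]                     # blend weight per line
    return (1 - frac) * frame[lo] + frac * frame[hi]

pal_frame = np.random.rand(625, 720)   # placeholder luminance image
ntsc_frame = resample_lines(pal_frame, 525)
print(pal_frame.shape, "->", ntsc_frame.shape)
```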
 
  • #73
sophiecentaur said:
To be fair, though, they got some very reasonable and watchable results, which gave us access to all that rich vein of US TV culture after we had been starved of it for years following the war.
True. When PAL was converted to NTSC from the '70s to the '90s, the result gained a subtle motion shimmer that resembled a film transfer. IMHO this could improve video sequences and make them look rather more cinematic. It made a mess of sports coverage, though - I remember the Olympics not looking so good when the coverage was PAL-to-NTSC. We still have this issue with 50 Hz to 60 Hz conversions. But I suppose motion interpolation can eliminate converter lag most of the time.
 
  • #74
Algr said:
But I suppose motion interpolation can eliminate converter lag most of the time.
I wouldn't say it "eliminated" the lag - perhaps it 'mitigated' the problem a bit. I only ever saw NTSC pictures that had been converted to PAL, so your experience of conversion in the other direction may imply something about the relative qualities of the two source systems.
But the way it had to convert the field-rate difference was necessarily crude, due to the limited technology of the ultrasonic field delay lines. The old systems were seriously struggling, whereas the newest systems have greater capability than is actually needed.
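For anyone wondering what "crude" means here, this is roughly the operation those field-store converters were approximating: producing 60 output fields per second from 50 input fields by blending the two nearest input fields in time. (A toy sketch, not the actual delay-line hardware logic; modern converters add motion compensation on top of this.)

```python
# Toy sketch: 60 output fields per second from 50 input fields by blending
# the two nearest input fields in time. The old field-store converters
# could only approximate this; motion compensation came much later.
import numpy as np

def convert_field_rate(fields: np.ndarray, rate_in=50, rate_out=60) -> np.ndarray:
    n_out = int(len(fields) * rate_out / rate_in)
    out = []
    for k in range(n_out):
        t = k * rate_in / rate_out          # position on the input timeline
        lo = int(t)
        hi = min(lo + 1, len(fields) - 1)
        w = t - lo                          # fractional blend weight
        out.append((1 - w) * fields[lo] + w * fields[hi])
    return np.array(out)

pal_fields = np.random.rand(50, 288, 720)   # one second of 50 Hz fields
output = convert_field_rate(pal_fields)
print(len(output), "output fields from", len(pal_fields), "input fields")
```

The plain blending above is what produces the motion judder and smearing mentioned earlier: anything moving gets averaged between two positions.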
 
