This is the first post in a series in which I plan to create a video codec based only on AI.
First, a while ago I did a post on 8k vs 4k:
https://www.physicsforums.com/threa...choose-the-right-tv-size.982481/#post-6283071
One of the takeaways was:
'First, despite what others will tell you, you can see a difference in 8k TVs compared to 4k. The question is why - and that's where it's surprising. What Harmonic found, feeding 8k and 4k sources into 8k TVs, is that the 8K looked substantially better. But then they applied (hopefully high quality) down-scaling to the 8k to make it 4k and compared that to the 4k direct. Surprisingly, the down-scaled 8K looked substantially better on either the 4k or 8k TV. It was very close to the 8K direct into both TVs. In fact, people then found it very difficult to tell the difference between the two on normal size screens of say 65 inches (the one I have). Conclusion - Harmonic believes we will switch to production in 8K but watch it in 4k - at least initially, until screen sizes increase substantially.'
It's one thing to notice something; it is another to understand why. It perplexed me until I understood how TV cameras work, and in particular the Bayer filter.
The Bayer filter allows sensors (which only record light intensity) to record light wavelength as well, and it is used in nearly all modern digital cameras. It is a mosaic colour filter of two parts green, one part red, and one part blue placed over the sensor pixels. Two green samples are used for every red and blue one because the eye is more sensitive to green. Once recorded, digital algorithms interpolate, or "demosaic", the resulting Bayer pattern to turn it into full colour data for each pixel. This means an 8k camera is not truly 8k - it gives an 8k output, but much of that output is interpolated rather than measured. However, if you downscale it to 4k, you get a better source than a 4k camera that uses the same trickery, because each 4k pixel is now built from several real sensor samples. That's why you get most of the benefits of 8k at 4k.
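To make the idea concrete, here is a small Python sketch. It is purely my own illustration (not Harmonic's test setup or any camera maker's real pipeline): the function names are made up, numpy is assumed, and a real demosaic algorithm is far more sophisticated. It just shows the RGGB mosaic the sensor actually records, and how a 2x2-average downscale pulls several real samples into each output pixel.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample a full-colour image through an RGGB Bayer pattern.

    Each sensor pixel keeps only one of R, G, B:
        R G
        G B
    i.e. half the pixels are green, a quarter red, a quarter blue.
    Everything else in the final image has to be interpolated.
    """
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic

def downscale_2x(rgb):
    """Average 2x2 blocks: an 8k frame becomes a 4k frame.

    After demosaicing, every 2x2 block of the 8k frame traces back to one
    real red, two real green and one real blue sensor sample, so each 4k
    output pixel is mostly measured data rather than interpolation.
    """
    return 0.25 * (rgb[0::2, 0::2] + rgb[0::2, 1::2] +
                   rgb[1::2, 0::2] + rgb[1::2, 1::2])

# Usage (random data standing in for an 8k frame):
frame_8k = np.random.rand(4320, 7680, 3)
raw = bayer_mosaic(frame_8k)      # what the sensor actually records
frame_4k = downscale_2x(frame_8k)  # 4k frame built from real samples
```

The point of the sketch is the arithmetic: a native 4k Bayer camera measures only one colour per 4k pixel and interpolates the rest, while a downscaled 8k frame folds four measured samples into each 4k pixel, which is (as I understand it) why the downscaled 8K source looked better in Harmonic's comparison.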
Thanks
Bill