#1 jman5000
TL;DR Summary
What were the limits of CRT displays in terms of resolution, brightness, frequency, etc.?
I've been reading about how color CRT displays work because I've become interested in trying to make one, even if it won't be very good compared to the ones of the past. (Yes, I know they use high voltage that can arc through material/air, and that the voltage drop and electron collisions emit x-rays.)
The main attributes I'm interested in are resolution and brightness. I know that a higher resolution requires less beam current, since fewer electrons means less mutual repulsion, but say I wanted to fit a resolution of 3840x2160 onto a 34 inch screen; that's a much higher beam current per unit area than any CRT I've heard of. Do you think a strong enough focusing apparatus could keep the electrons from scattering and hold the brightness of your average CRT, say 150 nits, while still maintaining that high resolution?
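Here's my rough back-of-the-envelope for the pixel pitch that 3840x2160 on a 34 inch diagonal implies, compared against an assumed "typical" CRT dot pitch of 0.25 mm (that figure is just my assumption for comparison, not the spec of any particular tube, so correct me if it's off):

```python
import math

# Assumed target: 3840x2160 on a 34" (diagonal) 16:9 screen
h_px, v_px = 3840, 2160
diag_in = 34.0

diag_px = math.hypot(h_px, v_px)      # pixels along the diagonal
pitch_mm = diag_in * 25.4 / diag_px   # implied pixel pitch in mm

typical_crt_pitch_mm = 0.25           # assumed dot pitch of a decent CRT monitor

print(f"implied pixel pitch: {pitch_mm:.3f} mm")
print(f"assumed typical CRT dot pitch: {typical_crt_pitch_mm} mm")
print(f"ratio: {typical_crt_pitch_mm / pitch_mm:.2f}x finer than typical")
```

With those assumptions the implied pitch comes out to roughly 0.2 mm, so the spot would need to be noticeably finer than what I assumed for a typical tube.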
Also, I've read that CRT phosphors only convert ~30% of the beam energy into visible light, which seems really low, but I don't know how that compares to the phosphors used in OLED displays. What makes CRT phosphors so inefficient, and could a phosphor be made that converts the beam's electrons more efficiently? What is different about how an OLED phosphor lights up compared to a CRT's? Don't they both have electrons hitting them to light up? A more efficient phosphor seems like the easiest way to overcome CRT limitations, since you wouldn't have to worry about increasing the voltage, which would wear out the cathode and require even thicker screens to stop x-rays.
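To put some numbers on the efficiency question, here's my sketch of how much beam power 150 nits would take on a 34 inch screen, assuming a Lambertian screen and a phosphor luminous efficacy of about 20 lm/W at a 25 kV anode (all of those are my assumptions, not measured values for any real phosphor):

```python
import math

# Assumptions (not measured values)
target_nits = 150.0        # desired luminance, cd/m^2
diag_in = 34.0             # 16:9 diagonal
efficacy_lm_per_w = 20.0   # assumed phosphor luminous efficacy
anode_kv = 25.0            # assumed anode voltage

# Screen dimensions and area for a 16:9 diagonal
aspect = 16 / 9
width_in = diag_in / math.sqrt(1 + 1 / aspect**2)
height_in = width_in / aspect
area_m2 = (width_in * 0.0254) * (height_in * 0.0254)

# For a Lambertian emitter: luminous flux = luminance * pi * area
flux_lm = target_nits * math.pi * area_m2
beam_power_w = flux_lm / efficacy_lm_per_w
avg_beam_ma = beam_power_w / anode_kv   # mA, since W / kV = mA

print(f"screen area: {area_m2:.3f} m^2")
print(f"required luminous flux: {flux_lm:.0f} lm")
print(f"beam power at {efficacy_lm_per_w} lm/W: {beam_power_w:.1f} W")
print(f"average beam current at {anode_kv} kV: {avg_beam_ma:.2f} mA")
```

If my assumed efficacy is in the right ballpark, a more efficient phosphor directly scales down the beam power (and current) needed for the same nits, which is why it seems like the easiest lever to me.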
Now this next part is going into super hypothetical territory which I'd never want to actually test. Is there any reason why I couldn't have a super high brightness of 600+ nits if I was okay with having multiple feet of strontium glass? The way we talk about x-ray blocking, saying things like "the strontium glass screen blocks 99.8% of x-rays," means that some amount gets through, right? So a 600 nit CRT would be emitting substantially more x-rays regardless of the screen used. I'd think the glass would attenuate the x-rays a lot, but the potential for an x-ray to pass through the glass completely unimpeded still exists, right? Additionally, do you think that would degrade the image quality in any way? I'd think the glass would have slight imperfections that aren't frequent enough to affect the image at lower thicknesses but would accumulate at greater glass thicknesses.
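On the "blocks 99.8%" point, my understanding is that attenuation in glass is exponential in thickness, so each extra slab multiplies the attenuation rather than adding to it, and the transmission never quite reaches zero. Here's a minimal sketch of that; the 1.5 /cm attenuation coefficient is made up for illustration, since the real value depends on the glass composition and the photon energy:

```python
import math

# Assumed linear attenuation coefficient of the faceplate glass (illustrative only)
mu_per_cm = 1.5

# Transmission through increasing thicknesses: I/I0 = exp(-mu * t)
for t_cm in [1, 2, 4, 10, 30]:
    transmission = math.exp(-mu_per_cm * t_cm)
    print(f"{t_cm:5.1f} cm glass -> transmission {transmission:.2e} "
          f"({(1 - transmission) * 100:.6f}% blocked)")
```

So (if I have this right) "multiple feet" of glass would push the transmitted fraction to an astronomically small number, but never literally to zero.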
Let me know of any other limitations you know of that would make this a dead-in-the-water idea.