CRTs don’t have pixels, so the resolution of the signal isn’t that important. It’s about the inherent softness you get from the technology. It’s better than any anti-aliasing we have today.
CRTs do have pixels. If they didn’t, you could run an SVGA signal (800x600 at 60 Hz) directly into any CRT. If you tried this, it would likely damage the tube beyond repair.
The exact mechanism varied between manufacturers and types: http://filthypants.blogspot.com/2020/02/crt-shader-masks.html
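Back-of-the-envelope, using the standard published VESA timings for 800x600@60 (a quick sketch; the point is just the horizontal-rate gap):

    # Rough horizontal-rate comparison: VESA 800x600@60 vs. NTSC.
    # All numbers are standard published timings, not measurements.

    ntsc_hfreq_khz = 15.734            # NTSC horizontal scan rate

    # VESA 800x600@60: 40.0 MHz pixel clock, 1056 total clocks per
    # scanline and 628 total lines per frame (blanking included).
    dot_clock_mhz = 40.0
    h_total, v_total = 1056, 628

    svga_hfreq_khz = dot_clock_mhz * 1000 / h_total    # ~37.9 kHz
    svga_vfreq_hz = svga_hfreq_khz * 1000 / v_total    # ~60.3 Hz

    print(f"NTSC horizontal rate: {ntsc_hfreq_khz:.1f} kHz")
    print(f"SVGA horizontal rate: {svga_hfreq_khz:.1f} kHz, "
          f"{svga_hfreq_khz / ntsc_hfreq_khz:.1f}x what the TV’s "
          f"horizontal deflection and flyback were built for")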
I certainly saw aliasing problems on CRTs, though usually on computer monitors that had higher resolution and better connection standards. The image being inherently “soft” is related to limited resolution and shitty connections. SCART with RGB connections will bring out all the jagginess. The exact same display running on composite will soften it and make it go away, but at the cost of a lot of other things looking like shit.
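Loosely, composite’s limited bandwidth acts like a horizontal low-pass filter over each scanline, which is where the softening comes from. A toy sketch (the moving average is an illustrative stand-in, not a model of NTSC encoding):

    # Toy illustration: a sharp edge vs. the same scanline after a
    # horizontal low-pass, standing in for composite’s limited
    # bandwidth. The 3-tap moving average is NOT an NTSC model.

    scanline = [0] * 6 + [255] * 6      # hard black-to-white edge

    def lowpass(line, taps=3):
        """Moving-average blur across one scanline."""
        half = taps // 2
        out = []
        for i in range(len(line)):
            window = line[max(0, i - half):i + half + 1]
            out.append(sum(window) // len(window))
        return out

    print("RGB-ish:      ", scanline)
    print("composite-ish:", lowpass(scanline))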
Would it, though? I’m skeptical.
If it did, it wouldn’t be because they have “pixels,” though; it would be because overdriving the deflection yoke with higher-frequency signals would generate too much heat for the TV to handle.
Otherwise (if it didn’t overheat), it should “work.” The result might look weird if the modulation of the signal didn’t line up with the apertures in the shadow mask right, but I don’t see any reason why sweeping the beam across faster would damage the phosphors. (Also, I’m not convinced a black & white TV would have any problem at all.)
It will tend to turn the beam on when it’s off to the side, outside the normal range of the screen. X users in the mid-’90s had to put in their exact scan timings or else the screen could blow up. That went away with a combination of multiscan monitors and monitors that could communicate their preferred settings, but those came pretty late in the CRT era.
Edit: in any case, color screens need to have at least bands of red/green/blue phosphor. At a minimum, there will be breaks along either the horizontal or vertical lines, if not both.
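For reference, this is the arithmetic those modelines forced on you. A minimal sketch using the standard VESA 800x600@60 modeline; the monitor sync ranges are hypothetical stand-ins for the numbers you’d read out of the manual:

    # An X modeline packs: pixel clock (MHz), then hdisp hsyncstart
    # hsyncend htotal, then vdisp vsyncstart vsyncend vtotal, e.g.
    #   Modeline "800x600" 40.00  800 840 968 1056  600 601 605 628
    # (the standard VESA 800x600@60 timings).

    clock_mhz = 40.0                   # pixel clock
    h_total = 1056                     # total clocks/line incl. blanking
    v_total = 628                      # total lines/frame incl. blanking

    hsync_range_khz = (30.0, 54.0)     # hypothetical monitor spec
    vrefresh_range_hz = (50.0, 90.0)   # hypothetical monitor spec

    hfreq_khz = clock_mhz * 1000 / h_total     # ~37.9 kHz
    vfreq_hz = hfreq_khz * 1000 / v_total      # ~60.3 Hz

    print(f"hsync {hfreq_khz:.2f} kHz, vrefresh {vfreq_hz:.2f} Hz")
    for name, val, (lo, hi) in [("hsync", hfreq_khz, hsync_range_khz),
                                ("vrefresh", vfreq_hz, vrefresh_range_hz)]:
        if not lo <= val <= hi:
            print(f"DANGER: {name} outside the monitor’s rated range")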
When you say “blow up” do you mean the tube would literally explode, it would burn through phosphors, a circuit board would let the magic smoke out, or something else?
I remember configuring modelines in X. Luckily, I never found out the hard way what happened if you got it wrong.
Literally blow up the tube in the worst cases.
Interesting, TIL!
CRTs were jagged and blurry. A misconverged pixel isn’t good anti-aliasing.
https://en.m.wikipedia.org/wiki/Missile_Command#/media/File%3AA5200_Missile_Command.png
That image is a digital rendering of the raw data, not a photo of how a CRT would render it.
CRTs were nothing if not the opposite of jagged.
I grew up on a 2600 on a TV in the ’70s. Computer graphics on CRTs were incredibly jagged. If you used a magnifying glass on a pixel, it was a blurred, misconverged spot because the beam didn’t hit the shadow mask exactly on target.
Look at that rope: https://www.deviantart.com/gameuniverso/art/Review-of-Pitfall-Atari-5200-761326088
“Blurred” is the opposite of “jagged,” though.
The jaggedness of the 2600 wasn’t because the TV itself was jagged; it was because the 2600 was so low-resolution (160x192, maximum) that it had to be upscaled – naively, with no antialiasing! – even just to get to NTSC (480 scanlines, give or take).
So yeah, when each “pixel” is three scanlines tall, of course it’s going to look jagged even after the CRT blurs it!
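You can see why with naive nearest-neighbor scaling, which enlarges the staircase instead of smoothing it (toy sketch, with 3x standing in for the three-scanlines-per-pixel blow-up):

    # Nearest-neighbor upscaling: every source pixel becomes an NxN
    # block, so a one-pixel staircase becomes an N-pixel staircase.

    def upscale(rows, n):
        """Repeat each cell n times across and each row n times down."""
        return [''.join(ch * n for ch in row)
                for row in rows for _ in range(n)]

    diagonal = ["#...", ".#..", "..#.", "...#"]   # a 4x4 'line'

    for row in upscale(diagonal, 3):   # 3x, roughly the 2600-to-NTSC jump
        print(row)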
A high-resolution LCD with anti-aliasing will do a better job than a low-resolution CRT. CRT shadow masks defined the limits of the pixels, and it wasn’t good even on computers that could output higher resolutions than the 2600.