The core argument for CRTs is their low latency and high refresh rate. Nobody is using a CRT for picture quality (outside the aforementioned blurred retro-gaming look).
However, this article is from 2019, and I know there are some pretty snazzy gaming monitors today that might well be better than old top-end CRTs.
The low latency only works when the source is sending an analog signal that is essentially controlling the electron gun. An NES, for example, could decide between scan lines what the next scanline to emit would look like.
Most modern systems (and computers) are sending digital signals where entire frames need to be constructed and reconstructed in a buffer before they can be displayed. Many HD CRT televisions have terrible latency because they’ll take the analog scan line signal, buffer it into a frame, scale it to the native CRT resolution, then scan it back out to the screen. A high end PVM might allow a straight path, but there is maybe one Toshiba HD CRT that doesn’t introduce several frames of latency (iirc).
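To put rough numbers on "several frames" (assuming a 60 Hz NTSC-style source; illustrative, not measured from any particular set):

\[
t_{\text{field}} = \frac{1}{60\ \text{Hz}} \approx 16.7\ \text{ms}
\]

so a pipeline that buffers, scales, and rescans even two or three fields deep is already sitting at roughly 33-50 ms before anything reaches the electron gun.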
That said, from 1999 to 2008 I ran 1600x1200 on 19-inch CRTs, and except for professional LCDs, nothing came close in resolution, dot pitch, and color. For me, 2008 was the inflection point where the cost and quality of LCDs exceeded CRTs.
> Most modern systems (and computers) are sending digital signals where entire frames need to be constructed and reconstructed in a buffer before they can be displayed.
HDMI and DVI-D are a digital representation of a TV signal: all the blanking and porches and such are still there (audio data is often interleaved into the blanking periods). You could process that in real time, even though most displays don't (including CRT HDTVs; I'll trust that your "maybe one" Toshiba doesn't buffer).
What you're describing is fully a product of the display controller architecture, not the digital signal. You can "chase the beam" with HDMI/DVI just as well as (or better than) you can with analog. There's no need to buffer the whole screen and end up even one frame behind.
I've done it in Verilog on FPGA, for example.
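For anyone curious, here's roughly what the timing side of that looks like. This is a minimal, illustrative Verilog sketch (my own module and signal names, classic 640x480@60 VGA-style timing rather than the TMDS encoding a real HDMI/DVI link adds on top): it just counts the beam position and derives every pixel combinationally from (x, y), so there's no frame buffer anywhere.

```verilog
// Minimal "race the beam" sketch: 640x480@60-style timing, no frame buffer.
module beam_racer (
    input  wire       pix_clk,   // e.g. 25.175 MHz pixel clock
    input  wire       rst,
    output wire       hsync,
    output wire       vsync,
    output wire       active,    // high during the visible region
    output wire [7:0] red, green, blue
);
    // Classic 640x480@60 timing: 800 clocks per line, 525 lines per frame
    localparam H_VIS = 640, H_FP = 16, H_SYNC = 96, H_TOTAL = 800;
    localparam V_VIS = 480, V_FP = 10, V_SYNC = 2,  V_TOTAL = 525;

    reg [9:0] x = 0, y = 0;      // current beam position

    // Walk the beam: x across each line, y down the frame, then wrap.
    always @(posedge pix_clk) begin
        if (rst) begin
            x <= 0;
            y <= 0;
        end else if (x == H_TOTAL - 1) begin
            x <= 0;
            y <= (y == V_TOTAL - 1) ? 10'd0 : y + 10'd1;
        end else begin
            x <= x + 10'd1;
        end
    end

    // Sync pulses (active low, as in classic VGA)
    assign hsync  = ~((x >= H_VIS + H_FP) && (x < H_VIS + H_FP + H_SYNC));
    assign vsync  = ~((y >= V_VIS + V_FP) && (y < V_VIS + V_FP + V_SYNC));
    assign active = (x < H_VIS) && (y < V_VIS);

    // "Chase the beam": each pixel is a pure function of the beam position,
    // computed as the scan goes by; nothing is buffered. Here, a test pattern.
    assign red   = active ? x[7:0]            : 8'd0;
    assign green = active ? y[7:0]            : 8'd0;
    assign blue  = active ? (x[7:0] ^ y[7:0]) : 8'd0;
endmodule
```

Swapping the output stage for a TMDS encoder/serializer instead of a VGA DAC doesn't change the structure; the path from "decide what this pixel is" to "pixel leaves the connector" stays at a handful of pixel clocks.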
But we buffer whole frames on our machines because that's the easiest way to drive the GPU/display controller pipeline from a software point of view. That latency would be present on a CRT as well; what would be missing is the internal latency that some monitors add on top.
I was trying the "chase the beam" approach myself but never got it off the ground (home project, mostly curiosity). What was always in the back of my mind, though: even if you're doing everything with low-latency HDMI signals, there's no guarantee the display isn't still buffering it, even with your TV's "game mode" enabled.
Low-latency modes have been a thing for the last 5-10 years, but they were absolutely not a concern for the majority of display manufacturers before gamers complained. 100 ms is significant in twitch games where network latencies are 20 ms or lower.
It makes little sense to buffer a full frame before upscaling. Why would you do that? It's a total waste of DRAM bandwidth, too.
The latency incurred for upscaling depends on the number of vertical filter taps and the horizontal scan time. We're talking on the order of tens of microseconds (rough numbers sketched below).
The only exception is if you’re using a super fancy machine learning whole-frame upscaling algorithm, but that’s not something you’ll find in an old CRT.
> The core argument for CRTs is their low latency and high refresh rate
The article is outdated (like you said) because LCD/OLED displays have long since surpassed CRTs in latency and refresh rate.
A modern gaming LCD can refresh the entire screen multiple times over before an old CRT scans the entire frame and returns to the top to redraw the next one.
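To put numbers on that (just comparing refresh periods; consumer CRT TVs ran at 50/60 Hz, and even high-refresh CRT monitors were typically in the 85-120 Hz range at their usable resolutions):

\[
\frac{1}{240\ \text{Hz}} \approx 4.2\ \text{ms}
\quad\text{vs.}\quad
\frac{1}{60\ \text{Hz}} \approx 16.7\ \text{ms}
\]

so a 240 Hz panel completes about four full refreshes in the time a 60 Hz CRT scans a single frame.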
Yeah, and 90% of the commenters in this discussion clearly haven't looked at a TV or computer monitor made in the last few years; they're at least 5 years behind the market.
There are Samsung TVs with around 6 ms of input delay, and many, many gaming monitors have 1-2 ms of input delay.
If you spend $400 you can get a 1440p monitor with 1 ms input delay and 2-3 ms grey-to-grey response time that will do 240 Hz.
All with more contrast, a wider color gamut, and higher resolution.
Point being? Your modern LCD still uses buffers and sample-and-hold. It will always have inherent input latency compared to a CRT, no matter how fast it can refresh.
As someone who almost never plays video games but owns several CRTs for the sake of media, I can attest that at least some of us own CRTs purely for their picture quality, or in my case because I think pre-2000s 4:3 media was also kind of intended to be watched on screens like that (much in the same way I see video gamers arguing).
I'll take a wild guess, though, that my group (CRT media watchers) is slightly harder to take advantage of with actual hardware than video gamers, which I'd guess is why there are few articles or HOLY GRAIL CRTs like this FW900 widescreen in the media-watching community. Not that we aren't often suckers for things like ORIGINAL VINTAGE POSTERS and SEALED MEDIA lol.